Sample records for forensic seismology revisited

…contrast simple, comprising one or two cycles of large amplitude followed by a low-amplitude coda. Earthquake signals, on the other hand, were often complex, with numerous arrivals of similar amplitude spread over 35 s or more. It therefore appeared that earthquakes could be recognised by their complexity. Later, however, complex explosion signals were observed, which reduced the apparent effectiveness of complexity as a criterion for identifying earthquakes. Nevertheless, the AWE Group concluded that for many paths to teleseismic distances the Earth is transparent for P signals, and this provides a window through which source differences will be most clearly seen. Much of the research by the Group has focused on understanding the influence of source type on P seismograms recorded at teleseismic distances. Consequently the paper concentrates on teleseismic methods of distinguishing between explosions and earthquakes. One of the most robust criteria for discriminating between earthquakes and explosions is the m_b : M_s criterion, which compares the amplitudes of the SP P waves, as measured by the body-wave magnitude m_b, with the long-period (LP: ~0.05 Hz) Rayleigh-wave amplitude, as measured by the surface-wave magnitude M_s; the P and Rayleigh waves are the main wave types used in forensic seismology. For a given M_s, the m_b for explosions is larger than for most earthquakes. The criterion is difficult to apply, however, at low magnitude (say m_b …) … fail. Consequently the AWE Group, in cooperation with the University of Cambridge, used seismogram modelling to try to understand what controls the complexity of SP P seismograms, and to put the m_b : M_s criterion on a theoretical basis. The results of this work show that the m_b : M_s criterion is robust because several factors contribute to the separation of earthquakes and explosions. The principal reason for the separation, however, is that for many orientations of the earthquake source there is at least one P nodal plane in the teleseismic…
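The screening logic of the m_b : M_s criterion described above can be sketched in a few lines. This is an illustrative toy only: the decision threshold `delta` is an invented placeholder, not the AWE Group's calibrated discrimination line, which is fitted to regional data.

```python
# Illustrative sketch of the m_b : M_s screening criterion.
# The threshold `delta` is a hypothetical placeholder; operational
# screens fit the decision line to calibration data for each region.

def mb_ms_screen(mb: float, ms: float, delta: float = 1.0) -> str:
    """Classify a source as explosion-like or earthquake-like.

    For a given surface-wave magnitude M_s, explosions tend to show a
    larger body-wave magnitude m_b than earthquakes, so a large
    positive m_b - M_s residual is explosion-like.
    """
    return "explosion-like" if mb - ms > delta else "earthquake-like"

events = [
    ("shallow earthquake", 5.0, 4.8),     # m_b roughly tracks M_s
    ("explosion-style source", 5.0, 3.2), # m_b much larger than M_s
]
for name, mb, ms in events:
    print(name, "->", mb_ms_screen(mb, ms))
```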

The Norwegian Seismic Array (NORSAR) in 1979 worked mainly on reports and investigations for the seismological expert group established in 1976 by the UN Disarmament Committee in Geneva. One of NORSAR's staff is scientific secretary for the group. Reports published by the group in 1978 and 1979 proposed a global surveillance system for nuclear explosions, and NORSAR, as one of the largest stations, will play a central role in the proposed network. A number of other tasks have been performed by NORSAR in connection with the seismology and tectonics of the Norwegian continental shelf, a projected dam in Tanzania, a dam in S.W. Norway, seismic activity in Spitzbergen and ore prospecting in N. Norway. (JIW)

Seismology is a highly effective tool for investigating the internal structure of the Earth. Similar techniques have also successfully been used to study other planetary bodies (planetary seismology), the Sun (helioseismology), and other stars (asteroseismology). Despite obvious differences between stars and planetary bodies, these disciplines share many similarities and together form a coherent field of scientific research. This unique book takes a transdisciplinary approach to seismology and seismic imaging, reviewing the most recent developments in these extraterrestrial contexts. With contributions from leading scientists, this timely volume systematically outlines the techniques used in observation, data processing, and modelling for asteroseismology, helioseismology, and planetary seismology, drawing comparisons with seismic methods used in geophysics. Important recent discoveries in each discipline are presented. With an emphasis on transcending the traditional boundaries of astronomy, solar, planetary...

Forensic chemistry is unique among chemical sciences in that its research, practice, and presentation must meet the needs of both the scientific and the legal communities. As such, forensic chemistry research is applied and derivative by nature and design, and it emphasizes metrology (the science of measurement) and validation. Forensic chemistry has moved away from its analytical roots and is incorporating a broader spectrum of chemical sciences. Existing forensic practices are being revisited as the purview of forensic chemistry extends outward from drug analysis and toxicology into such diverse areas as combustion chemistry, materials science, and pattern evidence.

Rotational seismology is an emerging study of all aspects of rotational motions induced by earthquakes, explosions, and ambient vibrations. It is of interest to several disciplines, including seismology, earthquake engineering, geodesy, and earth-based detection of Einstein's gravitational waves. Rotational effects of seismic waves, together with rotations caused by soil–structure interaction, have been observed for centuries (e.g., rotated chimneys, monuments, and tombstones). Figure 1a shows the rotated monument to George Inglis observed after the 1897 Great Shillong earthquake. This monument had the form of an obelisk rising over 19 metres high from a 4 metre base. During the earthquake, the top part broke off and the remnant of some 6 metres rotated about 15° relative to the base. The study of rotational seismology began only recently, when sensitive rotational sensors became available due to advances in aeronautical and astronomical instrumentation.

This report summarises the R and D in Seismology during the period from January 1986 to December 1987. Major topics of current study are (1) Forensic Seismology, (2) Seismicity and Seismic Risk estimates, (3) Reservoir induced seismicity and (4) Rockburst monitoring. Considerable effort is devoted to the development of seismic data acquisition systems and theoretical aspects of seismology. (author)

In science, projects which involve volunteers in observations, measurements or computation are grouped under the term Citizen Science. They range from bird or planet censuses to distributed computing on volunteers' computers. Over the last five years, the EMSC has been developing tools and a strategy to collect information on the impact of earthquakes from the first persons to be informed, i.e. the witnesses. By extension, this is named Citizen Seismology. The European Mediterranean Seismological Centre (EMSC), a scientific not-for-profit NGO, benefits from the high visibility of its rapid earthquake information services (www.emsc-csem.org), which attract an average of more than half a million visits a month from 160 countries. Witnesses converge on its site within a couple of minutes of an earthquake's occurrence to find out information about the cause of the shaking they have just been through. This convergence generates abrupt increases in hit rate which can be detected automatically. They are often the first indication of the occurrence of a felt event. Witnesses' locations are determined from their IP addresses. Localities exhibiting a statistically significant increase in traffic are mapped to produce the "felt map". This map, available within 5 to 8 minutes of the earthquake's occurrence, represents the area where the event was felt. It is the fastest way to collect in-situ information on the consequences of an earthquake. Regions of widespread damage are expected to be mapped through a significant lack or absence of visitors. A second tool involving the visitors is an online macroseismic questionnaire available in 21 languages. It complements the felt maps, as it can describe the level of shaking or damage, but is only available within 90 to 120 minutes. Witnesses can also share their pictures of damage. They have also used this to provide us with exceptional pictures of transient phenomena. With the University of Edinburgh, we are finalising a prototype named ShakemApple, linking Apple…
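The flash-crowd detection described above (a sudden jump in site traffic as a first indication of a felt event) can be sketched as a simple outlier test. The baseline window, the 5-sigma threshold and the counts below are invented illustrations, not the EMSC's actual detection rule.

```python
# Toy detector for the traffic-surge signal used as a first indication
# of a felt earthquake. Window length, threshold and counts are
# illustrative assumptions, not EMSC's operational parameters.

from statistics import mean, stdev

def surge_detected(hit_counts, threshold_sigmas=5.0):
    """Return True if the latest per-minute hit count is an outlier
    relative to the preceding baseline window."""
    baseline, latest = hit_counts[:-1], hit_counts[-1]
    mu, sigma = mean(baseline), stdev(baseline)
    # Guard against a near-zero sigma on a very quiet baseline.
    return latest > mu + threshold_sigmas * max(sigma, 1.0)

quiet = [120, 132, 118, 125, 130, 127, 122]   # normal minutes
print(surge_detected(quiet + [135]))  # ordinary fluctuation -> False
print(surge_detected(quiet + [900]))  # flash crowd -> True
```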

A fundamental goal of volcano seismology is to understand active magmatic systems, to characterize the configuration of such systems, and to determine the extent and evolution of source regions of magmatic energy. Such understanding is critical to our assessment of eruptive behavior and its hazardous impacts. With the emergence of portable broadband seismic instrumentation, availability of digital networks with wide dynamic range, and development of new powerful analysis techniques, rapid progress is being made toward a synthesis of high-quality seismic data to develop a coherent model of eruption mechanics. Examples of recent advances are: (1) high-resolution tomography to image subsurface volcanic structures at scales of a few hundred meters; (2) use of small-aperture seismic antennas to map the spatio-temporal properties of long-period (LP) seismicity; (3) moment tensor inversions of very-long-period (VLP) data to derive the source geometry and mass-transport budget of magmatic fluids; (4) spectral analyses of LP events to determine the acoustic properties of magmatic and associated hydrothermal fluids; and (5) experimental modeling of the source dynamics of volcanic tremor. These promising advances provide new insights into the mechanical properties of volcanic fluids and subvolcanic mass-transport dynamics. As new seismic methods refine our understanding of seismic sources, and geochemical methods better constrain mass balance and magma behavior, we face new challenges in elucidating the physico-chemical processes that cause volcanic unrest and its seismic and gas-discharge manifestations. Much work remains to be done toward a synthesis of seismological, geochemical, and petrological observations into an integrated model of volcanic behavior. Future important goals must include: (1) interpreting the key types of magma movement, degassing and boiling events that produce characteristic seismic phenomena; (2) characterizing multiphase fluids in subvolcanic

Purpose: Entrepreneurship is shaped by a male norm, which has been widely demonstrated in qualitative studies. The authors strive to complement these methods by a quantitative approach. First, gender role stereotypes were measured in entrepreneurship. Second, the explicit notions of participants … were captured when they described entrepreneurs. Therefore, this paper aims to revisit gender role stereotypes among young adults. Design/methodology/approach: To measure stereotyping, participants were asked to describe entrepreneurs in general and either women or men in general. The Schein … : The images of men and entrepreneurs show a high and significant congruence (r = 0.803), mostly in those adjectives that are untypical for men and entrepreneurs. The congruence of women and entrepreneurs was low (r = 0.152) and insignificant. Contrary to the participants' beliefs, their explicit notions did…

The structure and diagnostic properties of the spectrum of free oscillations of models of Jupiter are discussed. The spectrum is very sensitive to the properties of the inner core and to density discontinuities in the interior of the planet. It is shown that in Jovian seismology, unlike in solar seismology, it is not possible to use asymptotic theory to investigate the high-frequency part of the acoustic spectrum.

This report summarises the research and development activities of the Seismology Section during the period from January 1988 to December 1989. Apart from the ongoing work on forensic seismology, seismicity studies, rock burst monitoring and elastic wave propagation, a new field system became operational at Bhatsa, located about 100 km from Bombay, comprising an 11-station radio-telemetered seismic network with a central recording laboratory to study reservoir-induced seismicity. (author). figs., tabs

Jesuits have been involved with scientific endeavors since the 16th century, although their association with seismology is more recent. What impelled Jesuit priests to also become seismologists is a matter of conjecture. Certainly the migration of missionaries to various parts of the world must have resulted in queries to their fellow Jesuits in Europe. What caused earthquakes? Could they be predicted? Were they connected with the weather?

We use controlled noise seismology (CNS) to generate surface waves, where we continuously record seismic data while generating artificial noise along the profile line. To generate the CNS data we drove a vehicle around the geophone line and continuously recorded the generated noise. The recorded data set is then correlated over different time windows and the correlograms are stacked together to generate the surface waves. The virtual shot gathers reveal surface waves with moveout velocities that closely approximate those from active source shot gathers.
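The correlate-and-stack step described above can be sketched numerically: cross-correlate short windows of two continuous noise records and stack the correlograms, so the stack peaks at the inter-receiver travel time. The synthetic delayed-noise data, window length and lag range are all assumptions for illustration.

```python
# Minimal numpy sketch of the correlate-and-stack workflow:
# windowed cross-correlation between a "virtual source" trace and a
# receiver trace, stacked over windows. Synthetic data; the sampling
# rate, window length and lag range are arbitrary choices.

import numpy as np

rng = np.random.default_rng(0)
fs = 250                       # samples per second (assumed)
lag = 20                       # true propagation delay in samples
n = fs * 60                    # one minute of continuous "noise"

noise = rng.standard_normal(n + lag)
rec_a = noise[lag:]            # virtual-source trace
rec_b = noise[:n]              # receiver trace: rec_a delayed by `lag`

win = fs * 5                   # 5 s correlation windows
max_lag = 100
stack = np.zeros(2 * max_lag + 1)
for start in range(0, n - win, win):
    a = rec_a[start:start + win]
    b = rec_b[start:start + win]
    full = np.correlate(b, a, mode="full")   # lags -(win-1)..(win-1)
    mid = win - 1                            # index of zero lag
    stack += full[mid - max_lag: mid + max_lag + 1]

best = int(np.argmax(stack)) - max_lag
print("recovered delay (samples):", best)    # expect +20
```

The peak of the stacked correlogram recovers the imposed delay; with field data the same peak traces out the surface-wave moveout between geophone pairs.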

The latest seismological equipment and data processing instrumentation installed at the Colombia Seismological Network (RSNC) are described. System configuration, network operation, and data management are discussed, and the data quality and the new seismological products are analyzed. The main purpose of the network is to monitor local seismicity, with special emphasis on seismic activity surrounding the Colombian Pacific and Caribbean oceans, for early warning in case a tsunami is produced by an earthquake. The Colombian territory is located at the northwestern corner of South America, where three tectonic plates converge: the Nazca, the Caribbean and the South American. The dynamics of these plates, when resulting in earthquakes, is continuously monitored by the network. In 2012, the RSNC registered an average of 67 events per day; of these, a mean of 36 earthquakes per day could be located well. In 2010 the network also registered an average of 67 events, but only a mean of 28 earthquakes could be located daily. This difference is due to the expansion of the network. The network is made up of 84 stations equipped with different kinds of sensors: broadband 40 s and 120 s seismometers, accelerometers and short-period 1 s sensors. The signal is transmitted continuously in real time to the Central Recording Center located in Bogotá, using satellite, telemetry, and the Internet. Moreover, there are some other stations from which the information must be collected in situ. Data are recorded and processed digitally using two different systems, EARTHWORM and SEISAN, which are able to process and share the information between them. The RSNC has designed and implemented a web system to share the seismological data. This innovative system uses tools such as JavaScript and Oracle, and programming languages such as PHP, to allow users to access the seismicity registered by the network almost in real time, as well as to download the waveforms and technical details. The coverage…

For the construction of nuclear power stations, comprehensive site investigations are required to assure the adequacy and suitability of the site under consideration, as well as to establish the basic design data for designing and building the plant. The site investigations cover mainly the following matters: geology, seismology, hydrology, meteorology. Site investigations for nuclear power stations are carried out in stages in increasing detail and to an appreciable depth in order to assure the soundness of the project, and, in particular, to determine all measures required to assure the safety of the nuclear power station and the protection of the population against radiation exposure. The aim of seismological investigations is to determine the strength of the vibratory ground motion caused by an expected strong earthquake in order to design the plant resistant enough to take up these vibrations. In addition, secondary effects of earthquakes, such as landslides, liquefaction, surface faulting, etc. must be studied. For seashore sites, the tsunami risk must be evaluated. (orig.)

An important project was carried out in the Bucharest area by the National Institute of Research-Development for Earth Physics and the Collaborative Research Center 461 (CRC 461) of the Geophysical Institute of the University of Karlsruhe (Germany) in the period October 2003 - August 2004. The project consisted of an array of 33 stations, uniformly arranged in the city of Bucharest and its outskirts (Magurele, Voluntari, Otopeni, Buftea, etc.). The stations functioned 24 h/day for a period of 10 months. The number of functioning stations varied slightly over time, as some of them had to be moved because their sites became unsuitable. The sensors used by the stations were of the types STS-2, LE-3D, 4OT, 3ESP and KS2000. Continuous recording was made possible by using at each station a hard disk drive of 120 GB, which gives an autonomy of 3 months. To prevent accidental stops due to power failures, a rechargeable battery was used at each station. Each station was serviced every month to avoid accidental stops, which were usually caused by mechanical bumps. All the data recorded by the stations were saved on DVDs, the final number being around 140. This project helped gather a large volume of seismological data for the city of Bucharest and its outskirts, from seismic events of magnitude 4, 3 and 2 and from ambient noise. (authors)

Despite residing in a state with 75% of North American earthquakes and three of the top 15 ever recorded, most Alaskans have limited knowledge about the science of earthquakes. To many, earthquakes are just part of everyday life, and to others, they are barely noticed until a large event happens, and often ignored even then. Alaskans are rugged, resilient people with both strong independence and tight community bonds. Rural villages in Alaska, most of which are inaccessible by road, are underrepresented in outreach efforts. Their remote locations and difficulty of access make outreach fiscally challenging. Teacher retention and small student bodies limit exposure to science and hinder student success in college. The arrival of EarthScope's Transportable Array, the 50th anniversary of the Great Alaska Earthquake, targeted projects with large outreach components, and increased community interest in earthquake knowledge have provided opportunities to spread information across Alaska. We have found that performing hands-on demonstrations, identifying seismological relevance toward career opportunities in Alaska (such as natural resource exploration), and engaging residents through place-based experience have increased the public's interest and awareness of our active home.

Earth is an open thermodynamic system radiating heat energy into space. A transition from geostatic earth models such as PREM to geodynamical models is needed. We discuss possible thermodynamic constraints on the variables that govern the distribution of forces and flows in the deep Earth. In this paper we assume that the temperature distribution is time-invariant, so that all flows vanish at steady state except for the heat flow J_q per unit area (Kuiken, 1994). Superscript 0 will refer to the steady state, while superscript x denotes the excited state of the system. We may write σ^0 = (J_q^0 · X_q^0)/T, where X_q is the conjugate force corresponding to J_q, and σ is the rate of entropy production per unit volume. Consider now what happens after the occurrence of an earthquake at time t = 0 and location (0,0,0). The earthquake introduces a stress drop ΔP(x,y,z) at all points of the system. Response flows are directed along the gradients toward the epicentral area, and the entropy production will increase with time as (Prigogine, 1947) σ^x(t) = σ^0 + α_1/(t+β) + α_2/(t+β)^2 + … A seismological constraint on the parameters may be obtained from Omori's empirical relation N(t) = p/(t+q), where N(t) is the number of aftershocks at time t following the main shock. It may be assumed that p/q ∼ α_1/β times a constant. Another useful constraint is the Mexican-hat geometry of the seismic transient, as obtained e.g. from InSAR radar interferometry. For strike-slip events such as Landers the distribution of ΔP is quadrantal, and an oval-shaped seismicity gap develops about the epicenter. A weak outer triggering maximum is found at a distance of about 17 fault lengths. Such patterns may be extracted from earthquake catalogs by statistical analysis (Lomnitz, 1996). Finally, the energy of the perturbation must be at least equal to the recovery energy. The total energy expended in an aftershock sequence can be found approximately by integrating the local contribution over…
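The Omori relation N(t) = p/(t+q) invoked above has the closed-form cumulative count p·ln((T+q)/q), which can be checked numerically. The parameter values below are arbitrary illustrations, not fitted to any catalog.

```python
# Numerical check of the Omori form N(t) = p/(t+q): the cumulative
# number of aftershocks up to time T grows like p * ln((T+q)/q).
# p and q are arbitrary illustrative values.

import math

p, q = 200.0, 0.5              # Omori parameters (per day), assumed

def rate(t):
    """Aftershock rate at time t after the main shock."""
    return p / (t + q)

def cumulative(T, dt=1e-3):
    """Trapezoid integration of the rate from 0 to T."""
    steps = int(T / dt)
    total = 0.0
    for i in range(steps):
        t0, t1 = i * dt, (i + 1) * dt
        total += 0.5 * (rate(t0) + rate(t1)) * dt
    return total

T = 30.0                       # days after the mainshock
closed = p * math.log((T + q) / q)
print(round(cumulative(T), 1), round(closed, 1))  # nearly equal
```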

The principal methods of seismological data treatment with application in engineering design are examined, emphasizing the need for the utilization of reliable data, appropriate algorithms and rigorous calculations so that correct results and valid conclusions are achieved. (E.G.) [pt

At the beginning of the 1970s, a series of programs in seismology was initiated by different Costa Rican institutions, and some of these programs are still in the process of development. The institutions are: Instituto Costarricense de Electricidad (ICE), the Costa Rica Institute of Electricity…

Forensic odontology is a specialized field of dentistry which analyses dental evidence in the interest of justice. Forensic odontology embraces all dental specialities and it is almost impossible to segregate this branch from other dental specialities. This review aims to discuss the utility of various dental specialities with forensic odontology.

The increase in the number of forensic genetic loci used for identification purposes results in infinitesimal random match probabilities. These probabilities are computed under assumptions made for rather simple population genetic models. Often, the forensic expert reports likelihood ratios, where … published results accounting for close familial relationships. However, we revisit the discussion to increase the awareness among forensic genetic practitioners and include new information on medical and societal factors to assess the risk of not considering a monozygotic twin as the true perpetrator. … Then data relevant for Danish society suggest that the threshold of likelihood ratios should be approximately between 150,000 and 2,000,000 in order to take the risk of an unrecognised identical, monozygotic twin into consideration. In other societies, the threshold of the likelihood ratio in crime…
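The twin-risk argument above admits a simple Bayesian reading: a DNA profile cannot distinguish monozygotic twins, so a reported likelihood ratio is only informative up to roughly the inverse of the probability that an unrecognised identical twin exists. This is a toy calculation under invented prevalence figures; the paper's 150,000 to 2,000,000 range rests on actual Danish demographic data, not on this sketch.

```python
# Toy Bayesian reading of the unrecognised-twin argument.
# The prevalence values below are invented for illustration only.

def required_lr(p_unrecognised_twin: float) -> float:
    """Rough likelihood-ratio ceiling: since an identical twin shares
    the full profile, the DNA evidence cannot raise the posterior odds
    of 'suspect rather than twin' above the prior odds, so a reported
    LR is meaningful only up to about 1 / P(unrecognised twin)."""
    return 1.0 / p_unrecognised_twin

for prob in (1e-5, 5e-6, 1e-6):   # hypothetical twin probabilities
    print(f"P(unrecognised twin)={prob:g} -> "
          f"LR ceiling ~ {required_lr(prob):,.0f}")
```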

Over the last few decades, seismological monitoring systems have dramatically improved thanks to technological advancements and to the scientific progress of seismological studies. The most modern processing systems use network technologies to achieve high-quality performance in data transmission and remote control. Their architecture is designed to favour real-time signal analysis. This is usually realized by adopting a modular structure that allows any new calculation algorithm to be integrated easily, without affecting the other system functionalities. A further step in the evolution of seismic processing systems is the widespread use of web-based applications. Web technologies can be a useful support for monitoring activities, allowing automatic publication of the results of signal processing and favouring remote access to data, software systems and instrumentation. An application of web technologies to seismological monitoring has been developed at the "Osservatorio Vesuviano" monitoring center (INGV) in collaboration with the "Dipartimento di Informatica e Sistemistica" of the University of Naples. A system named Web Based Seismological Monitoring (WBSM) has been developed. Its main objective is to automatically publish seismic event processing results and to allow displaying, analyzing and downloading seismic data via the Internet. WBSM uses XML technology for the representation of hypocentral and picking parameters and creates a seismic event database containing parametric data and waveforms. In order to provide tools for evaluating the quality and reliability of the published locations, WBSM also supplies all the quality parameters calculated by the locating program and allows interactive display of the waveforms and the related parameters. WBSM is a modular system in which the interface function to the data sources is performed by two specific modules, so as to make it work in conjunction with a…

The term forensic science may evoke thoughts of blood-spatter analysis, DNA testing, and identifying molds, spores, and larvae. A growing part of this field, however, is that of digital forensics, involving techniques with clear connections to math and physics. This article describes a five-part project involving smartphones and the investigation…

Modern scientific technology now plays an increasingly important role in the process of law enforcement. Neutron activation, as developed for elemental analysis, offers in many cases a suitable answer to forensic problems. The author discusses the uses to which NAA has been put in forensic science. (Auth.)

Nuclear forensics is the analysis of nuclear materials recovered from either the capture of unused materials, or from the radioactive debris following a nuclear explosion and can contribute significantly to the identification of the sources of the materials and the industrial processes used to obtain them. In the case of an explosion, nuclear forensics can also reconstruct key features of the nuclear device. Nuclear forensic analysis works best in conjunction with other law enforcement, radiological protection dosimetry, traditional forensics, and intelligence work to provide the basis for attributing the materials and/or nuclear device to its originators. Nuclear forensics is a piece of the overall attribution process, not a stand-alone activity

This review discusses the methodology of nuclear forensics and the illicit trafficking of nuclear materials. Nuclear forensics is a relatively new scientific branch whose aim is to read out the information inherent in nuclear material. Nuclear forensic investigations have to be considered as part of a comprehensive set of measures for the detection, interception, categorization and characterization of illicitly trafficked nuclear material. Prevention, detection and response are the main elements in combating illicit trafficking. Forensics is a key element in the response process. Forensic science is defined as the application of a broad spectrum of sciences to answer questions of interest to the legal system. In addition, this study explains the age determination of nuclear materials.

Our recent textbook, Introduction to Seismology, Earthquakes, & Earth Structure (Blackwell, 2003) is used in many countries. Part of the reason for this may be our deliberate attempt to write the book for an international audience. This effort appears in several ways. We stress seismology's long tradition of global data interchange. Our brief discussions of the science's history illustrate the contributions of scientists around the world. Perhaps most importantly, our discussions of earthquakes, tectonics, and seismic hazards take a global view. Many examples are from North America, whereas others are from other areas. Our view is that non-North American students should be exposed to North American examples that are type examples, and that North American students should be similarly exposed to examples elsewhere. For example, we illustrate how the Euler vector geometry changes a plate boundary from spreading, to strike-slip, to convergence using both the Pacific-North America boundary from the Gulf of California to Alaska and the Eurasia-Africa boundary from the Azores to the Mediterranean. We illustrate diffuse plate boundary zones using western North America, the Andes, the Himalayas, the Mediterranean, and the East Africa Rift. The subduction zone discussions examine Japan, Tonga, and Chile. We discuss significant earthquakes both in the U.S. and elsewhere, and explore hazard mitigation issues in different contexts. Both comments from foreign colleagues and our experience lecturing overseas indicate that this approach works well. Beyond the specifics of our text, we believe that such a global approach is facilitated by the international traditions of the earth sciences and the world youth culture that gives students worldwide common culture. For example, a video of the scene in New Madrid, Missouri that arose from a nonsensical earthquake prediction in 1990 elicits similar responses from American and European students.

The paper is a review of different issues that forensic psychologists encounter at work. Forensic assessment might be needed in civil law cases, administrative procedures and criminal law cases. The paper focuses on referrals in criminal law cases regarding matters such as assessing competence to stand trial, criminal responsibility and violence risk assessment. Finally, the role of expert testimony on eyewitness memory, which is not yet used in practice in Slovenia, is presented.

The potential of space-based geodetic measurement of crustal deformation in the context of seismology is explored. The achievements of seismological source theory and data analyses, mechanical modeling of fault zone behavior, and advances in space-based geodesy are reviewed, with emphasis on realizable contributions of space-based geodetic measurements specifically to seismology. The fundamental relationships between crustal deformation associated with an earthquake and the geodetically observable data are summarized. The response and spatial and temporal resolution of the geodetic data necessary to understand deformation at various phases of the earthquake cycle is stressed. The use of VLBI, SLR, and GPS measurements for studying global geodynamics properties that can be investigated to some extent with seismic data is discussed. The potential contributions of continuously operating strain monitoring networks and globally distributed geodetic observatories to existing worldwide modern digital seismographic networks are evaluated in reference to mutually addressable problems in seismology, geophysics, and tectonics.

The Bulgarian National Digital Seismological Network (BNDSN) consists of a National Data Center (NDC), 13 stations equipped with RefTek High Resolution Broadband Seismic Recorders (model DAS 130-01/3), and 1 station equipped with a Quanterra 680 and broadband sensors and accelerometers. Real-time data transfer from the seismic stations to the NDC is realized via a Virtual Private Network of the Bulgarian Telecommunication Company. Communication interruptions do not cause any data loss at the NDC: the data are backed up in the field recorder's 4 MB RAM and are retransmitted to the NDC immediately after the communication link is re-established. The recorders are equipped with 2 compact flash disks able to store more than one month of data, and the data on the flash disks can be downloaded remotely using FTP. Data acquisition and processing hardware redundancy at the NDC is achieved by two clustered SUN servers and two Blade workstations. To secure the acquisition, processing and data storage processes, a three-layer local network is designed at the NDC. Real-time data acquisition is performed using REFTEK's full-duplex error-correction protocol RTPD. Data from the Quanterra recorder and foreign stations are fed into RTPD in real time via the SeisComP/SeedLink protocol. Using SeisComP/SeedLink software, the NDC transfers real-time data to INGV-Roma, NEIC-USA and the ORFEUS Data Center. Regional real-time data exchange with Romania, Macedonia, Serbia and Greece is also established at the NDC. Data processing is performed by the Seismic Network Data Processor (SNDP) software package running on both servers. SNDP includes the following subsystems: the real-time subsystem (RTS_SNDP), for signal detection, evaluation of signal parameters, phase identification and association, and source estimation; the seismic analysis subsystem (SAS_SNDP), for interactive data processing; and the early warning subsystem (EWS_SNDP), based on the first-arriving P phases. The signal detection process is performed by…

The scientific, technical, and economic importance of the issues discussed here presents a clear agenda for future research in computational seismology. In this way these problems will drive advances in high-performance computing in the field of seismology. There is a broad community that will benefit from this work, including the petroleum industry, research geophysicists, engineers concerned with seismic hazard mitigation, and governments charged with enforcing a comprehensive test ban treaty. These advances may also lead to new applications for seismological research. The recent application of high-resolution seismic imaging of the shallow subsurface for the environmental remediation industry is an example of this activity. This report makes the following recommendations: (1) focused efforts to develop validated documented software for seismological computations should be supported, with special emphasis on scalable algorithms for parallel processors; (2) the education of seismologists in high-performance computing technologies and methodologies should be improved; (3) collaborations between seismologists and computational scientists and engineers should be increased; (4) the infrastructure for archiving, disseminating, and processing large volumes of seismological data should be improved.

While seismicity triggered or induced by natural-resource production, such as mining or water impoundment behind large dams, has long been recognized, the recent increase in the unconventional production of oil and gas has been linked to a rapid rise in seismicity in many places, including central North America (Ellsworth et al., 2012; Ellsworth, 2013). Worldwide, induced events of M~5 have occurred and, although rare, have resulted in both damage and public concern (Horton, 2012; Keranen et al., 2013). In addition, over the past twenty years, the increase in both the number and coverage of seismic stations has resulted in an unprecedented ability to precisely record the magnitude and location of large numbers of small-magnitude events. The increase in the number and type of seismic sequences available for detailed study has revealed differences in their statistics that were previously difficult to quantify. For example, seismic swarms that produce significant numbers of foreshocks as well as aftershocks have been observed in different tectonic settings, including California, Iceland, and the East Pacific Rise (McGuire et al., 2005; Shearer, 2012; Kazemian et al., 2014). Similarly, smaller events have been observed prior to larger induced events in several instances related to energy production. The field of statistical seismology has long focused on the question of triggering and the mechanisms responsible (Stein et al., 1992; Hill et al., 1993; Steacy et al., 2005; Parsons, 2005; Main et al., 2006). For example, in most cases the associated stress perturbations are much smaller than the earthquake stress drop, suggesting an inherent sensitivity to relatively small stress changes (Nalbant et al., 2005). Induced seismicity provides the opportunity to investigate triggering and, in particular, the differences between long- and short-range triggering. Here we investigate the statistics of induced seismicity sequences from around the world, including central North America and Spain, and
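One standard statistic used when comparing such sequences is the Gutenberg-Richter b-value. A minimal maximum-likelihood estimator (Aki, 1965, with Utsu's correction for magnitude binning) needs only the standard library; the catalogue values below are synthetic, not from the study:

```python
import math

def b_value(magnitudes, m_c, dm=0.1):
    """Aki (1965) maximum-likelihood b-value with Utsu's binning
    correction: b = log10(e) / (mean(M) - (Mc - dm/2)), using only
    events at or above the completeness magnitude Mc."""
    above = [m for m in magnitudes if m >= m_c]
    mean_m = sum(above) / len(above)
    return math.log10(math.e) / (mean_m - (m_c - dm / 2))

# Synthetic toy catalogue with completeness magnitude Mc = 2.0.
b = b_value([2.0, 2.3, 2.5, 2.7, 3.0], m_c=2.0)
print(round(b, 2))  # 0.79
```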

The odour of animal or human corpses immediately after death is highly attractive to insects and other invertebrates. Blue and green bottle flies of the family Calliphoridae are the first colonizers of a cadaver, and soon afterwards necrophagous Diptera of the family Sarcophagidae settle on the same corpse. In cases of homicide or suspicious death it is essential to determine the time elapsed since death; this estimate, the expected post-mortem interval, is based on the rate of larval growth. In this article we aim to highlight the value of forensic entomology for scientists who will apply it in their forensic research and case studies, and to inform judges, prosecutors and law-enforcement agents so that entomological samples are considered reliable and applicable evidence, just as biological stains and hairs are. We are of the opinion that if a forensic entomologist is called to the crime scene, or if the evidence is collected and then delivered to an entomologist, forensic cases will be elucidated faster and more accurately.

Full Text Available In this paper we posit that current investigative techniques, particularly as deployed by law enforcement, are becoming unsuitable for most types of crime investigation. The growth in cybercrime and the complexity of its forms, coupled with limitations in time and resources, both computational and human, put an increasing strain on the ability of digital investigators to apply the processes of digital forensics and digital investigations and obtain timely results. To combat these problems, there is a need to make better use of the resources available and to move beyond the capabilities and constraints of the forensic tools in current use. We argue that more intelligent techniques are necessary and should be used proactively. The paper makes the case for such tools and techniques, investigates and discusses the opportunities afforded by applying the principles and procedures of artificial intelligence to digital forensics intelligence and to intelligent forensics, and suggests that applying new techniques to digital investigations offers the opportunity to address the challenges of the larger and more complex domains in which cybercrimes are taking place.

Reports the state-of-the-art in seismology and earthquake engineering that is being advanced in Central and South America. Provides basic information on seismological station locations in Latin America and some of the programmes in strong-motion seismology, as well as some of the organizations involved in these activities.-from Author

Full Text Available This paper discusses the development of a South African model for Live Forensic Acquisition - Liforac. The Liforac model is a comprehensive model that presents a range of aspects related to Live Forensic Acquisition. The model provides forensic...

Forensic anthropology is the application of biological or physical anthropology in the service of justice. One main area is the analysis of human remains. Such analyses involve person identification by assessment of age and sex of the deceased, and comparison with ante-mortem data. Another major area is the analysis of surveillance pictures and videos. Such analyses may comprise facial and bodily morphological comparisons, multi-angle photogrammetry and gait analysis. We also perform studies of human remains for archaeologists.

Full Text Available A wide range of satellite methods is now applied in seismology. The first applications of satellite data to earthquake research date from the 1970s, when active faults were mapped on satellite images, a straightforward extrapolation of air-photo geological interpretation methods into space. The modern embodiment of this method is lineament analysis, in which time series of lineaments on the Earth's surface are investigated before and after an earthquake. A further application of satellite data in seismology involves geophysical methods. Electromagnetic methods have roughly as long a history of application in seismology: stable statistical estimates of the ionosphere-lithosphere relation were obtained from satellite ionosondes, and the most successful current project, DEMETER, shows impressive results. Satellite thermal infrared data were applied to earthquake research next; numerous results have confirmed earlier observations of thermal anomalies on the Earth's surface prior to earthquakes, and a modern trend is the use of outgoing long-wave radiation for earthquake research. In the 1980s a new technology, satellite radar interferometry, opened a new page, with spectacular pictures of co-seismic deformation; current research is moving towards the detection of pre-earthquake deformation. GPS technology is also widely used in seismology, both for ionosphere sounding and for ground-movement detection. Satellite gravimetry demonstrated its first very impressive results with the catastrophic Indonesian earthquake of 2004. Relatively new applications of remote sensing to seismology, such as atmospheric sounding, gas observations and cloud analysis, are considered possible candidates for application.

The recent popularity of research on topics of multimedia forensics justifies reflections on the definition of the field. This paper devises an ontology that structures forensic disciplines by their primary domain of evidence. In this sense, both multimedia forensics and computer forensics belong to the class of digital forensics, but they differ notably in the underlying observer model that defines the forensic investigator’s view on (parts of) reality, which itself is not fully cognizable. Important consequences on the reliability of probative facts emerge with regard to available counter-forensic techniques: while perfect concealment of traces is possible for computer forensics, this level of certainty cannot be expected for manipulations of sensor data. We cite concrete examples and refer to established techniques to support our arguments.

Insects are used in a variety of ways in forensic science and the developing area of forensic acarology may have a similar range of potential. This short account summarises the main ways in which entomology currently contributes to forensic science and discusses to what extent acarology might also contribute in these areas.

We analyzed the state of practice for software development in the seismology domain by comparing 30 software packages on four aspects: product, implementation, design, and process. We found room for improvement in most seismology software packages. The principal areas of concern include a lack of adequate requirements and design specification documents, a lack of test data to assess reliability, a lack of examples to get new users started, and a lack of technological tools to assist with managing the development process. To assist going forward, we provide recommendations for a document-driven development process that includes a problem statement, development plan, requirement specification, verification and validation (V&V) plan, design specification, code, V&V report, and a user manual. We also provide advice on tool use, including issue tracking, version control, code documentation, and testing tools.

Necrophagous insects are important in the decomposition of cadavers. The close association between insects and corpses, and the use of insects in medicocriminal investigations, is the subject of forensic entomology. The present paper reviews the historical background of this discipline and important postmortem processes, and discusses the scientific basis underlying attempts to determine the time interval since death. Using medical techniques, such as the measurement of body temperature or the analysis of livor and rigor mortis, time since death can only be accurately measured for the first two or three days after death. In contrast, by calculating the age of immature insect stages feeding on a corpse and analysing the necrophagous species present, postmortem intervals from the first day to several weeks can be estimated. These entomological methods may be hampered by difficulties associated with species identification, but modern DNA techniques are contributing to the rapid and authoritative identification of necrophagous insects. Other uses of entomological data include the toxicological examination of necrophagous larvae from a corpse, to identify and estimate drugs and toxicants ingested by the person when alive, and the proof of possible postmortem manipulation. Forensic entomology may even help in investigations dealing with people who are alive but in need of care, by revealing information about cases of neglect.
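The larval-age calculation mentioned above is commonly framed in terms of accumulated degree days (ADD): thermal energy summed above a species-specific developmental threshold. The sketch below is purely illustrative; the threshold and ADD requirement are hypothetical values, not validated species data:

```python
def accumulated_degree_days(daily_temps_c, base_temp_c):
    """Sum thermal energy above a species-specific developmental
    threshold; development effectively pauses below that threshold."""
    return sum(max(t - base_temp_c, 0.0) for t in daily_temps_c)

def days_to_reach(target_add, mean_temp_c, base_temp_c):
    """Minimum post-mortem interval (in days) implied by a required ADD
    at a constant mean scene temperature (illustrative only)."""
    per_day = max(mean_temp_c - base_temp_c, 0.0)
    return target_add / per_day

# Hypothetical example: a species needing 180 ADD above a 10 degC
# threshold, at a 25 degC mean scene temperature, implies a minimum
# post-mortem interval of 12 days.
print(days_to_reach(180.0, 25.0, 10.0))  # 12.0
```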

Strategic Decision Analysis (SDA) is the evolving body of knowledge on how to achieve high quality in the decisions that shape an organization's future. SDA comprises philosophy, process concepts, methodology, and tools for making good decisions, and specifically incorporates many concepts and tools from economic evaluation and risk analysis. Chevron Petroleum Technology Company (CPTC) has applied SDA to evaluate and prioritize a number of its most important and most uncertain R and D projects, including borehole seismology. Before SDA, there were significant issues and concerns about the value to CPTC of continuing to work on borehole seismology. The SDA process created a cross-functional team of experts to structure and evaluate the project: a credible economic model was developed, discrete risks and continuous uncertainties were assessed, and an extensive sensitivity analysis was performed. The results, even for a very restricted drilling program over a few years, were good enough to demonstrate the value of continuing the project. This paper explains the SDA philosophy, concepts, and process, and demonstrates the methodology and tools using the borehole seismology project as an example. SDA is useful in the upstream industry not just for R and D/technology decisions, but also for major exploration and production decisions. Since a major challenge for upstream companies today is to create and realize value, the SDA approach should have very broad applicability.
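As a toy illustration of the kind of risked economic evaluation SDA involves, consider a probability-weighted comparison of continuing versus stopping an R and D project. All probabilities, payoffs, and costs below are hypothetical, not CPTC figures:

```python
def expected_value(outcomes):
    """Probability-weighted value of a risky project; `outcomes` is a
    list of (probability, value) pairs summing to probability 1."""
    return sum(p * v for p, v in outcomes)

# Hypothetical decision: continue the project (risked payoff minus an
# upfront cost of 5) versus stop (zero). Values are illustrative only.
continue_ev = expected_value([(0.3, 50.0),    # technical success, high value
                              (0.5, 10.0),    # partial success
                              (0.2, -20.0)])  # failure with cleanup cost
continue_ev -= 5.0                            # cost of continuing
stop_ev = 0.0
decision = "continue" if continue_ev > stop_ev else "stop"
print(decision, round(continue_ev, 1))  # continue 11.0
```

A sensitivity analysis, as described above, would repeat this calculation while varying each probability or value in turn to see which assumptions drive the decision.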

Revisits and reviews Imre Lakatos' ideas on "Falsification and the Methodology of Scientific Research Programmes." Suggests that Lakatos' framework offers an insightful way of looking at the relationship between theory and research that is relevant not only for evaluating research programs in theoretical physics, but in the social…

Introduction A successful workshop titled 'Measuring the Rotation Effects of Strong Ground Motion' was held simultaneously in Menlo Park and Pasadena via video conference on 16 February 2006. The purpose of the Workshop and of this Report is to summarize existing data and theory and to explore future challenges for rotational seismology, including free-field strong motion, structural strong motion, and teleseismic motions. We also forged a consensus on the plan of work to be pursued by this international group in the near term. This first workshop had 16 participants in Menlo Park, 13 in Pasadena, and a few on the telephone. It was organized by William H. K. Lee and John R. Evans and chaired by William U. Savage in Menlo Park and by Kenneth W. Hudnut in Pasadena. Its agenda is given in the Appendix. This workshop and efforts in Europe led to the creation of the International Working Group on Rotational Seismology (IWGoRS), an international volunteer group providing forums for the exchange of ideas and data as well as hosting a series of workshops and special sessions. IWGoRS created a Web site, backed by an FTP site, for the distribution of materials related to rotational seismology. At present, the FTP site contains the 2006 Workshop agenda (also given in the Appendix below) and its PowerPoint presentations, as well as many papers (posted with permission of their authors), a comprehensive citations list, and related information. Eventually, the Web site will become the sole authoritative source for IWGoRS and shared information: http://www.rotational-seismology.org ftp://ehzftp.wr.usgs.gov/jrevans/IWGoRS_FTPsite/ With contributions from various authors during and after the 2006 Workshop, this Report proceeds from the theoretical bases for making rotational measurements (Graizer, Safak, Trifunac) through the available observations (Huang, Lee, Liu, Nigbor), proposed suites of measurements (Hudnut), a discussion of broadband teleseismic rotational

The book is an easy-to-follow guide with clear instructions on various mobile forensic techniques. The chapters and the topics within are structured for a smooth learning curve, which will swiftly empower you to master mobile forensics. If you are a budding forensic analyst, consultant, engineer, or a forensic professional wanting to expand your skillset, this is the book for you. The book will also be beneficial to those with an interest in mobile forensics or wanting to find data lost on mobile devices. It will be helpful to be familiar with forensics in general but no prior experience is re

The construction of seismological community services for the European Plate Observing System Research Infrastructure (EPOS) is by now well under way. A significant number of services are already operational, largely based on those existing at established institutions or collaborations such as ORFEUS, EMSC, AHEAD and EFEHR, and more are being added, to be ready for internal validation by late 2017. In this presentation we focus on a number of issues related to the interaction of the community of users with the services provided by the seismological part of the EPOS research infrastructure. How users interact with a service (and how satisfied they are with this interaction) is viewed as one important component of the validation of a service within EPOS, and is certainly key to the uptake of a service and thereby also to its attributed value. Within EPOS Seismology, the following aspects of user interaction have already surfaced:
- User identification (and potential tracking) versus ease of access and openness: requesting users to identify themselves when accessing a service provides various advantages to providers and users (e.g. quantifying and qualifying the service use, customization of services and interfaces, handling access rights and quotas), but may impact the ease of access and also deter users who do not wish to be identified for whatever reason.
- Service availability versus cost: there is a clear and prominent connection between the availability of a service, both regarding uptime and capacity, and its operational cost (IT systems and personnel), and it is often not clear where to draw the line (and based on which considerations). Related to this, how best to utilize third-party IT infrastructures (commercial or public), and what the long-term cost implications of that might be, is equally open.
- Licensing and attribution: the issue of intellectual property and associated licensing policies for data, products and services is only recently gaining

After the successful completion of the EPOS Preparatory Phase, the community of European Research Infrastructures in Seismology is now moving ahead with the build-up of the Thematic Core Service (TCS) for Seismology in EPOS, EPOS-Seismology. Seismology is a domain where European-level infrastructures have been developed for decades, often supported by large-scale EU projects. Today these infrastructures provide services to access earthquake waveforms (ORFEUS), parameters (EMSC) and hazard data and products (EFEHR). The existing organizations constitute the backbone of infrastructures that will continue to manage and host the services of the TCS EPOS-Seismology in the future. While the governance and internal structure of these organizations will remain active and continue to provide direct interaction with the community, EPOS-Seismology will provide their integration within EPOS. The main challenge in the build-up of the TCS EPOS-Seismology is to improve and extend these existing services, producing a single framework that is technically, organizationally and financially integrated with the EPOS architecture, and to further engage various kinds of end users (e.g. scientists, engineers, public managers, citizen scientists). On the technical side the focus lies on four major tasks:
- the construction of the next-generation software architecture for the European Integrated (waveform) Data Archive EIDA: developing advanced metadata and station information services, fully integrating strong-motion waveforms and derived parametric engineering-domain data, and advancing the integration of mobile (temporary) networks and OBS deployments in EIDA;
- the further development and expansion of services to access seismological products of scientific interest as provided by the community: implementing a common collection and development (IT) platform, and improving the earthquake information services, e.g. by introducing more robust quality indicators and diversifying

Explains the implementation of forensic science in an integrated curriculum and discusses the advantages of this approach. Lists the forensic science course syllabi studied in three high schools. Discusses the unit on polymers in detail. (YDS)

The aim of forensic speaker recognition is to establish links between individuals and criminal activities, through audio speech recordings. This field is multidisciplinary, combining predominantly phonetics, linguistics, speech signal processing, and forensic statistics. On these bases, expert-based

If you are a forensic analyst or an information security professional wanting to develop your knowledge of Android forensics, then this is the book for you. Some basic knowledge of the Android mobile platform is expected.

This paper reports advances in seismic waveform description and discovery leading to a new seismological service, and presents the key steps in its design, implementation and adoption. This service, named WFCatalog (for waveform catalogue), accommodates features of seismological waveform data, meeting seismologists' need to select waveform data based on seismic waveform features as well as sensor geolocations and temporal specifications. We describe the collaborative design methods and the technical solution, showing the central role of seismic feature catalogues in framing the technical and operational delivery of the new service; we also provide an overview of the complex environment in which this endeavour is scoped and discuss the related challenges. As multi-disciplinary, multi-organisational and global collaboration is necessary to address today's challenges, canonical representations can provide a focus for collaboration and conceptual tools for agreeing on directions. Such collaborations can be fostered and formalised by rallying intellectual effort into the design of novel scientific catalogues and the services that support them. This work offers an example of the benefits generated by involving cross-disciplinary skills (e.g. data and domain expertise) from the early stages of design, and by sustaining engagement with the target community throughout the delivery and deployment process.

Ice-covered ocean worlds possess diverse energy sources and associated mechanisms that are capable of driving significant seismic activity, but to date no measurements of their seismic activity have been obtained. Such investigations could reveal their transport properties and radial structures, with possibilities for locating and characterizing trapped liquids that may host life, and could yield critical constraints on redox fluxes and thus on habitability. Modeling efforts have examined seismic sources from tectonic fracturing and impacts. Here, we describe other possible seismic sources, their associations with science questions constraining habitability, and the feasibility of implementing such investigations. We argue, by analogy with the Moon, that detectable seismic activity should occur frequently on tidally flexed ocean worlds, because their ices fracture more easily than rocks and dissipate more tidal energy. These worlds should also create less thermal noise, owing to their greater distance from the Sun and consequently smaller diurnal temperature variations, and they lack substantial atmospheres (except in the case of Titan) that would create additional noise. Thus, seismic experiments could be less complex and less susceptible to noise than prior or planned planetary seismology investigations of the Moon or Mars. Key Words: Seismology-Redox-Ocean worlds-Europa-Ice-Hydrothermal. Astrobiology 18, 37-53.

We critique and extend theory on organizational sensemaking around three themes. First, we investigate sense arising non-productively and so beyond any instrumental relationship with things; second, we consider how sense is experienced through mood as well as our cognitive skills of manipulation … research by revisiting Weick's seminal reading of Norman Maclean's book surrounding the tragic events of a 1949 forest fire at Mann Gulch, USA…

Forensic computing (sometimes also called digital forensics, computer forensics or IT forensics) is a branch of forensic science pertaining to digital evidence, i.e., any legal evidence that is processed by digital computer systems or stored on digital storage media. Forensic computing is a new discipline evolving within the intersection of several established research areas such as computer science, computer engineering and law. Forensic computing is rapidly gaining importance since the...

Glaciology began with a focus on understanding basic mechanical processes and producing physical models that could explain the principal observations. Recently, however, more attention has been paid to the wealth of recent observations, with many modeling efforts relying on data assimilation and empirical scalings, rather than being based on first-principles physics. Notably, ice sheet models commonly assume that subglacial friction is characterized by a "slipperiness" coefficient that is determined by inverting surface velocity observations. Predictions are usually then made by assuming these slipperiness coefficients are spatially and temporally fixed. However, this is only valid if slipperiness is an unchanging material property of the bed and, despite decades of work on subglacial friction, it has remained unclear how to best account for such subglacial physics in ice sheet models. Here, we describe how basic seismological concepts and observations can be used to improve our understanding and determination of subglacial friction. First, we discuss how standard models of granular friction can and should be used in basal friction laws for marine ice sheets, where very low effective pressures exist. We show that under realistic West Antarctic Ice Sheet conditions, standard Coulomb friction should apply in a relatively narrow zone near the grounding line and that this should transition abruptly as one moves inland to a different, perhaps Weertman-style, dependence of subglacial stress on velocity. We show that this subglacial friction law predicts significantly different ice sheet behavior even as compared with other friction laws that include effective pressure. Secondly, we explain how seismological observations of water flow noise and basal icequakes constrain subglacial physics in important ways. Seismically observed water flow noise can provide constraints on water pressures and channel sizes and geometry, leading to important data on subglacial friction
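The abrupt Coulomb-to-Weertman transition discussed above can be sketched by taking the basal shear stress to be the lesser of the two laws: an effective-pressure-limited Coulomb bound near the grounding line and a rate-dependent Weertman branch inland. All parameter values below are placeholders, not calibrated to West Antarctic conditions:

```python
def basal_shear_stress(u, N, C=0.5, A=2.0e5, m=3.0):
    """Illustrative hybrid friction law: basal stress is the lesser of
    a Coulomb bound C*N (effective pressure N in Pa) and a Weertman-style
    rate dependence A*u**(1/m) (sliding speed u in m/yr). Parameter
    values are placeholders for illustration only."""
    coulomb = C * N                 # plastic, effective-pressure-limited bound
    weertman = A * u ** (1.0 / m)   # rate-strengthening sliding
    return min(coulomb, weertman)

# Near the grounding line the effective pressure N is very low, so the
# Coulomb bound controls; far inland N is large and the Weertman branch
# controls, giving the abrupt spatial transition described above.
near_gl = basal_shear_stress(u=100.0, N=1.0e4)   # Coulomb-limited
inland = basal_shear_stress(u=100.0, N=1.0e7)    # Weertman-limited
print(near_gl < inland)  # True
```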

Full Text Available The development of live forensic acquisition in general presents a remedy for some of the problems introduced by traditional forensic acquisition. However, this live forensic acquisition introduces a variety of additional problems, unique...

The title of my bachelor's thesis is "Forensic linguistics: Applications of forensic linguistics methods to anonymous letters". Forensic linguistics is a young and not widely known branch of applied linguistics. This thesis aims to introduce forensic linguistics and its methods. It has two parts, theory and practice. The theoretical part provides information about forensic linguistics in general, its two basic aspects utilized in forensic science, and the respective methods. The practical part t...

Full Text Available The application schema layer of a Database Management System (DBMS) can be modified to deliver results that may warrant a forensic investigation. Table structures can be corrupted by changing the metadata of a database, or operators of the database can be altered to deliver incorrect results when used in queries. This paper discusses categories of possibilities that exist to alter the application schema, with some practical examples. Two forensic environments in which a forensic investigation can take place are introduced, and arguments are provided for why these environments are important. Methods are presented for how these environments can be achieved for the application schema layer of a DBMS, and a process is proposed for how forensic evidence should be extracted from this layer. The application schema forensic evidence identification process can be applied to a wide range of forensic settings.

One of the challenges in real-time seismology is the prediction of an earthquake's impact. This is particularly true for moderate earthquakes (around magnitude 6) located close to urbanised areas, where the slightest uncertainty in event location, depth, or magnitude estimates, and/or misevaluation of propagation characteristics, site effects and building vulnerability, can dramatically change the impact scenario. The Euro-Med Seismological Centre (EMSC) has developed a cyberinfrastructure to collect observations from eyewitnesses in order to provide in-situ constraints on actual damage. This cyberinfrastructure benefits from the natural convergence of earthquake eyewitnesses on the EMSC website (www.emsc-csem.org), the second global earthquake information website, within tens of seconds of the occurrence of a felt event. It includes classical crowdsourcing tools, such as online questionnaires available in 39 languages and tools to collect geolocated pictures, and it also comprises information derived from real-time analysis of the traffic on the EMSC website, a method named flashsourcing. In the case of a felt earthquake, eyewitnesses reach the EMSC website within tens of seconds to find out the cause of the shaking they have just been through. By analysing their geographical origin through their IP addresses, we automatically detect felt earthquakes and in some cases map the damaged areas through the loss of Internet visitors. We recently implemented a Quake Catcher Network (QCN) server in collaboration with Stanford University and the USGS to collect ground-motion records made by volunteers, and we are also involved in a project to detect earthquakes from the motion sensors of smartphones. Strategies have been developed for several social media (Facebook, Twitter...) not only to distribute earthquake information but also to engage with citizens and optimise data collection. A smartphone application is currently under development. We will present an overview of this
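A toy version of the flashsourcing idea, flagging regions whose website visits in a short time window suddenly exceed their usual baseline, might look as follows. The region codes, baselines, and threshold factor are all hypothetical, not EMSC's actual algorithm:

```python
from collections import Counter

def detect_surge(visit_regions, baseline_per_region, factor=5):
    """Flag regions whose visit count in the current time window exceeds
    `factor` times their usual baseline: a toy version of detecting felt
    earthquakes from a sudden convergence of eyewitnesses on a website.
    `visit_regions` is a list of region codes resolved from visitor IPs."""
    counts = Counter(visit_regions)
    return sorted(
        region for region, n in counts.items()
        if n >= factor * baseline_per_region.get(region, 1)
    )

# Hypothetical window: visits from "GR-Attica" jump well above baseline,
# while the other regions stay at normal levels.
window = ["GR-Attica"] * 40 + ["FR-Paris"] * 3 + ["IT-Rome"] * 2
baseline = {"GR-Attica": 2, "FR-Paris": 3, "IT-Rome": 2}
print(detect_surge(window, baseline))  # ['GR-Attica']
```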

Seismology as a science is driven by observing and understanding data, and it is thus vital to make this as easy and accessible as possible. The growing volume of freely available data, coupled with ever-expanding computational power, enables scientists to take on new and bigger problems. This evolution is in part hindered by existing data formats that were not designed with it in mind. We present ASDF (http://seismic-data.org), the Adaptable Seismic Data Format, a novel, modern, and especially practical data format for all branches of seismology, with particular focus on how it is incorporated into seismic full-waveform inversion workflows. The format aims to solve five key issues:
- Efficiency: fast I/O operations, especially in high-performance computing environments, in particular by limiting the total number of files.
- Data organization: different types of data are needed for a variety of tasks, which results in ad hoc data organization and formats that are hard to maintain, integrate, reproduce, and exchange.
- Data exchange: we want to exchange complex and complete data sets.
- Reproducibility: oftentimes simply non-existent, but crucial to advance our science.
- Mining, visualization, and understanding of data: as data volumes grow, new and more complex techniques to query and visualize large datasets are needed.
ASDF tackles these by defining a structure on top of HDF5, reusing as many existing standards (QuakeML, StationXML, PROV) as possible. An essential trait of ASDF is that it enables the construction of completely self-describing data sets, including waveform, station, and event data together with non-waveform data and a provenance description of everything. This, for example, enables for the first time the proper archival and exchange of processed or synthetic waveforms. To aid community adoption we developed mature tools in Python as well as in C and Fortran. Additionally we provide a formal definition of the format, a validation tool, and integration into widely used
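Conceptually, a self-describing data set bundles waveforms, station and event metadata, and provenance in one container, so a processed trace carries its own history. The sketch below uses a plain dictionary and JSON purely for illustration; the actual ASDF format is built on HDF5 with QuakeML, StationXML, and PROV, and all field names here are invented:

```python
import json

# Conceptual sketch (not the real HDF5 layout): everything needed to
# interpret a waveform travels with it in a single object.
dataset = {
    "waveforms": {
        "XX.STA1": {"samples": [0.1, -0.2, 0.3], "sampling_rate_hz": 20.0}
    },
    "station_metadata": {"XX.STA1": {"lat": 0.0, "lon": 0.0}},  # StationXML role
    "events": [{"id": "evt-1", "magnitude": 5.5}],              # QuakeML role
    "provenance": [                                             # PROV role
        {"step": "detrend", "applied_to": "XX.STA1"},
        {"step": "bandpass", "applied_to": "XX.STA1", "band_hz": [0.1, 1.0]},
    ],
}

# Serializing and restoring the container loses nothing: the processing
# history of the trace is recoverable from the data set itself.
blob = json.dumps(dataset)
restored = json.loads(blob)
print(restored["provenance"][1]["step"])  # bandpass
```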

Among the seismological programs for manipulating seismic data is the SGRAPH program. It consists of integrated tools to perform advanced seismological techniques. SGRAPH is a system for maintaining and analyzing seismic waveform data in a stand-alone, Windows-based application that manipulates a wide range of data formats. SGRAPH was described in detail in the first part of this paper. In this part, I discuss the advanced techniques included in the program and their applications in seismology. Because of the numerous tools included in the program, SGRAPH alone is sufficient to perform basic waveform analysis and to solve advanced seismological problems. In the first part of this paper, applications to source-parameter estimation and hypocentral location were given. Here, I discuss the SGRAPH waveform-modeling tools. This paper exhibits examples of how to apply the SGRAPH tools to perform waveform modeling for estimating the focal mechanisms and crustal structure of local earthquakes.

As threats against digital assets have risen, there is a need to expose and eliminate hidden risks and threats. The ability to do so is called “cyber forensics.” Cyber penetrators have adopted more sophisticated tools and tactics that endanger operations on a global scale. These attackers are also using anti-forensic techniques to hide evidence of a cyber crime. Cyber forensics tools must increase their robustness and counteract these advanced persistent threats. This paper f...

Since the introduction in the mid-1980s of minisatellite analyses for DNA profiling, a revolution has taken place in forensic genetics. The subsequent invention of PCR made it possible to develop forensic genetics tools that allow both very informative routine investigations and ever more advanced, special investigations in cases concerning crime, paternity, relationship, disaster victim identification, etc. The present review gives an update on the use of DNA investigations in forensic genetics.

This thesis aims to describe a process framework suitable for conducting digital forensics investigation projects in support of forensic audit. A selection of existing digital forensics investigation frameworks was subjected to a criteria-based comparison. The new framework described here is the result of combining and enhancing those frameworks that suited the characteristics of forensic audit. The thesis also discusses digital forensics methods for fraud examination and risk assessment as a par...

Developments in forensic mass spectrometry tend to follow, rather than lead, the developments in other disciplines. Examples of techniques having forensic potential born independently of forensic applications include ambient ionization, imaging mass spectrometry, isotope ratio mass spectrometry, portable mass spectrometers, and hyphenated chromatography-mass spectrometry instruments, to name a few. Forensic science has the potential to benefit enormously from developments that are funded by other means, if only the infrastructure and personnel existed to adopt, validate, and implement the new technologies into casework. Perhaps one unique area in which forensic science is at the cutting edge is in the area of chemometrics and the determination of likelihood ratios for the evaluation of the weight of evidence. Such statistical techniques have been developed most extensively for ignitable-liquid residue analyses and isotope ratio analysis. This review attempts to capture the trends, motivating forces, and likely impact of developing areas of forensic mass spectrometry, with the caveat that none of this research is likely to have any real impact in the forensic community unless: (a) The instruments developed are turned into robust black boxes with red and green lights for positives and negatives, respectively, or (b) there are PhD graduates in the workforce who can help adopt these sophisticated techniques.

One of the primary goals of educational seismology programs is to bring inquiry-based research to the middle- and high-school classroom setting. Although often stated as a long-term goal of science outreach programs, in practice there are many barriers to research in the school setting, among them an increasing emphasis on test-oriented training, decreasing interest and participation in science fairs, limited teacher confidence and experience in mentoring research, insufficient student preparedness for research projects, and the short duration of university involvement (typically limited to brief one-day encounters). For the past three-plus years we have tried to address these issues through a focused outreach program we call the PEPP Research Fellows Program. This is treated as an honors program in which high-school teachers in our group nominate students with interests in science careers. These students are invited to participate in the program, and those who elect to take part attend a one-day education and training session in the fall. Rather than leave research projects completely open, we direct the students toward one of two specific, group-oriented projects (in our case, one focusing on local recordings of mining explosions, and a second on teleseismic body-wave analysis), but we encourage them to act as independent researchers and follow topics of interest. The students then work on seismic data from the local educational network or from the IRIS facilities. Following several months of informal interaction with teachers and students (email, web conferencing, etc.), we bring the students and teachers to our university for a weekend research symposium in the spring. Students present their work in oral or poster form, and prizes are given for the best papers. Projects range from highly local (records of seismic noise at school X) to larger-scale regional projects (analysis of teleseismic P-wave delays at PEPP network stations). From 20 to

The last decade has demonstrated that seismic waves and tsunamis are coupled to the ionosphere. Observations of Total Electron Content (TEC) and airglow perturbations of unique quality and amplitude were made during the giant 2011 Tohoku, Japan earthquake, and observations of much smaller tsunamis, down to a few cm of sea uplift, are now routinely made, including for the Kuril 2006, Samoa 2009, Chile 2010 and Haida Gwaii 2012 tsunamis. This new branch of seismology is now mature enough to tackle the new challenges associated with the inversion of these data, with the goal either of deriving from these data maps or profiles of the Earth's surface vertical displacement (and therefore crucial information for tsunami warning systems), or of inverting, with combined ground and ionospheric data sets, the various parameters (atmospheric sound speed, viscosity, collision frequencies) controlling the coupling between the surface, the lower atmosphere and the ionosphere. We first present the state of the art in the modeling of tsunami-atmosphere coupling, including the slight perturbations in tsunami phase and group velocity and the dependence of the coupling strength on local time, ocean depth and season. We then show the confrontation of modelled signals with observations. For tsunamis, this is done with the different types of measurements that have proven ionospheric tsunami detection over the last 5 years (ground and space GPS, airglow), while we focus on GPS and GOCE observations for seismic waves. These observation systems allowed tracking of the propagation of the signal from the ground (with GPS and seismometers) to the neutral atmosphere (with infrasound sensors and GOCE drag measurements) to the ionosphere (with GPS TEC and airglow, among other ionospheric sounding techniques). Modelling with different techniques (normal modes, spectral element methods, finite differences) is used and shown. While the fits of the waveforms are generally very good, we analyse the differences and draw directions for future

The U.S. Geological Survey's Albuquerque Seismological Laboratory (ASL) has several efforts underway to improve data quality at its stations. The Data Quality Analyzer (DQA) is one such development. The DQA is designed to characterize station data quality in a quantitative and automated manner. Station quality is based on the evaluation of various metrics, such as timing quality, noise levels, sensor coherence, and so on. These metrics are aggregated into a measurable grade for each station. The DQA consists of a website, a metric calculator (Seedscan), and a PostgreSQL database. The website allows the user to make requests for various time periods, review specific networks and stations, adjust the weighting of a station's grade, and plot metrics as a function of time. The website dynamically loads all station data from the PostgreSQL database. The database is central to the application; it acts as a hub where metric values and limited station descriptions are stored. Data is stored at the level of one sensor's channel per day. The database is populated by Seedscan, which reads and processes miniSEED data to generate metric values. Seedscan, written in Java, compares hashes of metadata and data to detect changes and perform subsequent recalculations. This ensures that the metric values are up to date and accurate. Seedscan can be run as a scheduled task or on demand, computing the metrics specified in its configuration file. While many metrics are currently in development, some are completed and being actively used. These include: availability, timing quality, gap count, deviation from the New Low Noise Model, deviation from a station's noise baseline, inter-sensor coherence, and data-synthetic fits. In all, 20 metrics are planned, but any number could be added. ASL is actively using the DQA on a daily basis for station diagnostics and evaluation. As Seedscan is scheduled to run every night, data quality analysts are then able to use the
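The change-detection idea behind Seedscan (hash the data and metadata, and recompute a metric only when the hash differs) can be sketched as follows. The class, key format and metric function are hypothetical stand-ins, not Seedscan's real Java interfaces.

```python
import hashlib

class MetricCache:
    """Recompute a metric only when the hash of (data, metadata) changes."""

    def __init__(self, metric_fn):
        self.metric_fn = metric_fn
        self.cache = {}         # key -> (digest, metric value)
        self.computations = 0   # bookkeeping: shows when recomputation is skipped

    def value(self, key, data, metadata):
        # Hash raw data bytes together with a canonical form of the metadata.
        digest = hashlib.sha256(
            data + repr(sorted(metadata.items())).encode()
        ).hexdigest()
        hit = self.cache.get(key)
        if hit is not None and hit[0] == digest:
            return hit[1]                      # unchanged: reuse stored value
        self.computations += 1                 # changed or new: recompute
        val = self.metric_fn(data)
        self.cache[key] = (digest, val)
        return val
```

In a nightly run over thousands of channel-days, only the entries whose data or metadata actually changed trigger metric recalculation, which is the point of the hash comparison.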

Forensic DNA testing has a number of applications, including parentage testing, identifying human remains from natural or man-made disasters or terrorist attacks, and solving crimes. This article provides background information followed by an overview of the process of forensic DNA testing, including sample collection, DNA extraction, PCR amplification, short tandem repeat (STR) allele separation and sizing, typing and profile interpretation, statistical analysis, and quality assurance. The article concludes with discussions of possible problems with the data and other forensic DNA testing techniques.

Real-time computational seismology is now achievable, but it requires a tight connection between seismic databases and high-performance computing. We have developed a real-time moment tensor monitoring system (RMT) using continuous BATS records and the centroid moment tensor (CMT) inversion technique. A real-time online earthquake simulation service is also open to researchers and to public earthquake-science education (ROS). Combining RMT with ROS, an earthquake report based on computational seismology can be provided within 5 minutes of an earthquake's occurrence (RMT obtains the point-source information and ROS completes a 3D simulation in real time). For more information, visit the real-time computational seismology earthquake report webpage (RCS).

This popular article in Physics World reviews the application of Fourier transform infrared (FTIR) spectromicroscopy to forensics, and predicts further applications due to the high inherent signal-to-noise ratio available for FTIR microscopy at synchrotron sources.

Nuclear forensics is the investigation of nuclear materials to find evidence of, for example, the source, the trafficking, and the enrichment of the material. The material can be recovered from various sources, including dust from the vicinity of a nuclear facility or radioactive debris following a nuclear explosion. Results of nuclear forensic testing are used by different organizations to make decisions. The information is typically combined with other sources of information, such as law enforcement and intelligence information.

This paper describes procedures for conducting forensic examinations of Apple Macs running Mac OS X. The target disk mode is used to create a forensic duplicate of a Mac hard drive and preview it. Procedures are discussed for recovering evidence from allocated space, unallocated space, slack space and virtual memory. Furthermore, procedures are described for recovering trace evidence from Mac OS X default email, web browser and instant messaging applications, as well as evidence pertaining to commands executed from a terminal.

Approved for public release; distribution is unlimited. Computer forensics involves the preservation, identification, extraction and documentation of computer evidence stored in the form of magnetically encoded information. With the proliferation of e-commerce initiatives and increasing criminal activity on the web, this area of study is catching on in the IT industry and among law enforcement agencies. The objective of the study is to explore the techniques of computer forensics ...

accepting such complex moves need not be hand-designed. Instead, they are automatically determined by the underlying probabilistic model, which is in turn calibrated via historical data and scientific knowledge. Consider a small seismic event which generates weak signals at several different stations, which might independently be mistaken for noise. A birth move may nevertheless hypothesize an event jointly explaining these detections. If the corresponding waveform data then aligns with the seismological knowledge encoded in the probabilistic model, the event may be detected even though no single station observes it unambiguously. Alternatively, if a large outlier reading is produced at a single station, moves which instantiate a corresponding (false) event would be rejected because of the absence of plausible detections at other sensors. More broadly, one of the main advantages of our MCMC approach is its consistent handling of the relative uncertainties in different information sources. By avoiding low-level thresholds, we expect to improve accuracy and robustness. At the conference, we will present results quantitatively validating our approach, using ground-truth associations and locations provided either by simulation or human analysts.
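The accept/reject behaviour described above can be illustrated with a toy Metropolis-style acceptance probability for a birth move: the proposed event is accepted with probability min(1, prior odds × likelihood ratio). The Gaussian amplitude model and every parameter value below are deliberately simplified assumptions for illustration, not the authors' actual probabilistic model.

```python
import math

def log_norm(x, mu, sigma=1.0):
    """Log density of a normal distribution (used for both noise and event models)."""
    return -0.5 * ((x - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))

def birth_acceptance_prob(amplitudes, event_mu=2.0, log_prior_odds=-2.0):
    """Acceptance probability for proposing a new event that predicts mean
    amplitude `event_mu` at every station, versus explaining all readings
    as noise (mean 0). `log_prior_odds` encodes that events are a priori rare."""
    log_ratio = log_prior_odds + sum(
        log_norm(a, event_mu) - log_norm(a, 0.0) for a in amplitudes
    )
    return min(1.0, math.exp(log_ratio))
```

With this toy model, several marginal detections that jointly fit the event hypothesis yield an acceptance probability of 1, while a single large outlier with nothing at the other stations is usually rejected, mirroring the two scenarios in the text.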

The results from seismological studies that are used by the engineering community are just one of the benefits obtained from research aimed at mitigating the earthquake hazard. In this issue of the Earthquake Information Bulletin, current programs in seismology and earthquake engineering, seismic networks, future plans and some of the cooperative programs with different international organizations are described by Latin American seismologists. The article describes the development of seismology in Latin America and the seismological interests of the OAS. -P.N.Chroston

The dominant conception of forensic science as a patchwork of disciplines primarily assisting the criminal justice system (i.e. forensics) is in crisis, or at least shows a series of anomalies and serious limitations. In recent years, symptoms of the crisis have been discussed in a number of reports by various commentators, without a doubt epitomized by the 2009 report by the US National Academies of Sciences (NAS 2009 Strengthening forensic science in the United States: a path forward). Although needed, and often viewed as the solution to these drawbacks, the almost generalized adoption of stricter business models in forensic science casework, compounded with ever-increasing normative and compliance processes, not only places additional pressure on a discipline that already appears to be in difficulty, but also induces more fragmentation of the different forensic science tasks, a tendency denounced many times by the same NAS report and other similar reviews. One may ask whether these issues are not simply the result of an unfit paradigm. If this is the case, the current problems faced by forensic science may indicate future significant changes for the discipline. To facilitate broader discussion, this presentation focuses on trace evidence, an area that is seminal to forensic science for both epistemological and historical reasons. There is, however, little doubt that this area is currently under siege worldwide. Current and future challenges faced by trace evidence are discussed, along with some possible answers. The current situation ultimately presents some significant opportunities to re-invent not only trace evidence but also forensic science. Ultimately, a distinctive, more robust and more reliable science may emerge through rethinking the forensics paradigm built on specialisms, revisiting fundamental forensic science principles and adapting them to the twenty-first century. PMID:26101285

Computational intelligence techniques have been widely explored in various domains, including forensics. Forensic analysis encompasses the study of pattern analysis that answers questions of interest in security, medical, legal, and genetic studies, among others. However, forensic analysis is usually performed through laboratory experiments, which are expensive in both cost and time. Therefore, this book seeks to explore the progress and advancement of computational intelligence techniques in different focus areas of forensic studies, aiming to build a stronger connection between computer scientists and forensic field experts. This book, Computational Intelligence in Digital Forensics: Forensic Investigation and Applications, is the first volume in the Intelligent Systems Reference Library series. The book presents original research results and innovative applications of computational intelligence in digital forensics. This edited volume contains seventeen chapters and presents the latest state-of-the-art advancement ...

Seismological training tends to occur within the isolation of a particular institution, with a limited set of tools (codes, libraries) that are often not transferable outside it. Here we propose to overcome these limitations with a community-driven library of Jupyter notebooks dedicated to training on any aspect of seismology, for purposes of education and outreach, on-site or archived code tutorials, classroom instruction, and research. A Jupyter notebook (jupyter.org) is an open-source interactive computational environment that combines code execution, rich text, mathematics, and plotting. It can be considered a platform that supports reproducible research, as all inputs and outputs may be stored. Text, external graphics and equations can be handled in Markdown (incl. LaTeX) format. Jupyter notebooks run in standard web browsers, can be easily exchanged in text format, or converted to other documents (e.g. PDF, slide shows). They provide an ideal format for practical training in seismology. A pilot platform was set up with a dedicated server so that the Jupyter notebooks can be run in any browser (PC, tablet, smartphone). We show the functionalities of the Seismo-Live platform with examples from computational seismology, seismic data access and processing using the ObsPy library, seismic inverse problems, and others. The current examples all use the Python programming language, but any free language can be used. Potentially, such community platforms could be integrated with the EPOS-IT infrastructure and extended to other fields of Earth sciences.

This paper draws a comparison of fundamental theories in traditional forensic science and the state of the art in current computer forensics, thereby identifying a certain disproportion between the perception of central aspects in common theory and the digital forensics reality. We propose a separation of what is currently demanded of practitioners in digital forensics into a rigorous scientific part on the one hand, and a more general methodology of searching and seizing digital evidence an...

Full Text Available With the explosion of digital crime, digital forensics is increasingly often applied. The digital forensics discipline has developed rather rapidly, but to date there has been very little international standardization with regard to processes, procedures or management...

Part 5: CLOUD FORENSICS. The rapid migration from traditional computing and storage models to cloud computing environments has made it necessary to support reliable forensic investigations in the cloud. However, current cloud computing environments often lack support for forensic investigations, and the trustworthiness of evidence is often questionable because of the possibility of collusion between dishonest cloud providers, users and forensic investigators. This chapt...

In the past 50 years forensic psychological practice has expanded dramatically. Because the practice of forensic psychology differs in important ways from more traditional practice areas (Monahan, 1980) the "Specialty Guidelines for Forensic Psychologists" were developed and published in 1991 (Committee on Ethical Guidelines for Forensic…

Human genetic variation is a major resource in forensics, but it does not allow all forensically relevant questions to be answered. Some questions may instead be addressable via epigenomics, as the epigenome acts as an interface between the fixed genome and the dynamic environment. We envision future forensic applications of DNA methylation analysis that will broaden DNA-based forensic intelligence. Together with genetic prediction of appearance and biogeographic ancestry, epigenomic lifestyle prediction is expected to increase the ability of police to find unknown perpetrators of crime who are not identifiable using current forensic DNA profiling.

In the present talk the fundamentals of nuclear forensic investigations will be discussed, followed by the detailed standard operating procedure (SOP) for nuclear forensic analysis. Characteristics such as dimensions, particle size, and elemental and isotopic composition help the nuclear forensic analyst in source attribution of the interdicted material, as the specifications of the nuclear materials used by different countries differ. The analysis of elemental composition can be done by SEM-EDS, XRF, CHNS analysers, etc., depending upon the type of material. Often the trace constituents (analysed by ICP-AES, ICP-MS, AAS, etc.) provide valuable information about the processes followed during the production of the material. Likewise, the isotopic composition, determined by thermal ionization mass spectrometry, provides useful information about the enrichment of the nuclear fuel and hence its intended use.

Researchers can spend their time reverse engineering, performing reverse analysis, or making substantive contributions to digital forensics science. Although work in all of these areas is important, it is the scientific breakthroughs that are the most critical for addressing the challenges that we face. Reverse engineering is the traditional bread-and-butter of digital forensics research. Companies like Microsoft and Apple deliver computational artifacts (operating systems, applications and phones) to the commercial market. These artifacts are bought and used by billions. Some have evil intent, and (if society is lucky) the computers end up in the hands of law enforcement. Unfortunately, the original vendors rarely provide digital forensics tools that make their systems amenable to analysis by law enforcement. Hence the need for reverse engineering.

In a strange turn of history, the threat of global nuclear war has gone down, but the risk of a nuclear attack has gone up. The danger of nuclear terrorism and ways to thwart it, tackle it and manage it in the event of an attack is increasingly gaining the attention of nuclear analysts all over the world. There is rising awareness among nuclear experts to develop mechanisms to prevent, deter and deal with the threat of nuclear terrorism. Nuclear specialists are seeking to develop and improve the science of nuclear forensics so as to provide faster analysis during a crisis. Nuclear forensics can play an important role in detecting illicit nuclear materials to counter trafficking in nuclear and radiological materials. An effective nuclear forensic and attribution strategy can enable policy makers, decision makers and technical managers to respond to situations involving interception of special nuclear materials

Autopsy numbers in Australian hospitals have declined markedly during the past decade despite evidence of a relatively static rate of demonstrable clinical misdiagnosis during this time. The reason for this decrease in autopsy numbers is multifactorial and may include a general lack of clinical and pathologic interest in the autopsy with a possible decline in autopsy standard, a lack of clinicopathologic correlation after autopsies, and an increased emphasis on surgical biopsy reporting within hospital pathology departments. Although forensic autopsies are currently maintaining their numbers, it is incumbent on forensic pathologists to demonstrate the wealth of important information a carefully performed postmortem examination can reveal. To this end, the Pathology Division of the Victorian Institute of Forensic Medicine has instituted a program of minimum standards in varied types of coroner cases and commenced a system of internal and external audit. The minimum standard for a routine, sudden, presumed natural death is presented and the audit system is discussed.

The International Union of Geological Sciences (IUGS) Initiative on Forensic Geology (IFG) was set up in 2011 to promote and develop the applications of geology to policing and law enforcement throughout the world. This includes the provision of crime scene examinations, searches to locate graves or items of interest that have been buried beneath the ground surface as part of a criminal act, and geological trace analysis and evidence. Forensic geologists may assist the police and law enforcement in a range of ways, including, for example: homicide, sexual assaults, counter-terrorism, kidnapping, humanitarian incidents, environmental crimes, precious minerals theft, and fakes and fraudulent crimes. The objective of this paper is to consider the geoethical aspects of forensic geology. This includes both its delivery in research and teaching, and its contribution to the practical applications of forensic geology in casework. The case examples cited are based on the personal experiences of the authors. Often, the technical and scientific aspects of a forensic geology investigation may be the most straightforward; after all, this is what the forensic geologist has been trained to do. The associated geoethical issues can be the most challenging and complex to manage. Generally, forensic geologists are driven to carry out their research or casework with integrity, honesty and in a manner that is law-abiding, professional, socially acceptable and highly responsible. This is necessary in advising the law enforcement organisations, society and the scientific community that they represent. As the science of forensic geology begins to advance around the world, it is desirable to establish a standard set of principles and values and to provide an agreed ethical framework. But what are these core values? Who is responsible for producing them? How may they be enforced? What happens when geoethical standards are breached? This paper does not attempt to provide all of the answers, as further work

Introduction Although effects of rotational motions due to earthquakes have long been observed (e.g., Mallet, 1862), nevertheless Richter (1958, p. 213) stated that: 'Perfectly general motion would also involve rotations about three perpendicular axes, and three more instruments for these. Theory indicates, and observation confirms, that such rotations are negligible.' However, Richter provided no references for this claim. Seismology is based primarily on the observation and modeling of three-component translational ground motions. Nevertheless, theoretical seismologists (e.g., Aki and Richards, 1980, 2002) have argued for decades that the rotational part of ground motions should also be recorded. It is well known that standard seismometers are quite sensitive to rotations and therefore subject to rotation-induced errors. The paucity of observations of rotational motions is mainly the result of a lack, until recently, of affordable rotational sensors of sufficient resolution. Nevertheless, in the past decade, a number of authors have reported direct observations of rotational motions and rotations inferred from rigid-body rotations in short-baseline accelerometer arrays, creating a burgeoning library of rotational data. For example, ring laser gyros in Germany and New Zealand have led to the first significant and consistent observations of rotational motions from distant earthquakes (Igel et al., 2005, 2007). A monograph on Earthquake Source Asymmetry, Structural Media and Rotation Effects was published recently as well by Teisseyre et al. (2006). Measurement of rotational motions has implications for: (1) recovering the complete ground-displacement history from seismometer recordings; (2) further constraining earthquake rupture properties; (3) extracting information about subsurface properties; and (4) providing additional ground motion information to earthquake engineers for seismic design. A special session on Rotational Motions in Seismology was convened by H
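The rigid-body rotations inferred from short-baseline accelerometer arrays, mentioned above, follow from horizontal displacement gradients: for rotation about the vertical axis, theta_z is approximately 0.5 * (du_y/dx - du_x/dy). A minimal one-sided finite-difference sketch follows; the station geometry and displacement values are synthetic illustrations.

```python
def rotation_about_z(d, u00, ux0, u0y):
    """Estimate vertical-axis rotation from three stations at (0, 0), (d, 0)
    and (0, d), separated by baseline d (metres).

    Each argument u* is the horizontal displacement (u_x, u_y) at that station;
    theta_z ~ 0.5 * (du_y/dx - du_x/dy) using one-sided finite differences.
    """
    duy_dx = (ux0[1] - u00[1]) / d   # gradient of u_y along x
    dux_dy = (u0y[0] - u00[0]) / d   # gradient of u_x along y
    return 0.5 * (duy_dx - dux_dy)
```

For a pure rigid-body rotation by a small angle theta about z, the displacement field is u = (-theta*y, theta*x), and the finite-difference estimate recovers theta exactly because the field is linear; real arrays must additionally separate rotation from strain and cope with noise.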

Nuclear forensics corresponds to the forensic analysis of nuclear materials. The samples analysed may either be those that are confiscated during any act of smuggling or that is retrieved from a postexplosion debris. The characterisation of the material is based on the isotopic composition, physical and chemical compositions, age and history of the material which are determined by suitable analytical techniques. The interpretation of the analytical results is necessary to understand the details of the material such as its provenance, the industrial history of the material as well as the implications of the probable use of the material

The Amazon Kindle eBook reader supports a wide range of capabilities beyond reading books. This functionality includes an inbuilt cellular data connection known as Whispernet. The Kindle provides web browsing, an application framework, eBook delivery and other services over this connection. The historic data left by user interaction with this device may be of forensic interest. Analysis of the Amazon Kindle device has resulted in a method to reliably extract and interpret data from these devices in a forensically complete manner.

The lithospheric study and the identification of relevant lateral heterogeneities in the Antarctic continent and its borderlands are essential to understand the geodynamic evolution of both the continental and the bordering oceanic regions. The complexity of the geological evolution and the structural properties of the lithosphere in the Scotia area have been stressed by many authors. The present setting of the area is the result of the mutual interaction among the Antarctic, South American and several minor plates, whose geodynamic history and actual boundaries are still partially unknown. The intense seismic activity that characterizes the region encourages the use of the seismological approach to investigate the lithospheric structure of the area. Since January 1992 a broad-band, three-component station has been operating at the Antarctic base Esperanza in the NE area of the Antarctic Peninsula. The station was installed, with the financial support of the Italian Programma Nazionale di Ricerche in Antartide (PNRA), by the Osservatorio Geofisico Sperimentale (OGS) and the Instituto Antartico Argentino (IAA). Russi et al. (1994) analyzed selected recordings using the frequency-time analysis (FTAN) method, obtaining some relevant information on the large-scale structure of the lithosphere in the Scotia region even though only data recorded by a single station were available. The extension of our analysis to further events and to horizontal-component records is presented here. Within the framework of international co-operation on the Antarctic Seismographic Network, the OGS and the IAA are upgrading the Esperanza station and installing an additional broad-band station near the town of Ushuaia (Tierra del Fuego, Argentina), with the financial support of the PNRA. The inversion of the dispersion curves through FTAN of the signals recorded by an increased number of stations, generated by events with source-station paths spanning the region, will allow us to extract the elastic and anelastic

We propose an extensible format definition for seismic data (QuakeML). Sharing data and seismic information efficiently is one of the most important issues for research and observational seismology in the future. The eXtensible Markup Language (XML) is playing an increasingly important role in the exchange of a variety of data. Due to its extensible definition capabilities, its wide acceptance and the existing large number of utilities and libraries for XML, a structured representation of various types of seismological data should in our opinion be developed by defining a 'QuakeML' standard. Here we present the QuakeML definitions for parameter databases and further efforts, e.g. a central QuakeML catalog database and a web portal for exchanging codes and stylesheets.
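Because QuakeML is plain XML, its event parameters can be read with any standard XML library. The sketch below parses a minimal QuakeML-style snippet with Python's standard library; the element names and namespaces follow the published QuakeML 1.2 schema, but the snippet itself is an illustrative fragment, not a complete, validated catalog.

```python
import xml.etree.ElementTree as ET

# A minimal QuakeML-style fragment (illustrative, not a full catalog).
QUAKEML = """<q:quakeml xmlns:q="http://quakeml.org/xmlns/quakeml/1.2"
               xmlns="http://quakeml.org/xmlns/bed/1.2">
  <eventParameters publicID="smi:local/catalog">
    <event publicID="smi:local/event/1">
      <magnitude publicID="smi:local/mag/1">
        <mag><value>5.2</value></mag>
      </magnitude>
    </event>
  </eventParameters>
</q:quakeml>"""

NS = {"bed": "http://quakeml.org/xmlns/bed/1.2"}

def event_magnitudes(xml_text):
    """Return the magnitude values of all events in the document."""
    root = ET.fromstring(xml_text)
    mags = []
    for event in root.iter("{http://quakeml.org/xmlns/bed/1.2}event"):
        value = event.find("bed:magnitude/bed:mag/bed:value", NS)
        if value is not None:
            mags.append(float(value.text))
    return mags

print(event_magnitudes(QUAKEML))  # [5.2]
```

In practice a dedicated library (e.g. ObsPy's event handling) would be used instead of hand-written XPath queries, but the example shows why an XML-based standard makes seismological metadata immediately accessible to generic tooling.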

Forensic science is broadly defined as the application of science to matters of the law. Practitioners typically use multidisciplinary scientific techniques for the analysis of physical evidence in an attempt to establish or exclude an association between a suspect and the scene of a crime.

The objective of this presentation is to share three case studies from the Institute of Transuranium Elements (ITU) which describe the application of nuclear forensics to events where nuclear and other radioactive material was found to be out of regulatory control.

The aim of the investigation is to define as clearly as possible the specific forensic psychiatric characteristics of persons who committed or attempted homicide due to jealousy (the nature and severity of psychopathology, the level of responsibility, danger to the community, intensity and nature of aggression, the victimological dimension, and the relation between alcohol and jealousy). A retrospective method based on forensic psychiatric expert reports from the period 1975-1999 was used, encompassing 200 examinees who committed or attempted murder. The results show the connection of psychotic jealousy with the highest degree of danger in the diagnostic categories of paranoid psychosis and paranoid schizophrenia. The time span from the first manifestations of jealousy to the actual commission of the crime is longest in personality disorders and shortest in schizophrenia. Exogenous provoking situations were dominant in homicides committed due to jealousy in personality disorders. Acute alcohol intoxication has a specific significance in crimes due to jealousy in the same diagnostic category. Clear criteria were designed for the forensic psychiatric evaluation of murder and attempted homicide caused by jealousy, which will be of help in everyday practice in field forensic work and treatment.

differences. CONCLUSIONS: Noninvasive in situ PMCT methods for organ measuring, as performed in this study, are not useful tools in forensic pathology. The best method to estimate organ volume is a CT-scan of the eviscerated organ. PMCT-determined CTR seems to be useless for ascertaining cardiomegaly...

The forensic potential of soil and geological evidence has been recognized for more than a century, but in the last 15 years these types of evidence have been used much more widely both as an investigative intelligence tool and as evidence in court.

Exploring the internal structure of planetary objects is fundamental to understand the evolution of our solar system. In contrast to Earth, planetary seismology is hampered by the limited number of stations available, often just a single one. Classic seismology is based on the measurement of three components of translational ground motion. Its methods are mainly developed for a larger number of available stations. Therefore, the application of classical seismological methods to other planets is very limited. Here, we show that the additional measurement of three components of rotational ground motion could substantially improve the situation. From sparse or single station networks measuring translational and rotational ground motions it is possible to obtain additional information on structure and source. This includes direct information on local subsurface seismic velocities, separation of seismic phases, propagation direction of seismic energy, crustal scattering properties, as well as moment tensor source parameters for regional sources. The potential of this methodology will be highlighted through synthetic forward and inverse modeling experiments.
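One of the direct single-station measurements mentioned above is local phase velocity: for a transversely polarized plane wave, transverse acceleration and vertical rotation rate are proportional, with the apparent horizontal phase velocity as the scale factor. The sketch below estimates that velocity from collocated synthetic records; the sampling, frequency, and velocity values are illustrative assumptions, and sign conventions are omitted.

```python
import math

def apparent_phase_velocity(accel, rot_rate):
    """
    Estimate apparent horizontal phase velocity from collocated
    transverse acceleration and vertical rotation-rate records,
    using a least-squares amplitude ratio a = 2*c*omega_dot
    (plane-wave assumption).
    """
    num = sum(a * a for a in accel)
    den = sum(a * r for a, r in zip(accel, rot_rate))
    return num / (2.0 * den)

# Synthetic plane SH wave: a(t) = sin(2*pi*f*t), rot_rate = a(t) / (2c)
c_true = 3000.0  # m/s, assumed phase velocity
t = [i * 0.01 for i in range(1000)]       # 10 s at 100 Hz
accel = [math.sin(2 * math.pi * 1.0 * ti) for ti in t]
rot = [a / (2.0 * c_true) for a in accel]

print(round(apparent_phase_velocity(accel, rot)))  # 3000
```

With a single real station the same ratio, evaluated in narrow frequency bands, yields a local dispersion curve without needing an array.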

In recent years the Python ecosystem has evolved into one of the most powerful and productive scientific environments across disciplines. ObsPy (https://www.obspy.org) is a fully community-driven, open-source project dedicated to providing a bridge for seismology into that ecosystem. It does so by offering: read and write support for essentially every commonly used data format in seismology, with a unified interface and automatic format detection, covering waveform data (MiniSEED, SAC, SEG-Y, Reftek, …) as well as station (SEED, StationXML, …) and event metadata (QuakeML, ZMAP, …); integrated access to the largest data centers, web services, and real-time data streams (FDSNWS, ArcLink, SeedLink, …); a powerful signal processing toolbox tuned to the specific needs of seismologists; and utility functionality such as travel time calculations with the TauP method, geodetic functions, and data visualizations. ObsPy has been in constant development for more than seven years and is developed and used by scientists around the world, with successful applications in all branches of seismology. It nowadays also serves as the foundation for a large number of more specialized packages. This presentation will give a short overview of the capabilities of ObsPy and point out several representative or new use cases. Additionally we will discuss the road ahead as well as the long-term sustainability of open-source scientific software.
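Automatic format detection of the kind a unified read interface performs usually comes down to inspecting file signatures. The stand-alone sketch below illustrates the dispatch idea with simplified checks; these are not ObsPy's actual detectors, and the MiniSEED check in particular is a rough caricature of the v2 fixed record header.

```python
def guess_format(header_bytes):
    """Illustrative format dispatch by leading bytes (simplified stand-ins)."""
    if header_bytes[:4] == b"\x89HDF":          # HDF5-based containers
        return "HDF5"
    if header_bytes[:5] == b"<?xml":            # XML metadata (StationXML, QuakeML)
        return "XML"
    if len(header_bytes) > 6 and header_bytes[6:7] in (b"D", b"R", b"Q", b"M") \
            and header_bytes[:6].isdigit():     # MiniSEED v2: sequence number + quality
        return "MSEED"
    return "unknown"

print(guess_format(b"000001D" + b"\x00" * 40))  # MSEED
print(guess_format(b"<?xml version='1.0'?>"))   # XML
```

The payoff of this design is that users call one function on any file and the library routes it to the right parser, rather than forcing the caller to know the format in advance.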

The fundamentals of oxidative phosphorylation and photophosphorylation are revisited. New experimental data on the involvement of succinate and malate anions respectively in oxidative phosphorylation and photophosphorylation are presented. These new data offer a novel molecular mechanistic...

Thirty-five experts in the fields of geology, geophysics, and engineering, from academia, government, and industry, were invited to participate in a workshop and address the many problems of national and global concern that require seismological expertise for their solutions. This report reviews the history, accomplishments, and status of seismology; assesses changing trends in seismological research and applications; and recommends future directions in the light of these changes and of the growing needs of society in areas in which seismology can make significant contributions. The first part of the volume discusses areas of opportunity (understanding earthquakes and reducing their hazards; exploration, energy, and resources; understanding the earth and planets) and realizing the benefits (the roles of Federal, state, and local governments, industry, and universities). The second part, Background and Progress, briefly considers each of the following topics: the birth and early growth of seismology, nuclear test monitoring and its scientific ramifications, instrumentation and data processing, geodynamics and plate tectonics, theoretical seismology, structure and composition of the earth, exploration seismology, seismic exploration for minerals, earthquake source mechanism studies, engineering seismology, strong ground motion and related earthquake hazards, volcanoes, tsunamis, planetary seismology, and international aspects of seismology. 26 figures. (RWR)

This paper reports on an investigation into the skills and competencies of forensic learning disability nurses in the United Kingdom. The two sample populations were forensic learning disability nurses from the high, medium, and low secure psychiatric services and non-forensic learning disability nurses from generic services. An information gathering schedule was used to collect the data; of 1200 schedules, 643 were returned for a response rate of 53.5%. The data identified the "top ten" problems that forensic learning disability nurses may encounter, the skills and competencies necessary to overcome them, and the areas that need to be developed in the future. The results indicated that the forensic learning disability nurses tended to focus on the physical aspects of the role whilst the non-forensic learning disability nurses tended to perceive the forensic role in relational terms. This has implications for practice, policy, and procedures.

Earthquake rate and recurrence information comes primarily from geology, geodesy, and seismology. Geology gives the longest temporal perspective, but it reveals only surface deformation, relatable to earthquakes only with many assumptions. Geodesy is also limited to surface observations, but it detects evidence of the processes leading to earthquakes, again subject to important assumptions. Seismology reveals actual earthquakes, but its history is too short to capture important properties of very large ones. Unfortunately, the ranges of these observation types barely overlap, so that integrating them into a consistent picture adequate to infer future prospects requires a great deal of trust. Perhaps the most important boundary is the temporal one at the beginning of the instrumental seismic era, about a century ago. We have virtually no seismological or geodetic information on large earthquakes before then, and little geological information after. Virtually all modern forecasts of large earthquakes assume some form of equivalence between tectonic and seismic moment rates as functions of location, time, and magnitude threshold. That assumption links geology, geodesy, and seismology, but it invokes a host of other assumptions and incurs very significant uncertainties. Questions include temporal behavior of seismic and tectonic moment rates; shape of the earthquake magnitude distribution; upper magnitude limit; scaling between rupture length, width, and displacement; depth dependence of stress coupling; value of crustal rigidity; and relation between faults at depth and their surface fault traces, to name just a few. In this report I'll estimate the quantitative implications for estimating large earthquake rate. Global studies like the GEAR1 project suggest that surface deformation from geology and geodesy best shows the geography of very large, rare earthquakes in the long term, while seismological observations of small earthquakes best forecast moderate earthquakes
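The moment-rate equivalence assumption can be made concrete with the standard Hanks-Kanamori moment-magnitude relation and the usual expression for tectonic moment rate. In the sketch below, the fault dimensions, slip rate, and rigidity are illustrative assumptions (rigidity 3e10 Pa is a common crustal value, and one of the uncertain quantities the text lists), not values from any specific fault.

```python
def seismic_moment(mw):
    """Hanks-Kanamori: scalar moment M0 in N*m from moment magnitude Mw."""
    return 10 ** (1.5 * mw + 9.1)

def tectonic_moment_rate(slip_rate_m_per_yr, fault_area_m2, rigidity_pa=3e10):
    """Tectonic moment accumulation rate: M0_dot = mu * A * s (N*m per year)."""
    return rigidity_pa * fault_area_m2 * slip_rate_m_per_yr

# Hypothetical example: how often must a 100 km x 15 km fault slipping at
# 20 mm/yr host Mw 7.0 events if they alone release the accumulated moment?
rate = tectonic_moment_rate(slip_rate_m_per_yr=0.02,
                            fault_area_m2=100e3 * 15e3)
recurrence_yr = seismic_moment(7.0) / rate
print(round(recurrence_yr))  # 44
```

Even in this toy form, the sensitivity is visible: halving the assumed rigidity doubles the implied recurrence interval, which is one of the uncertainties the abstract emphasizes.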

ways in which such an acquisition should take place to ensure forensic soundness. The study presents information on a relatively new field of expertise and considers the Digital Forensic discipline, forensic tools, practical problems experienced during...

Easy, efficient and comprehensive access to data, data products, scientific services and scientific software is a key ingredient in enabling research at the frontiers of science. Organizing this access across the European Research Infrastructures in the field of seismology, so that it best serves user needs, takes advantage of state-of-the-art ICT solutions, provides cross-domain interoperability, and is organizationally and financially sustainable in the long term, is the core challenge of the implementation phase of the Thematic Core Service (TCS) Seismology within the EPOS-IP project. Building upon the existing European-level infrastructures ORFEUS for seismological waveforms, EMSC for seismological products, and EFEHR for seismological hazard and risk information, and implementing a pilot Computational Earth Science service starting from the results of the VERCE project, the work within the EPOS-IP project focuses on improving and extending the existing services and aligning them with global developments, to ultimately produce a well coordinated framework that is technically, organizationally, and financially integrated with the EPOS architecture. This framework needs to respect the roles and responsibilities of the underlying national research infrastructures, which are the data owners and main providers of data and products, and allow for active input and feedback from the (scientific) user community. At the same time, it needs to remain flexible enough to cope with unavoidable challenges in the availability of resources and dynamics of contributors. The technical work during the next years is organized in four areas: constructing the next-generation software architecture for the European Integrated (waveform) Data Archive EIDA; developing advanced metadata and station information services; fully integrating strong motion waveforms and derived parametric engineering-domain data; and advancing the integration of mobile (temporary) networks and OBS deployments in

In this edited volume on advances in forensic geotechnical engineering, a number of technical contributions by experts and professionals in this area are included. The work is the outcome of deliberations at various conferences in the area conducted by Prof. G.L. Sivakumar Babu and Dr. V.V.S. Rao as secretary and Chairman of Technical Committee on Forensic Geotechnical Engineering of International Society for Soil Mechanics and Foundation Engineering (ISSMGE). This volume contains papers on topics such as guidelines, evidence/data collection, distress characterization, use of diagnostic tests (laboratory and field tests), back analysis, failure hypothesis formulation, role of instrumentation and sensor-based technologies, risk analysis, technical shortcomings. This volume will prove useful to researchers and practitioners alike.

The lack of Cyber Forensics experts is a huge challenge facing the world today, arising from the infancy of Cyber Forensics training and education. The multidisciplinary nature of Cyber Forensics gives rise to diverse training programmes, from a few days' workshop to a postgraduate degree in Cyber Forensics. Consequently, this paper concentrates on analyzing Cyber Forensics training programmes in terms of a Competency-Based Framework. The study proves that Cyber Forensics training or education h...

The dissertation concerns digital forensics. The expression digital forensics (sometimes called digital forensic science) denotes the science that studies the identification, storage, protection, retrieval, documentation, use, and every other form of computer data processing, in order for such data to be evaluated in a legal trial. Digital forensics is a branch of forensic science. First of all, digital forensics represents the extension of theories, principles and procedures that are typical and importa...

Radiography can play an important part in forensic odontology, mainly to establish identification. This may take the precise form of comparison between antemortem and postmortem radiographs. Radiographs may also be taken to determine the age of a minor victim and even help in the assessment of the sex and ethnic group. Comparable radiographs are an essential factor to confirm identification in a mass disaster.

Introduction. Rape is a sexual act of violence in which physical strength is used. Criminal law imposes strict punishments for such crimes as rape. Psycho-pathologically, rape is among the gravest of crimes, often associated with extremely deviated behavior. This article deals with the forensic aspects of sexual violence in Bosnia and Herzegovina in the period from 2000-2004. We report about sexual assaults, personality of delinquents, motives and consequences of rape. Material and Methods. T...

The progress of forensic neutron activation analysis (FNAA) in Japan is described. FNAA began in 1965 and during the past 20 years many cases have been handled; these include determination of toxic materials, comparison examination of physical evidences (e.g., paints, metal fragments, plastics and inks) and drug sample differentiation. Neutron activation analysis is applied routinely to the scientific criminal investigation as one of multielement analytical techniques. This paper also discusses these routine works. (author) 14 refs

Tattooing refers to marking of the skin by puncturing and introducing pigmented material. Although it derives from a Polynesian word, tautau, decorative tattooing has been found in most societies over many centuries. The purpose of tattooing has varied from simple decoration, to a marker of social rank, criminal and noncriminal group membership, or a particular rite of passage in tribal communities. Tattooing may be used in medicine to mark areas for radiotherapy, and may occur inadvertently associated with certain occupations such as coal mining. Forensically, tattoos may be very useful in assisting with body identification if facial features or fingers have been damaged or removed. Aspects of a decedent's history may also be deduced from certain tattoos such as military tattoos in service personnel, rudimentary line tattoos with antisocial and anti-police messages in ex-prisoners, and syringes, marihuana leaves or mushrooms in illicit drug users. Tattoos have become more common in recent years in younger individuals in the West and so should be expected to be found with increasing incidence at the time of forensic autopsy examinations. Increasing population movements also mean that less common tattoos may be encountered during forensic evaluations.

The "Seismological Grand Challenges in Understanding Earth's Dynamic Systems," a community-written long-range science plan for the next decade, poses 10 questions to guide fundamental seismological research. Written in an approachable fashion suitable for policymakers, the broad questions and supporting discussion contained in this document offer an ideal framework for the development of undergraduate curricular materials. Leveraging this document, we have created a collection of inquiry-based classroom modules that utilize authentic data to modernize seismological instruction in 100- and 200-level undergraduate courses. The modules not only introduce undergraduates to the broad questions that the seismological community seeks to answer in the future but also showcase the numerous areas where modern seismological research is actively contributing to our understanding of fundamental Earth processes. To date 6 in-depth explorations that correspond to the Grand Challenges document have been developed. The specific topics for each exploration were selected to showcase modern seismological research while also covering topics commonly included in the curriculum of these introductory classes. Examples of activities that have been created, with their corresponding Grand Challenge, include: - A guided inquiry that introduces students to episodic tremor and slip and compares the GPS and seismic signatures of ETS with those produced by standard tectonic earthquakes (How do faults slip?). - A laboratory exercise where students engage in b-value mapping of volcanic earthquakes to assess potential eruption hazards (How do magmas ascend and erupt?). - A module that introduces students to glacial earthquakes in Greenland and compares their frequency and spatial distribution to tectonic earthquakes (How do processes in the ocean and atmosphere interact with the solid Earth?). - An activity, corresponding to the challenge "What is the relationship between stress and strain in the lithosphere?", that
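The b-value mapping exercise mentioned above rests on a standard estimator: Aki's (1965) maximum-likelihood b-value for a Gutenberg-Richter catalog. The sketch below applies it to a synthetic catalog; the completeness magnitude and catalog parameters are invented for illustration, and the binning correction used for real, rounded magnitudes is omitted.

```python
import math
import random

def b_value(mags, m_c):
    """Aki (1965) maximum-likelihood b-value for magnitudes >= m_c
    (continuous magnitudes, no binning correction):
    b = log10(e) / (mean(M) - m_c)."""
    above = [m for m in mags if m >= m_c]
    mean_m = sum(above) / len(above)
    return math.log10(math.e) / (mean_m - m_c)

# Synthetic Gutenberg-Richter catalog with true b = 1: magnitudes above
# the completeness level m_c = 2.0 are exponentially distributed.
random.seed(0)
beta = 1.0 * math.log(10)  # beta = b * ln(10)
catalog = [2.0 + random.expovariate(beta) for _ in range(5000)]
print(round(b_value(catalog, 2.0), 2))  # close to 1.0
```

In the classroom exercise, computing this estimate in sliding spatial windows around a volcano turns a one-line formula into a hazard-relevant map.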

The United States academic seismology community, through the National Science Foundation (NSF)-funded Incorporated Research Institutions for Seismology (IRIS) Consortium, has promoted and encouraged a rich environment of innovation and experimentation in areas such as seismic instrumentation, data processing and analysis, teaching and curriculum development, and academic science. As the science continually evolves, IRIS helps drive the market for new research tools that enable science by establishing a variety of standards and goals. This has often involved working directly with manufacturers to better define the technology required, co-funding key development work or early production prototypes, and purchasing initial production runs. IRIS activities have helped establish de-facto international standards and impacted the commercial sector in areas such as seismic instrumentation, open-access data management, and professional development. Key institutional practices, conducted and refined over IRIS' thirty-year history of operations, have focused on open-access data availability, full retention of maximum-bandwidth, continuous data, and direct community access to state-of-the-art seismological instrumentation and software. These practices have helped to cultivate and support a thriving commercial ecosystem, and have been a key element in the professional development of multiple generations of seismologists who now work in both industry and academia. Looking toward the future, IRIS is increasing its engagement with industry to better enable bi-directional exchange of techniques and technology, and enhancing the development of tomorrow's workforce. In this presentation, we will illustrate how IRIS has promoted innovations grown out of the academic community and spurred technological advances in both academia and industry.

The Inge Lehmann archive contains thousands of seismological work documents from Inge Lehmann's private home. For a long time the author thought that the main concern was to keep the documents for posterity. There is now renewed interest in Inge Lehmann; some documents were presented in a poster at ESC Potsdam 2004, and the collection of documents was scanned and catalogued in 2005-2006 at Storia Geofisica Ambiente in Bologna. Inge Lehmann (1888-1993) is famous for her discovery in 1936 of the Earth's inner core and for work on the upper mantle. A short biography is given. After her retirement in 1953 she worked at home in Denmark, and abroad in the USA and Canada. She took part in the creation of the European Seismological Commission in 1951, and in the creation of the International Seismological Centre in 1964. Inge Lehmann received many awards. Some letters from her early correspondence with Harold Jeffreys are discussed; they show how the inner core was discussed as early as 1932. A few of the author's reminiscences of Inge Lehmann are given.

It is widely recognized that reproducibility is crucial to advance science, but at the same time it is very hard to actually achieve. This results in it being recognized but also mostly ignored by a large fraction of the community. A key ingredient towards full reproducibility is to capture and describe the history of data, an issue known as provenance. We present SEIS-PROV, a practical format and data model to store provenance information for seismological data. In a seismological context, provenance can be seen as information about the processes that generated and modified a particular piece of data. For synthetic waveforms the provenance information describes which solver and settings therein were used to generate it. When looking at processed seismograms, the provenance conveys information about the different time series analysis steps that led to it. Additional uses include the description of derived data types, such as cross-correlations and adjoint sources, enabling their proper storage and exchange. SEIS-PROV is based on W3C PROV (http://www.w3.org/TR/prov-overview/), a standard for generic provenance information. It then applies an additional set of constraints to make it suitable for seismology. We present a definition of the SEIS-PROV format, a way to check if any given file is a valid SEIS-PROV document, and two sample implementations: One in SPECFEM3D GLOBE (https://geodynamics.org/cig/software/specfem3d_globe/) to store the provenance information of synthetic seismograms and another one as part of the ObsPy (http://obspy.org) framework enabling automatic tracking of provenance information during a series of analysis and transformation stages. This, along with tools to visualize and interpret provenance graphs, offers a description of data history that can be readily tracked, stored, and exchanged.
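The W3C PROV model underlying SEIS-PROV describes data history as entities connected to activities by usage and generation relations. The sketch below builds such a chain for a detrended waveform and walks it back to the raw input; the identifiers and attributes are invented for illustration, and real SEIS-PROV records use their own namespaces and attribute vocabulary.

```python
# A PROV-style record: entities (data), activities (processes), and the
# "used" / "wasGeneratedBy" relations connecting them.
def prov_record():
    entities = {
        "seis:waveform_raw": {"type": "waveform"},
        "seis:waveform_detrended": {"type": "waveform"},
    }
    activities = {
        "seis:detrend": {"attributes": {"method": "linear"}},
    }
    relations = [
        ("seis:detrend", "used", "seis:waveform_raw"),
        ("seis:waveform_detrended", "wasGeneratedBy", "seis:detrend"),
    ]
    return entities, activities, relations

def history_of(entity, relations):
    """Walk generation/usage edges back from an entity to its inputs."""
    chain = [entity]
    gen = {s: o for s, p, o in relations if p == "wasGeneratedBy"}
    use = {s: o for s, p, o in relations if p == "used"}
    node = entity
    while node in gen:
        node = gen[node]          # the activity that generated it
        chain.append(node)
        if node in use:
            node = use[node]      # the input that activity used
            chain.append(node)
    return chain

_, _, rels = prov_record()
print(history_of("seis:waveform_detrended", rels))
# ['seis:waveform_detrended', 'seis:detrend', 'seis:waveform_raw']
```

Serializing such graphs in a standard format is what lets provenance be "tracked, stored, and exchanged" independently of the software that produced the data.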

The Domain Name Service (DNS) is a critical core component of the global Internet and integral to the majority of corporate intranets. It provides resolution services between the human-readable name-based system addresses and the machine-operable Internet Protocol (IP) based addresses required for creating network-level connections. Structured as a globally dispersed, resilient tree data structure, from the Global and Country Code Top Level Domains (gTLD/ccTLD) down to the individual site and system leaf nodes, it is highly resilient yet vulnerable to various attacks, exploits and systematic failures. This paper examines the history and rapid growth of DNS up to its current critical status. It then explores the often overlooked value of DNS query data (from packet traces, DNS cache data, and DNS logs) and its use in system forensics and, more frequently, in network forensics, extrapolating examples and experiments that enhance knowledge. Continuing on, it details the common attacks that can be used directly against DNS systems and services, before following on with the malicious uses of DNS in direct system attacks, Distributed Denial of Service (DDoS) attacks, traditional Denial of Service (DoS) attacks, and malware. It explores both cyber-criminal activities and cyber-warfare based attacks, and also extrapolates from a number of more recent attacks the possible methods for data exfiltration. It explores some of the potential analytical methodologies, including common uses in Intrusion Detection Systems (IDS) as well as infection and activity tracking in malware traffic analysis, and covers some of the associated methods around technology designed to defend against, mitigate, and/or manage these and other risks, plus the effect that ISPs and nation states can have by direct manipulation of DNS queries and return traffic. This paper also investigates potential behavioural analysis and time-lining, which can then be used for the
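A simple instance of the forensic use of DNS logs described above is frequency analysis of queried names and record types, where unusual volumes of TXT queries to one domain can hint at tunneling or exfiltration. The sketch below counts queries from a made-up log format; real BIND or Unbound logs differ and would need their own patterns.

```python
import re
from collections import Counter

# Hypothetical resolver log (format invented for illustration).
LOG = """\
2024-01-05 10:01:22 client 10.0.0.5 query: www.example.com IN A
2024-01-05 10:01:23 client 10.0.0.5 query: data.exfil-host.net IN TXT
2024-01-05 10:01:24 client 10.0.0.9 query: www.example.com IN A
"""

QUERY_RE = re.compile(r"query:\s+(\S+)\s+IN\s+(\S+)")

def query_counts(log_text):
    """Count (domain, record type) pairs across all log lines."""
    return Counter(QUERY_RE.findall(log_text))

counts = query_counts(LOG)
print(counts[("www.example.com", "A")])        # 2
print(counts[("data.exfil-host.net", "TXT")])  # 1
```

The same counting approach extends naturally to time-lining: bucketing the matched timestamps exposes bursts of activity that plain totals would hide.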

Facilities like EarthScope and IRIS/PASSCAL offer a framework in which to re-assess the role of our highest-resolution geophysical tool, controlled-source seismology. This tool is effective across scales, from near-surface studies that focus on the upper 100 m of the crust to studies of Moho structure and the lithospheric mantle. IRIS has now existed for over two decades and has transformed the way in which passive-source seismology in particular is carried out. Progress over these two decades has led to major discoveries about continental architecture and evolution through the development of three-dimensional images of the upper mantle and lithosphere. Simultaneously the hydrocarbon exploration industry has mapped increasingly large fractions of our sedimentary basins in three dimensions and at unprecedented resolution and fidelity. Thanks to the additional instruments in the EarthScope facility, a clear scientific need and opportunity exists to map, at similar resolution, all of the crust - the igneous/metamorphic basement, the non-petroliferous basins that contain the record of continental evolution, and the seismogenic faults and active volcanoes that are the principal natural hazards we face. Controlled-source seismology remains the fundamental technology behind exploration for all fossil fuels and many water resources, and as such is a multi-billion-dollar industry centered in the USA. Academic scientists are leaders in developing the algorithms to process the most advanced industry data, but lack the academic data sets to which to apply this technology. University and government controlled-source seismologists, and their students who will populate the exploration industry, are increasingly divorced from that industry by their reliance on sparse spatial recording of usually only a single component of the wavefield, generated by even sparser seismic sources. However, if we can find the resources, the technology now exists to provide seismic images of immense

This concise and systematic account of the current state of this new branch of astrophysics presents the theoretical foundations of plasma astrophysics, magneto-hydrodynamics and coronal magnetic structures, taking into account the full range of available observation techniques -- from radio to gamma. The book discusses stellar loops during flare energy releases, MHD waves and oscillations, plasma instabilities and heating and charged particle acceleration. Current trends and developments in MHD seismology of solar and stellar coronal plasma systems are also covered, while recent p

A short account is given of the development and operation of a unit within Blacknest which acts as a centre for handling data received from overseas seismological array stations and stations in the British Isles and also exchanges data with other centres. The work has been carried out as a long-term experiment to assess the capability of small networks of existing research and development stations to participate in the monitoring of a possible future Comprehensive Test Ban treaty (CTB) and to gain experience of the operational requirements for Data Centres. A preliminary assessment of a UK National Technical Means (NTM) for verifying a CTB is obtained inter alia. (author)

Human genetic variation is a major resource in forensics, but does not allow all forensically relevant questions to be answered. Some questions may instead be addressable via epigenomics, as the epigenome acts as an interface between the fixed genome and the dynamic environment. We

A critical review of the conceptual and practical evolution of forensic anthropology during the last two decades serves to identify two key external factors and four tightly inter-related internal methodological advances that have significantly affected the discipline. These key developments have not only altered the current practice of forensic anthropology, but also its goals, objectives, scope, and definition. The development of DNA analysis techniques served to undermine the classic role of forensic anthropology as a field almost exclusively focused on victim identification. The introduction of the Daubert criteria in the courtroom presentation of scientific testimony accompanied the development of new human comparative samples and tools for data analysis and sharing, resulting in a vastly enhanced role for quantitative methods in human skeletal analysis. Additionally, new questions asked of forensic anthropologists, beyond identity, required sound scientific bases and expanded the scope of the field. This environment favored the incipient development of the interrelated fields of forensic taphonomy, forensic archaeology, and forensic trauma analysis, fields concerned with the reconstruction of events surrounding death. Far from representing the mere addition of new methodological techniques, these disciplines (especially, forensic taphonomy) provide forensic anthropology with a new conceptual framework, which is broader, deeper, and more solidly entrenched in the natural sciences. It is argued that this new framework represents a true paradigm shift, as it modifies not only the way in which classic forensic anthropological questions are answered, but also the goals and tasks of forensic anthropologists, and their perception of what can be considered a legitimate question or problem to be answered within the field.

Forensic archaeology is an extremely powerful investigative discipline and, in combination with forensic anthropology, can provide a wealth of evidentiary information to police investigators and the forensic community. The re-emergence of forensic archaeology and anthropology within Australia relies on its diversification and cooperation with established forensic medical organizations, law enforcement forensic service divisions, and national forensic boards. This presents a unique opportunity to develop a new multidisciplinary approach to forensic archaeology/anthropology within Australia as we hold a unique set of environmental, social, and cultural conditions that diverge from overseas models and require different methodological approaches. In the current world political climate, more forensic techniques are being applied at scenes of mass disasters, genocide, and terrorism. This provides Australian forensic archaeology/anthropology with a unique opportunity to develop multidisciplinary models with contributions from psychological profiling, ballistics, sociopolitics, cultural anthropology, mortuary technicians, post-blast analysis, fire analysis, and other disciplines from the world of forensic science.

Nuclear forensics is the analysis of intercepted illicit nuclear or radioactive material, and of any associated material, to provide evidence for nuclear attribution by determining the origin, history, transit routes and purpose of such material. Nuclear forensics activities include sampling of the illicit material, analysis of the samples, and evaluation of the attribution by comparing the analysed data with databases or numerical simulations. Because nuclear forensics methodologies provide hints about the origin of nuclear materials used in illegal dealings or nuclear terrorism, they contribute to identifying and indicting offenders and hence enhance the deterrent effect against such terrorism. A worldwide network on nuclear forensics can lead to a strengthened global nuclear security regime. At the ESARDA Symposium 2015, the results of research and development of fundamental nuclear forensics technologies performed at the Japan Atomic Energy Agency during the 2011-2013 term were reported, namely (1) a technique to analyse the isotopic composition of nuclear material, (2) a technique to identify the impurities contained in the material, (3) a technique to determine the age of purified material by measuring the isotopic ratio of daughter thorium to parent uranium, (4) a technique to produce image data by observing particle shapes with an electron microscope, and (5) a prototype nuclear forensics library for comparing the analysed data with a database in order to evaluate evidence such as origin and history. Japan’s capability in nuclear forensics and effective international cooperation are also discussed as contributions to the international nuclear forensics community.

In this paper we present a methodology and experimental results for evidence evaluation in the context of forensic face recognition. In forensic applications, the matching score (hereafter referred to as similarity score) from a biometric system must be represented as a Likelihood Ratio (LR). In our
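As a toy sketch of the idea (the Gaussian score models and all numbers here are hypothetical, not taken from the paper), a likelihood ratio can be formed as the ratio of the similarity-score densities under the same-source and different-source hypotheses:

```python
from statistics import NormalDist

# Hypothetical Gaussian models of the similarity score under the two
# competing hypotheses (in practice fitted from reference score sets).
genuine = NormalDist(mu=0.8, sigma=0.10)    # same-source ("genuine") scores
impostor = NormalDist(mu=0.3, sigma=0.15)   # different-source ("impostor") scores

def likelihood_ratio(score):
    """LR = p(score | same source) / p(score | different sources)."""
    return genuine.pdf(score) / impostor.pdf(score)

print(likelihood_ratio(0.75))   # LR > 1: supports the same-source hypothesis
print(likelihood_ratio(0.25))   # LR < 1: supports the different-source hypothesis
```

A real forensic system would estimate these densities from calibration data rather than assume Gaussians, but the evidential interpretation of the LR is the same.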

Besides a few papers that focus on the forensic aspects of automatic face recognition, little has been published about it, in contrast to the literature on developing new techniques and methodologies for biometric face recognition. In this report, we review forensic facial identification which is

Improvements in automatic face recognition over the last two decades have opened up new applications such as border control and camera surveillance. A new application field is forensic face recognition. Traditionally, face recognition by human experts has been used in forensics, but now there is a

Practical training in computational methodologies is still underrepresented in Earth science curricula despite the increasing use of sometimes highly sophisticated simulation and data processing technologies in research projects. At the same time, well-engineered community codes make it easy to return results, with the danger that the inherent traps of black-box solutions are not well understood. For this purpose we have initiated a community platform (www.seismo-live.org) where Python-based Jupyter notebooks can be accessed and run without any downloads or local software installation. The increasingly popular Jupyter notebooks allow combining markup language, graphics, and equations with interactive, executable Python code. The platform already includes general Python training, an introduction to the ObsPy library for seismology, as well as seismic data processing, noise analysis, and a variety of forward solvers for seismic wave propagation. In addition, we show an example of how Jupyter notebooks can be used to increase the reproducibility of published results. Submission of Jupyter notebooks for general seismology is encouraged. The platform can be used for complementary teaching in Earth science courses on compute-intensive research areas. We present recent developments and new features.

Practical training in computational methodologies is still underrepresented in Earth science curricula despite the increasing use of sometimes highly sophisticated simulation technologies in research projects. At the same time, well-engineered community codes make it easy to return simulation-based results, with the danger that the inherent traps of numerical solutions are not well understood. It is our belief that training with highly simplified numerical solutions (here, to the equations describing elastic wave propagation) with carefully chosen elementary ingredients of simulation technologies (e.g., finite differencing, function interpolation, spectral derivatives, numerical integration) could substantially improve this situation. For this purpose we have initiated a community platform (www.seismo-live.org) where Python-based Jupyter notebooks can be accessed and run without any downloads or local software installations. The increasingly popular Jupyter notebooks allow combining markup language, graphics, and equations with interactive, executable Python code. We demonstrate the potential with training notebooks for the finite-difference method, pseudospectral methods, finite/spectral element methods, and the finite-volume and discontinuous Galerkin methods. The platform already includes general Python training, an introduction to the ObsPy library for seismology, as well as seismic data processing and noise analysis. Submission of Jupyter notebooks for general seismology is encouraged. The platform can be used for complementary teaching in Earth science courses on compute-intensive research areas.
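The finite-difference method mentioned above can be illustrated with a deliberately minimal sketch: a second-order scheme for the 1D wave equation u_tt = c² u_xx on a string with fixed ends (grid size, time step, and initial pulse are arbitrary illustrative choices, not taken from the platform's notebooks):

```python
import math

# 1D wave equation u_tt = c^2 u_xx, second-order finite differences
# in both space and time, Dirichlet (fixed) boundaries.
nx, nt = 300, 200          # grid points, time steps (illustrative)
dx, c = 1.0, 1.0
dt = 0.5 * dx / c          # CFL-stable time step (Courant number 0.5)
r2 = (c * dt / dx) ** 2

# Gaussian displacement at the center, zero initial velocity.
u = [math.exp(-0.1 * (i - nx // 2) ** 2) for i in range(nx)]
u_prev = list(u)

for _ in range(nt):
    u_next = [0.0] * nx    # endpoints stay 0 (fixed boundaries)
    for i in range(1, nx - 1):
        u_next[i] = (2 * u[i] - u_prev[i]
                     + r2 * (u[i + 1] - 2 * u[i] + u[i - 1]))
    u_prev, u = u, u_next

# The pulse splits into two half-amplitude waves travelling outward.
peak = max(abs(v) for v in u)
```

Even this toy version exposes the "elementary ingredients" the abstract refers to: the stencil, the stability (CFL) condition, and the boundary treatment.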

Continuous seismic records near river channels can be used to quantify the energy induced by river sediment transport. During the 2011 typhoon season, we deployed a seismic array along the Chishan River in the mountain area of southern Taiwan, where there is strong variability in water discharge and high sedimentation rates. We observe hysteresis in the high-frequency (5-15 Hz) seismic noise level relative to the associated hydrological parameters. In addition, our seismic noise analysis reveals an asymmetry and a high coherence in noise cross-correlation functions for several station pairs during the typhoon passage, which corresponds to sediment particles and turbulent flows impacting along the riverbed where the river bends sharply. Based on spectral characteristics of the seismic records, we also detected 20 landslide/debris flow events, which we use to estimate the sediment supply. Comparison of sediment flux between seismologically determined bedload and derived suspended load indicates temporal changes in the sediment flux ratio, which imply a complex transition process from the bedload regime to the suspension regime between typhoon passage and off-typhoon periods. Our study demonstrates the possibility of seismologically monitoring river bedload transport, thus providing valuable additional information for studying fluvial bedrock erosion and mountain landscape evolution.
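As a toy illustration of the asymmetry effect described above (all numbers hypothetical): if the noise source, e.g. sediment impacts at a river bend, is recorded by one station before the other, the cross-correlation of the two noise records peaks at a nonzero lag:

```python
import math

def cross_correlate(a, b, max_lag):
    """Unnormalized cross-correlation of two equal-length traces over +/- max_lag."""
    n = len(a)
    out = {}
    for lag in range(-max_lag, max_lag + 1):
        s = 0.0
        for i in range(n):
            j = i + lag
            if 0 <= j < n:
                s += a[i] * b[j]
        out[lag] = s
    return out

# Synthetic "river noise": a Gaussian pulse reaching station A first
# and station B five samples later.
n = 200
pulse = [math.exp(-0.5 * ((k - 10) / 3.0) ** 2) for k in range(40)]
trace_a = [0.0] * n
trace_b = [0.0] * n
for k, p in enumerate(pulse):
    trace_a[50 + k] += p
    trace_b[55 + k] += p   # delayed arrival at B

ccf = cross_correlate(trace_a, trace_b, 20)
best_lag = max(ccf, key=ccf.get)
print(best_lag)  # 5: the correlation peak sits on one side of zero lag
```

In field data this one-sided peak is what makes the cross-correlation functions asymmetric and points back toward the dominant noise source.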

This paper provides an introduction to the discipline of computer forensics. With computers involved in an increasing number, and variety, of crimes, the trace data left on electronic media can play a vital part in the legal process. To ensure acceptance by the courts, accepted processes and procedures have to be adopted and demonstrated, which are not dissimilar to the issues surrounding traditional forensic investigations. This paper provides a straightforward overview of the three steps involved in the examination of digital media: (1) acquisition of data; (2) investigation of evidence; and (3) reporting and presentation of evidence. Although many of the traditional readers of Medicine, Science and the Law are involved in the biological aspects of forensics, I believe that both disciplines can learn from each other, with electronic evidence being more readily sought and considered by the legal community, and the long, tried and tested scientific methods of the forensic community being shared and adopted by the computer forensic world.

This tutorial contains detailed instructions with useful integrated examples that help you understand the main features of FTK and how you can use it to analyze evidence. This book has clear and concise guidance in an easily accessible format. This tutorial-based guide is ideal if you want to conduct digital investigations with an integrated platform. Whether you are new to computer forensics or have some experience, this book will help you get started with FTK so you can analyze evidence effectively and efficiently. If you are a law enforcement official, corporate security, or IT profe

Science and the law are considered to be the two main shaping forces in modern societies. The Regional Seminars on Forensic Physics are organized by (mostly CNEA) scientists in Bariloche with a twofold purpose: to increase the participation of researchers as expert witnesses in the solution of legal problems, and to make judges aware of facilities and techniques that might prove useful. Some of the contributions to the last seminar are discussed, ranging from the numerical simulation of major explosions to the behavior of snow avalanches, and from the proper control of a trace laboratory to the distribution of words in the plays of Shakespeare. (author)

Disclosures about new financial frauds and scandals are continually appearing in the press. As a consequence, the accounting profession's traditional methods of monitoring corporate financial activities are under intense scrutiny. At the same time, there is recognition that principles-based GAAP from the International Accounting Standards Board will become the recognized standard in the U.S. The authors argue that these two factors will change the practices used to fight corporate malfeasance as investigators adapt the techniques of accounting into a forensic audit engagement model.

The recent laws on mental health define psychiatric illness as a loss of consciousness and understanding of the consequences of one's own behavioral acts, evaluated by loss of discernment. As discernment represents the main criterion of responsibility for personal actions, this study attempts to present the ethical issues related to discernment evaluation from the perspective of forensic medicine. We propose a "mint" representation of the content and consequences of one's own actions as a new evaluation criterion, taking into account the modern principles of psychology and psychiatry.

of writing that might in fact come “after” testimony. In this paper I attempt to describe a mode of writing in contemporary literature on memory and history, which allows later generations to address historical events to which they did not bear witness, challenging the testimonial mode while bearing its … strategies and strengths in mind - “after” in both senses of the word. The central argument is that just as the legal concept of testimony was introduced into the cultural sphere to describe a particular genre or mode of writing, the legal concept of forensics will serve as a useful term for describing …

Single-well seismology, Reverse Vertical Seismic Profiles (VSPs) and crosswell seismology are three new seismic techniques that we jointly refer to as borehole seismology. Borehole seismic techniques are of great interest because they can obtain much higher resolution images of oil and gas reservoirs than is obtainable with currently used seismic techniques. The quality of oil and gas reservoir management decisions depends on knowledge of both the large and the fine scale features in the reservoirs. Borehole seismology is capable of mapping reservoirs with an order of magnitude improvement in resolution compared with currently used technology. In borehole seismology we use a high frequency seismic source in an oil or gas well and record the signal in the same well, in other wells, or on the surface of the earth.

Our paper revisits Okun's relationship between observed unemployment rates and output gaps. We include in the relationship the effect of labour market institutions as well as age and gender effects. Our empirical analysis is based on 20 OECD countries over the period 1985-2013. We find that the

Our article revisits the Okun relationship between observed unemployment rates and output gaps. We include in the relationship the effect of labour market institutions as well as age and gender effects. Our empirical analysis is based on 20 OECD countries over the period 1985–2013. We
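An Okun-type relationship can be sketched as a simple least-squares regression of the unemployment rate on the output gap (the data below are invented for illustration; the actual analysis covers 20 OECD countries over 1985-2013 and adds institutional, age, and gender effects):

```python
# Toy Okun's-law fit by ordinary least squares (all numbers hypothetical).
gaps = [-3.0, -1.5, 0.0, 1.0, 2.5, 4.0]      # output gap, % of potential GDP
unemp = [9.1, 8.0, 7.2, 6.6, 5.9, 5.1]       # unemployment rate, %

n = len(gaps)
mx = sum(gaps) / n
my = sum(unemp) / n
# Slope and intercept of the OLS line unemp = alpha + beta * gap.
beta = (sum((x - mx) * (y - my) for x, y in zip(gaps, unemp))
        / sum((x - mx) ** 2 for x in gaps))
alpha = my - beta * mx
print(f"unemployment ≈ {alpha:.2f} + {beta:.2f} × gap")
```

The negative slope (beta < 0) is the Okun relationship itself: unemployment falls when output runs above potential.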

Bounded intention planning provides a pruning technique for optimal planning that was proposed several years ago. In addition, partial-order reduction techniques based on stubborn sets have recently been investigated for this purpose. In this paper we revisit bounded intention planning from the viewpoint of stubborn sets.

This paper revisits a well-known hydrostatic paradox, observed when turning upside down a glass partially filled with water and covered with a sheet of light material. The phenomenon is studied in its most general form by including the mass of the cover. A historical survey of this experiment shows that a common misunderstanding of the phenomenon…

This paper is the second in a series revisiting the (effect of) Faraday rotation. We formulate and prove the thermodynamic limit for the transverse electric conductivity of Bloch electrons, as well as for the Verdet constant. The main mathematical tool is a regularized magnetic and geometric...

Nanoparticles appear in several areas of forensic science, including security documents, paints, inks, and reagents that develop latent prints. One reagent (known as the silver physical developer) that visualizes the water-insoluble components of latent print residue is based on the formation of highly charged silver nanoparticles. These attach to and grow on the residue and generate a silver image. Another such reagent involves highly charged gold nanoparticles. These attach to the residue forming a weak gold image, which can be amplified with a silver physical developer. Nanoparticles are also used in items such as paints, printing inks, and writing inks. Paints and most printing inks consist of nano-sized pigments in a vehicle. However, certain modern ink-jet printing inks now contain nano-sized pigments to improve their light fastness, and most gel inks are also based on nanoscale pigments. These nanoparticle-containing materials often appear as evidence and are thus subject to forensic characterization. Luminescent nanoparticles (quantum dots and up-converting nanoscale phosphors) and non-luminescent nanoparticles are used as security tags to label products, add security to documents, and serve as anti-counterfeiting measures. These assist in determining whether an item is fraudulently made.

Nuclear forensics is a growing field concerned with all stages of the process of creating and detonating a nuclear weapon. The main goal is to prevent nuclear attack by locating and securing nuclear material before it can be used in an aggressive manner. This stage of the process is mostly paperwork: laws, regulations, treaties, and declarations made by individual countries or by the UN Security Council. There is some preliminary legwork done in the form of field-testing detection equipment and tracking down orphan materials; however, none of these efforts have yielded any spectacular or useful results. In the event of a nuclear attack, the first step is to analyze the post-detonation debris to aid in the identification of the responsible party. This aspect of the nuclear forensics process, while reactive in nature, is more scientific. A rock sample taken from the detonation site can be dissolved into liquid form and analyzed to determine its chemical composition. The chemical analysis of spent nuclear material can provide valuable information if properly processed and analyzed. In order to accurately evaluate the results, scientists require information on the naturally occurring elements in the detonation zone. From this information, scientists can determine what percentage of an element originated in the bomb itself rather than the environment. To this end, element concentrations in soils from sixty-nine different cities are given, along with activity concentrations for uranium, thorium, potassium, and radium in various building materials. These data are used in a Python-based analysis program.
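The background-correction step described above, determining what share of a measured element came from the device rather than the environment, amounts to a simple subtraction; a minimal sketch with hypothetical concentrations:

```python
# Hypothetical numbers purely for illustration: separating the device's
# contribution to a measured element concentration from natural background.
measured_u = 12.4      # uranium concentration in debris sample (ppm, invented)
background_u = 2.7     # naturally occurring concentration in local soil (ppm, invented)

bomb_u = measured_u - background_u          # excess attributable to the device
bomb_fraction = bomb_u / measured_u
print(f"{bomb_fraction:.1%} of the measured uranium is attributed to the device")
```

This is why the soil and building-material background tables mentioned in the abstract matter: without them the subtraction cannot be done.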

Stable isotopes are being used for forensic science studies, with applications to both natural and manufactured products. In this review we discuss how scientific evidence can be used in the legal context and where the scientific progress of hypothesis revisions can be in tension with the legal expectations of widely used methods for measurements. Although this review is written in the context of US law, many of the considerations of scientific reproducibility and acceptance of relevant scientific data span other legal systems that might apply different legal principles and therefore reach different conclusions. Stable isotopes are used in legal situations for comparing samples for authenticity or evidentiary considerations, in understanding trade patterns of illegal materials, and in understanding the origins of unknown decedents. Isotope evidence is particularly useful when considered in the broad framework of physiochemical processes and in recognizing regional to global patterns found in many materials, including foods and food products, drugs, and humans. Stable isotopes considered in the larger spatial context add an important dimension to forensic science.

We present a course in seismology that consists of a textbook with an accompanying web site (http://epscx.wustl.edu/seismology/book). The web site serves many different functions, and is of great importance as a companion to the curriculum in several different ways: (1) All of the more than 600 figures from the book are available on the web site. Geophysics is a very visually-oriented discipline, and many concepts are more easily taught with appropriate visual tools. In addition, many instructors are now using computer-based lecture programs such as PowerPoint. To aid in this, all of the figures are displayed in a common JPG format, both with and without titles. They are available to be used in a seismology course, or any kind of Earth science course. This way, an instructor can easily grab a figure from the web site and drop it into a PowerPoint format. The figures are listed by number, but are also obtainable from menus of thumbnail sketches. If an instructor would like all of the figures, they can be obtained as large zip files, which can be unzipped after downloading. In addition, sample PowerPoint lectures using the figures as well as the equations from the text will be available on the course web site. (2) Solutions to all of the homework problems are available in PDF format on the course website. Homework is a vital component of any quantitative course, but it is often a significant time commitment for instructors to derive all of the homework problems. In addition, it is much easier to select which homework problems to assign if the solutions can be seen. The 64 pages of homework solutions are on a secure web site that requires a user ID and password that can be obtained from the authors. (3) Any errors found in the textbook are immediately posted on an "Errata" web page. Many of these errors are found by instructors who are using the curriculum (and they are given credit for finding the errors!). The text becomes an interactive process

Recent years witnessed the evolution of Python's ecosystem into one of the most powerful and productive scientific environments across disciplines. ObsPy (https://www.obspy.org) is a fully community driven, open-source project dedicated to providing a bridge for seismology into that ecosystem. It is a Python toolbox offering: read and write support for essentially every commonly used data format in seismology, with a unified interface and automatic format detection, including waveform data (MiniSEED, SAC, SEG-Y, Reftek, …) as well as station (SEED, StationXML, SC3ML, …) and event meta information (QuakeML, ZMAP, …); integrated access to the largest data centers, web services, and real-time data streams (FDSNWS, ArcLink, SeedLink, …); a powerful signal processing toolbox tuned to the specific needs of seismologists; and utility functionality such as travel time calculations with the TauP method, geodetic functions, and data visualizations. ObsPy has been in constant development for more than eight years and is developed and used by scientists around the world with successful applications in all branches of seismology. Additionally, it nowadays serves as the foundation for a large number of more specialized packages. The newest features include: full interoperability of SEED and StationXML/Inventory objects; access to the Nominal Response Library (NRL) for easy and quick creation of station metadata from scratch; support for the IRIS Federated Catalog Service; improved performance of the EarthWorm client; several improvements to the MiniSEED read/write module; improved plotting capabilities for PPSD (spectrograms, PSD of discrete frequencies over time, …); and support for reading ArcLink Inventory XML, reading the Reftek data format, writing SeisComp3 ML (SC3ML), and writing the StationTXT format. This presentation will give a short overview of the capabilities of ObsPy and point out several representative or new use cases and show-case some projects that are based on ObsPy, e.g.: seismo

Nuclear terrorism has been identified as one of the most serious security threats facing the world today. Many countries, including the United States, have incorporated nuclear forensic analysis as a component of their strategy to prevent nuclear terrorism. Nuclear forensics involves the laboratory analysis of seized illicit nuclear materials or debris from a nuclear detonation to identify the origins of the material or weapon. Over the years, a number of forensic signatures have been developed to improve the confidence with which forensic analysts can draw conclusions. These signatures are validated and new signatures are discovered through research and development programs and in round-robin exercises among nuclear forensic laboratories. The recent Nuclear Smuggling International Technical Working Group Third Round Robin Exercise and an on-going program focused on attribution of uranium ore concentrate provide prime examples of the current state of nuclear forensics. These case studies will be examined and the opportunities for accelerator mass spectrometry to play a role in nuclear forensics will be discussed.

The work suggests a method for solving the direct and inverse problems of engineering seismology by means of the structural approach of systems theory. This approach makes it possible to account simultaneously for the two basic types of damping of seismic signals in the earth foundation: geometrical damping and damping due to dissipative energy loss. The structural scheme automatically accounts for the geometric damping of the signals. Damping from dissipative energy loss, on the other hand, is accounted for through the choice of the frequency characteristics or transmission functions of the different layers. A few examples illustrate the advantages of a model including both types of attenuation of the seismic signal. An integral damping coefficient is calculated which, analogously to the frequency functions, represents a generalized characteristic of the whole earth foundation. (orig./HP)

Micro Electro-Mechanical Systems (MEMS) accelerometers are electromechanical devices able to measure static or dynamic accelerations. In the 1990s MEMS accelerometers revolutionized the automotive airbag industry and are currently widely used in laptops, game controllers and mobile phones. Nowadays MEMS accelerometers seem to provide adequate sensitivity, noise level and dynamic range to be applicable to earthquake strong-motion acquisition. The current use of three-axis MEMS accelerometers in mobile phones may provide a new means to easily increase the number of observations when a strong earthquake occurs. However, before using the signals recorded by a mobile phone equipped with a three-axis MEMS accelerometer for any scientific purpose, it is fundamental to verify that the collected signal provides reliable records of ground motion. For this reason we have investigated the suitability of the iPhone 5 mobile phone (one of the most popular mobile phones in the world) for strong-motion acquisition. It is equipped with several MEMS devices, including a three-axis gyroscope, a three-axis electronic compass and the LIS331DLH three-axis accelerometer. The LIS331DLH is a low-cost, high-performance three-axis linear accelerometer with 16-bit digital output, produced by STMicroelectronics Inc. We tested the LIS331DLH MEMS accelerometer using a vibrating table and the EpiSensor FBA ES-T as reference sensor. In our experiments the reference sensor was rigidly co-mounted with the LIS331DLH MEMS sensor on the vibrating table. We assessed the MEMS accelerometer in the frequency range 0.2-20 Hz, the typical range of interest in strong-motion seismology and earthquake engineering. We generated both constant and damped sine waves with central frequency from 0.2 Hz to 20 Hz in steps of 0.2 Hz. For each frequency analyzed we generated sine waves with mean amplitudes of 50, 100, 200, 400, 800 and 1600 mg0. For damped sine waves we generated waveforms with initial amplitude
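The damped test signals described above can be sketched as follows (the sampling rate and decay constant are arbitrary illustrative choices, not those used in the experiment):

```python
import math

def damped_sine(freq_hz, amp, damping, duration_s, fs=200.0):
    """Exponentially damped sine wave, as might drive a shake table (sketch)."""
    n = int(duration_s * fs)
    return [amp * math.exp(-damping * k / fs)
            * math.sin(2 * math.pi * freq_hz * k / fs)
            for k in range(n)]

# e.g. a 5 Hz test signal with 400 (m)g amplitude decaying over 4 s
w = damped_sine(5.0, 400.0, 1.0, 4.0)
```

Sweeping `freq_hz` over 0.2-20 Hz and `amp` over the listed amplitudes would reproduce the kind of test-signal grid the abstract describes.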

Essential reading for launching a career in computer forensics. Internet crime is on the rise, catapulting the need for computer forensics specialists. This new edition presents you with a completely updated overview of the basic skills that are required as a computer forensics professional. The author team of technology security veterans introduces the latest software and tools that exist and they review the available certifications in this growing segment of IT that can help take your career to a new level. A variety of real-world practices take you behind the scenes to look at the root causes

Forensic entomotoxicology is a branch of forensic medicine which applies entomology, toxicology and other related studies to solve poisoning cases. It has an obvious advantage in the investigation of poisoning deaths. After expounding the definition and scope of entomotoxicology, this paper reviews research progress and applications in several aspects of forensic medicine, such as the effects of drugs/toxins on the growth and development of sarcosaphagous insects and the qualitative and quantitative analysis of drugs/toxins in poisoned body tissue.

Focuses on criminalistics, which can be understood to mean the activities and specialty areas characteristic of most municipal, county, or state forensic science laboratories in the United States. (DDR)

A biosensor is a device that uses biological materials to detect and monitor the presence of specific chemicals in an area. Traditional methods of volatile detection used by law enforcement agencies and rescue teams typically rely on canine olfaction. This concept of using dogs to detect specific substances is quite old. However, dogs have some limitations, such as the cost of training and the time required for conditioning. Thus, the possibility of using other organisms as biosensors, including rats, dolphins, honeybees, and parasitic wasps, for detecting explosives, narcotics and cadavers has been developed. Insects have several advantages unshared by mammals. Insects are sensitive, cheap to produce and can be conditioned with impressive speed for a specific chemical-detection task. Moreover, insects might be a preferred sensing method in scenarios that are deemed too dangerous to use mammals. The purpose of this review is to provide an overview of the biosensors used in the forensic sciences.

This paper outlines the development and convergence of forensic science and secure psychiatric services in the UK, locating the professionalization of forensic nursing within a complex web of political, economic, and ideological structures. It is suggested that a stagnation of the therapeutic enterprise in high and medium security provision has witnessed an intrusion of medical power into the societal body. Expanding technologies of control and surveillance are discussed in relation to the move from modernity to postmodernity and the ongoing dynamic of medicalized offending. Four aspects of globalization are identified as impacting upon the organization and application of forensic practice: (i) organized capitalism and the exhaustion of the welfare state; (ii) security versus danger and trust versus risk; (iii) science as a meta-language; and (iv) foreclosure as a mechanism of censorship. Finally, as a challenge for the profession, some predictions are offered about the future directions or demise of forensic nursing.

Forensic science is an approach to studying the desirability of specific technologies in the context of the value objectives and biological imperatives of society. Such groups should be formed with people from various physical and social sciences. (PS)

As issues of professional standards and error rates continue to be addressed in the courts, forensic anthropologists should be proactive by developing and adhering to professional standards of best practice. There has been recent increased awareness and interest in critically assessing some of the techniques used by forensic anthropologists, but issues such as validation, error rates, and professional standards have seldom been addressed. Here we explore the legal impetus for this trend and identify areas where we can improve regarding these issues. We also discuss the recent formation of a Scientific Working Group for Forensic Anthropology (SWGANTH), which was created with the purposes of encouraging discourse among anthropologists and developing and disseminating consensus guidelines for the practice of forensic anthropology. We believe it is possible and advisable for anthropologists to seek and espouse research and methodological techniques that meet higher standards to ensure quality and consistency in our field.

measurements taken from computed tomography (CT) scans. Previous reports have observed that the lateral angle size in females is significantly larger than in males. The method was applied to an independent series of 77 postmortem CT scans (42 males, 35 females) to validate its accuracy and reliability. … The method appears to be of minimal practical use in forensic anthropology and archeology.

Parasites show great potential for forensic science. Forensic science is the application of science and methodology to the legal system. The forensic scientist collects and analyses the physical evidence and produces a report of the results for the court. A parasite is an organism that lives at the expense of another, and parasites exist in every ecosystem. Parasites are the cause of many important diseases. Forensic scientists can use parasites to identify a crime scene, to determine the murder weapon, or simply to identify an individual. The applications of parasites in forensic science are many, and more studies should be made in forensic parasitology. The most important parasites in forensic science are helminths, specifically schistosomes. Throughout history there are many cases where schistosomes were described in autopsies and related to the cause of death. Here we review the applications of parasites in forensic science and their importance to the forensic scientist.

Digital multimedia such as images and videos are prevalent on today's internet and have significant social impact, as evidenced by the proliferation of social networking sites with user-generated content. Due to the ease of generating and modifying images and videos, it is critical to establish trustworthiness for online multimedia information. In this paper, we propose novel approaches to perform multimedia forensics using compact side information to reconstruct the processing history of a document. We refer to this as FASHION, standing for Forensic hASH for informatION assurance. Based on the Radon transform and scale space theory, the proposed forensic hash is compact and can effectively estimate the parameters of geometric transforms and detect local tampering that an image may have undergone. The forensic hash is designed to answer a broader range of questions about the processing history of multimedia data than the simple binary decision of traditional robust image hashing, and it also offers more efficient and accurate forensic analysis than multimedia forensic techniques that do not use any side information.
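The idea of a compact, projection-based image summary can be sketched with a toy example. The code below is not the authors' FASHION construction (which uses the full Radon transform and scale space theory); it only illustrates summarizing an image by coarsely binned 0°/90° projections, and all names are illustrative:

```python
import numpy as np

def projection_hash(img, n_bins=16):
    """Toy 'forensic hash': coarse row/column projections of an image
    (the Radon transform at 0 and 90 degrees only), binned and
    normalized into a compact feature vector."""
    img = np.asarray(img, dtype=float)
    rows = img.sum(axis=1)          # projection onto the vertical axis
    cols = img.sum(axis=0)          # projection onto the horizontal axis
    feat = []
    for p in (rows, cols):
        bins = np.array_split(p, n_bins)            # coarse binning
        v = np.array([b.mean() for b in bins])
        v = (v - v.mean()) / (v.std() + 1e-12)      # normalize
        feat.append(v)
    return np.concatenate(feat)     # length 2 * n_bins

def hash_distance(h1, h2):
    return float(np.abs(h1 - h2).mean())

rng = np.random.default_rng(0)
img = rng.random((64, 64))
h_orig = projection_hash(img)
h_same = projection_hash(img.copy())
h_tampered = projection_hash(np.flipud(img))   # a gross manipulation

assert hash_distance(h_orig, h_same) < 1e-9
assert hash_distance(h_orig, h_tampered) > hash_distance(h_orig, h_same)
```

A real forensic hash would add many projection angles and multi-scale features so that geometric transform parameters can be estimated, not just detected.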

In this article we revisit, with the help of images, those classic signs in chest radiography described by Dr Benjamin Felson himself, or other illustrious radiologists of his time, cited and discussed in 'Chest Roentgenology'. We briefly describe the causes of the signs, their utility and the differential diagnosis to be considered when each sign is seen. Wherever possible, we use CT images to illustrate the basis of some of these classic radiographic signs.

In this paper we revisit our joint work with Antonio Siconolfi on time functions. We will give a brief introduction to the subject. We will then show how to construct a Lipschitz time function in a simplified setting. We will end with a new result showing that the Aubry set is not an artifact of our proof of existence of time functions for stably causal manifolds.

It has been 15 years since the original presentation by Frank Halasz at Hypertext'87 on seven issues for the next generation of hypertext systems. These issues are: search and query; composites; virtual structures; computation in/over the hypertext network; versioning; collaborative work; and extensibility and tailorability. Since that time, these issues have formed the nucleus of multiple research agendas within the Hypertext community. Befitting this direction-setting role, the issues have been revisited …

We revisit the deterministic graphical games of Washburn. A deterministic graphical game can be described as a simple stochastic game (a notion due to Anne Condon), except that we allow arbitrary real payoffs but disallow moves of chance. We study the complexity of solving deterministic graphical games and obtain an almost-linear time comparison-based algorithm for computing an equilibrium of such a game. The existence of a linear time comparison-based algorithm remains an open problem.

The GEOFON data centre and others in the seismological community have been archiving seismic waveforms for many years. The amount of seismic data available continuously increases due to the use of higher sampling rates and the growing number of stations. In recent years, there has been a trend towards standardization of protocols and formats to improve and homogenise access to these data [FDSN, 2013]. The seismological community has begun assigning a particular persistent identifier (PID), the Digital Object Identifier (DOI), to seismic networks as a first step towards properly and consistently attributing the use of data from seismic networks in scientific articles [Evans et al., 2015]. This was codified in a recommendation by the International Federation of Digital Seismograph Networks [FDSN, 2014]; DOIs for networks now appear in community web pages. However, our community, in common with other fields of science, still struggles with issues such as supporting reproducibility of results, providing proper attribution (data citation) for data sets, and measuring the impact of those data sets by tracking their use. Seismological data sets used for research are frequently created "on-the-fly" based on particular user requirements such as location or time period; users prepare requests to select subsets of the data held in seismic networks, and the data actually provided may even be held at many different data centres [EIDA, 2016]. These subsets also require careful citation. For persistency, a request must receive exactly the same data when repeated at a later time. However, if data are curated between requests, the data set delivered may differ, severely complicating the ability to reproduce a result. Transmission problems or configuration problems may also inadvertently modify the response to a request. With this in mind, our next step is the assignment of additional EPIC PIDs to daily data files (currently over 28 million in the GEOFON archive) for use within the data
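The reproducibility problem described above — detecting whether a repeated request returns exactly the same bytes — can be sketched with a minimal PID-plus-checksum registry. This is a hypothetical stand-in for an EPIC PID service, not GEOFON's actual implementation:

```python
import hashlib
import uuid

class PidRegistry:
    """Minimal sketch: each data file gets an opaque identifier, and a
    checksum is stored so a later request can verify that it received
    exactly the same bytes (hypothetical, illustrative API)."""
    def __init__(self):
        self._store = {}

    def register(self, payload: bytes) -> str:
        pid = str(uuid.uuid4())
        self._store[pid] = hashlib.sha256(payload).hexdigest()
        return pid

    def verify(self, pid: str, payload: bytes) -> bool:
        return self._store.get(pid) == hashlib.sha256(payload).hexdigest()

reg = PidRegistry()
day_file = b"miniSEED bytes for one station-day"
pid = reg.register(day_file)

assert reg.verify(pid, day_file)                # unchanged: reproducible
assert not reg.verify(pid, day_file + b"\x00")  # curated/corrupted: detected
```

A real PID system also stores resolvable metadata (location, landing page), but the checksum is what makes silent curation or transmission changes detectable.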

In the last five years, services and data providers within the seismological community in Europe have focused their efforts on migrating their archives towards a Service Oriented Architecture (SOA). This process pragmatically follows technological trends and available solutions, aiming at effectively improving all data stewardship activities. These advancements are possible thanks to the cooperation and follow-up of several EC infrastructural projects that, by looking at general-purpose techniques, combine their developments, envisioning a multidisciplinary platform for earth observation as the final common objective (EPOS, the European Plate Observing System). One of the first results of this effort is the Earthquake Data Portal (http://www.seismicportal.eu), which provides a collection of tools to discover, visualize and access a variety of seismological data sets such as seismic waveforms, accelerometric data, earthquake catalogs and parameters. The Portal offers a cohesive distributed search environment, linking data search and access across multiple data providers through interactive web services, map-based tools and diverse command-line clients. Our work continues under other EU FP7 projects; here we address initiatives in two of them. The NERA (Network of European Research Infrastructures for Earthquake Risk Assessment and Mitigation) project will implement a Common Services Architecture based on OGC service APIs, in order to provide resource-oriented common interfaces across the data access and processing services. This will improve interoperability between tools and across projects, enabling the development of higher-level applications that can uniformly access the data and processing services of all participants. This effort will be conducted jointly with the VERCE project (Virtual Earthquake and Seismology Research Community for Europe). VERCE aims to enable seismologists to exploit the wealth of seismic data
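The service-oriented access described above can be illustrated with the kind of query a client builds against an FDSN-style waveform web service. The parameter names follow the FDSN web service convention; the base URL here is a placeholder, not a real endpoint:

```python
from urllib.parse import urlencode

def fdsn_dataselect_url(base, net, sta, loc, cha, start, end):
    """Build an FDSN-dataselect-style query URL (sketch only; a real
    client would also handle authentication, errors and paging)."""
    params = {
        "network": net, "station": sta, "location": loc, "channel": cha,
        "starttime": start, "endtime": end,
    }
    return base + "?" + urlencode(params)

url = fdsn_dataselect_url(
    "https://example.org/fdsnws/dataselect/1/query",   # placeholder base
    net="GE", sta="APE", loc="--", cha="BHZ",
    start="2015-01-01T00:00:00", end="2015-01-02T00:00:00",
)
assert "network=GE" in url and "channel=BHZ" in url
```

Because every provider exposes the same interface, the same client code can fetch data held at many different data centres.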

In August 2009 I created a Facebook "fan" page for the Seismological Society of America. We had been exploring cost-effective options for providing forums for two-way communication for some months. We knew that a number of larger technical societies had invested significant sums of money to create customized social networking sites, but that a small society would need to use existing low-cost software options. The first thing I discovered when I began to set up the fan page was that an unofficial SSA Facebook group already existed, established by Steven J. Gibbons, a member in Norway. Steven had done an excellent job of posting material about SSA. Partly because of the existing group, the official SSA fan page gained fans rapidly. We began by posting information about our own activities and then added links to activities in the broader geoscience community. While much of this material also appeared on our website and in our publication, Seismological Research Letters (SRL), the tone on the FB page is different: it is less formal, with more emphasis on photos and links to other sites, including our own. Fans who are active on FB see the posts as part of their social network and do not need to take the initiative to go to the SSA site. Although the goal was to provide a forum for two-way communication, our initial experience was that people were clearly reading the page but not contributing content. This appears to be the case with the fan pages of sister geoscience societies as well. FB offers some demographic information to fan site administrators. An initial review of the demographics suggested that fans were younger than the overall membership of the Society, and that a few of the fans were not members or even scientists. Open questions are: What content will be most useful to fans? How will the existence of the page benefit the membership as a whole? Will the page ultimately encourage two-way communication as hoped? Web 2.0 is generating a series of new

Lithospheric deformation in tectonically active regions depends on the 3D distribution of rheology, which is in turn critically controlled by temperature. Under the auspices of the Southern California Earthquake Center (SCEC) we are developing a 3D Community Thermal Model (CTM) to constrain rheology and so better understand deformation processes within this complex but densely monitored and relatively well-understood region. The San Andreas transform system has sliced southern California into distinct blocks, each with characteristic lithologies, seismic velocities and thermal structures. Guided by the geometry of these blocks we use more than 250 surface heat-flow measurements to define 13 geographically distinct heat flow regions (HFRs). Model geotherms within each HFR are constrained by averages and variances of surface heat flow q0 and the 1D depth distribution of thermal conductivity (k) and radiogenic heat production (A), which are strongly dependent on rock type. Crustal lithologies are not always well known and we turn to seismic imaging for help. We interrogate the SCEC Community Velocity Model (CVM) to determine averages and variances of Vp, Vs and Vp/Vs versus depth within each HFR. We bound (A, k) versus depth by relying on empirical relations between seismic wave speed and rock type and laboratory and modeling methods relating (A, k) to rock type. Many 1D conductive geotherms for each HFR are allowed by the variances in surface heat flow and subsurface (A, k). An additional constraint on the lithosphere temperature field is provided by comparing lithosphere-asthenosphere boundary (LAB) depths identified seismologically with those defined thermally as the depth of onset of partial melting. Receiver function studies in Southern California indicate LAB depths that range from 40 km to 90 km. Shallow LAB depths are correlated with high surface heat flow and deep LAB with low heat flow. The much-restricted families of geotherms that intersect peridotite
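The 1D conductive geotherms described above follow from steady-state heat conduction with uniform radiogenic heat production, T(z) = T0 + (q0/k)z − (A/2k)z². A short sketch, using generic crustal values rather than those of any particular SCEC heat-flow region:

```python
import numpy as np

def conductive_geotherm(z, T0=10.0, q0=0.065, k=2.5, A=1.0e-6):
    """Steady-state 1D conductive geotherm with uniform heat production.

    T(z) = T0 + (q0/k) z - (A/(2k)) z^2, with surface heat flow q0
    (W/m^2), thermal conductivity k (W/m/K), and radiogenic heat
    production A (W/m^3).  Parameter values are illustrative only.
    """
    z = np.asarray(z, dtype=float)
    return T0 + (q0 / k) * z - (A / (2.0 * k)) * z**2

z = np.linspace(0.0, 30e3, 4)        # 0-30 km depth, in metres
T = conductive_geotherm(z)
assert T[0] == 10.0                  # surface boundary condition
assert np.all(np.diff(T) > 0)        # temperature increases downward
```

In practice a layered model (depth-varying k and A per lithology) would be stacked from such segments, with the variances in q0, k and A generating the family of allowed geotherms per heat-flow region.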

The study prepared for the nuclear power plants to be located at Itaorna comprised the analysis and integration of geologic, tectonic, geomorphologic and seismologic information, and satisfactory results regarding regional stability were obtained.

… of earthquakes, earthquake hazard and earth structure in South Africa was prepared for the centennial handbook of the International Association of Seismology and Physics of the Earth's Interior (IASPEI). References to theses completed in the last four…

BGR seismologists often set up monitoring stations for testing purposes. The engineers from the Central Seismological Observatory have now developed a new type of mobile monitoring station which can be remotely controlled.

Over the past several years, numerous studies in the field of forensic psychiatry have confirmed a close relationship between violent offending and comorbid substance abuse. Comorbid substance abuse in violent offenders has usually gone unrecognized and misdiagnosed. Comorbidity in forensic psychiatry describes the co-occurrence of two or more conditions or psychiatric disorders, known in the literature as dual diagnosis and defined by the World Health Organization (WHO); in fact, many violent offenders have multiple psychiatric diagnoses. Recent studies have confirmed a causal relationship between major psychiatric disorders and concomitant substance abuse (comorbidity) in 50-80% of forensic cases. In general, there is a high level of psychiatric comorbidity in forensic patients, with a prevalence of personality disorders (50-90%), mood disorders (20-60%) and psychotic disorders (15-20%) coupled with substance abuse disorders. A high prevalence of psychiatric comorbidities can also be found in intellectually disabled individuals and in epileptic patients. Drug and alcohol abuse can produce serious psychotoxic effects that may lead to extremely violent behavior and consequently to serious criminal offences such as physical assault, rape, armed robbery, attempted murder and homicide, all due to altered brain function generating psychotic-like symptoms. Studies have confirmed a statistically significant causal relationship between substance abuse and violent offences. In forensic psychiatry, comorbidity contributes strongly to establishing a psychiatric diagnosis of diminished mental capacity or insanity at the time of the offence during the clinical assessment and evaluation of violent offenders. Today, the primary focus of forensic psychiatric treatment services (in-patient or community) is the management of violent offenders with psychiatric comorbidity, which requires a multilevel, evidence-based approach to

QuakeML is an XML-based data exchange standard for seismology that is in its fourth year of active community-driven development. Its development was motivated by the need to consolidate existing data formats for applications in statistical seismology, as well as setting a cutting-edge, community-agreed standard to foster interoperability of distributed infrastructures. The current release (version 1.2) is based on a public Request for Comments process and accounts for suggestions and comments...
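Because QuakeML is XML, event records can be consumed with any standard XML tooling. The snippet below is a heavily simplified, QuakeML-flavoured document with illustrative values — real QuakeML uses namespaced elements, publicIDs and a richer schema:

```python
import xml.etree.ElementTree as ET

# Simplified, QuakeML-flavoured event record (illustrative values only;
# this is NOT schema-valid QuakeML).
doc = """
<eventParameters>
  <event>
    <origin>
      <time><value>2000-01-01T00:00:00Z</value></time>
      <latitude><value>10.0</value></latitude>
      <longitude><value>20.0</value></longitude>
    </origin>
    <magnitude>
      <mag><value>5.0</value></mag>
      <type>Mw</type>
    </magnitude>
  </event>
</eventParameters>
"""

root = ET.fromstring(doc)
mag = float(root.find("./event/magnitude/mag/value").text)
mtype = root.find("./event/magnitude/type").text
assert mag == 5.0 and mtype == "Mw"
```

The nested value elements mirror QuakeML's pattern of wrapping quantities so that uncertainties and evaluation metadata can sit alongside each value.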

Urban seismology has become an active research field in recent years, both with seismological objectives, such as obtaining better microzonation maps in highly populated areas, and with engineering objectives, such as the monitoring of traffic or the surveying of historical buildings. We analyze here the seismic records obtained by a broad-band seismic station installed in the ICTJA-CSIC institute, located near the center of Barcelona city. Although this station was installed to introdu...

Only a modest number of historical seismology investigations are available in Colombia, dating back 50 years. This paper reviews basic information about earthquake studies in Colombia, such as primary sources and the compilation of descriptive and parametric catalogues. Father Jesús Emilio Ramírez made the main systematic study before 1975. During the last 20 years, great earthquakes have hit Colombia and, as a consequence, historical seismology research has developed in the frame of seismic hazard projects.

The physics of falling from a height, a topic that could be included in a course on forensic physics or in an undergraduate class as an example of Newton's laws, is applied to a common forensic problem.
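The first-order physics here is simple kinematics: neglecting air resistance, a body falling from rest through height h hits the ground at speed v = √(2gh) after time t = √(2h/g). A minimal sketch:

```python
import math

def fall_kinematics(h, g=9.81):
    """Free fall from rest, air resistance neglected (a first-order
    forensic estimate only): returns (impact speed, fall time)."""
    return math.sqrt(2 * g * h), math.sqrt(2 * h / g)

v, t = fall_kinematics(20.0)     # e.g. a fall from 20 m
assert abs(v - 19.81) < 0.01     # ~19.8 m/s, about 71 km/h
assert abs(t - 2.02) < 0.01      # ~2.0 s
```

In a real forensic reconstruction, drag, body orientation and any horizontal launch velocity would all modify these estimates.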

Nuclear forensics assists in responding to any event where nuclear material is found outside of regulatory control; a response plan is presented and a nuclear forensics program is undergoing further development so that smugglers are sufficiently deterred.

Network traffic capture is an integral part of network forensics, but current traffic capture techniques are typically passive in nature. Under heavy loads, it is possible for a sniffer to miss packets, which affects the quality of forensic evidence.

This study finds that, in most cases analyzed to date, past seismicity tends to delineate zones where future earthquakes are likely to occur. Network seismicity catalogs for the New Madrid Seismic Zone (NMSZ), Australia (AUS), California (CA), and Alaska (AK) are analyzed using modified versions of the Cellular Seismology (CS) method of Kafka (2002, 2007). The percentage of later-occurring earthquakes located near earlier-occurring earthquakes typically exceeds the expected percentage for randomly distributed later-occurring earthquakes, and the specific percentage is influenced by several variables, including magnitude, depth, time, and tectonic setting. At 33% map-area coverage, hit percents are typically 85-95% in the NMSZ, 50-60% in AUS, 75-85% in CA, and 75-85% in AK. Statistical significance testing is performed on trials analyzing the same variables so that the overall regions can be compared, although some tests are inconclusive due to small earthquake sample sizes. These results offer useful insights into the capabilities and limits of CS studies, which can provide guidance for improving the seismicity-based components of seismic hazard assessments.
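The "hit percent" statistic can be sketched as the fraction of later epicentres that fall near at least one earlier epicentre. This toy version works on a flat plane with a fixed radius; real CS studies work on the sphere and fix the fraction of map area covered instead:

```python
import numpy as np

def hit_percent(earlier, later, radius):
    """Percentage of `later` epicentres within `radius` of at least one
    `earlier` epicentre (planar toy version of the CS hit statistic)."""
    earlier = np.asarray(earlier, dtype=float)
    later = np.asarray(later, dtype=float)
    # pairwise distances, shape (n_later, n_earlier)
    d = np.linalg.norm(later[:, None, :] - earlier[None, :, :], axis=2)
    return 100.0 * np.mean(d.min(axis=1) <= radius)

rng = np.random.default_rng(1)
cluster = rng.normal(0.0, 1.0, size=(200, 2))    # a persistent active zone
earlier, later = cluster[:100], cluster[100:]
scattered = rng.uniform(-10, 10, size=(100, 2))  # spatially random events

# Later events in the same zone score far higher than random ones.
assert hit_percent(earlier, later, 1.0) > hit_percent(earlier, scattered, 1.0)
```

The comparison against the scattered set mirrors the paper's test against the expectation for randomly distributed later-occurring earthquakes.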

Permanent ocean-bottom cables installed at the Valhall field can repeatedly record high-quality active seismic surveys. In the absence of active seismic shooting, passive data can be recorded and streamed to the platform in real time. Here I studied 29 hours of such data using seismic interferometry, generating omni-directional Scholte-wave virtual sources at frequencies considered very low in the exploration seismology community (0.4-1.75 Hz). Scholte-wave group arrival times are inverted using both eikonal tomography and straight-ray tomography. The top 100 m of the near-surface at Valhall contains buried channels about 100 m wide that have been imaged with active seismic data. Images obtained by ASNT using eikonal tomography or straight-ray tomography both contain anomalies that match these channels. When continuous recordings are made in real time, tomographic images of the shallow subsurface can be formed or updated on a daily basis, forming a very low cost near-surface monitoring system using seismic noise.
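The core interferometric step — recovering an inter-receiver travel time by cross-correlating noise recorded at two receivers — can be sketched on synthetic data. Here receiver B simply records a delayed copy of the noise at receiver A; real ambient-noise interferometry requires long records and well-distributed noise sources:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 100                      # sampling rate in Hz (assumed)
noise = rng.standard_normal(20000)
delay = 250                   # true travel time: 250 samples = 2.5 s

rec_a = noise
rec_b = np.concatenate([np.zeros(delay), noise[:-delay]])  # delayed copy

# Cross-correlation peaks at the lag equal to the inter-receiver
# travel time, turning receiver A into a 'virtual source'.
xcorr = np.correlate(rec_b, rec_a, mode="full")
lag = np.argmax(xcorr) - (len(rec_a) - 1)
assert lag == delay           # 250 samples, i.e. 2.5 s at 100 Hz
```

Stacking many such correlations over hours of data is what stabilizes the virtual-source response before the travel times are fed into tomography.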

Jupiter's internal structure is poorly known (Guillot et al. 2004). Seismology is a powerful tool for investigating the internal structure of planets and stars by analyzing how acoustic waves propagate. Mosser (1997) and Gudkova & Zarkhov (1999) showed that the detection and identification of non-radial modes up to degree ℓ=25 can strongly constrain the internal structure. SYMPA is a ground-based network project dedicated to Jovian oscillations (Schmider et al. 2002). The instrument is composed of a Mach-Zehnder interferometer producing four interferograms of the planetary spectrum. The combination of the four images in phase quadrature allows the reconstruction of the incident light's phase, which is related to the Doppler shift generated by the oscillations. Two SYMPA instruments were built at the University of Nice and were used simultaneously during two observation campaigns, in 2004 and 2005, at the San Pedro Martir observatory (Mexico) and the Teide observatory (Canary Islands). We present for the first time the data processing and the preliminary results of the experiment.
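Combining four interferograms in phase quadrature is, in its simplest form, the classic four-bucket phase-retrieval recipe: if I_k = A + B·cos(φ + kπ/2) for k = 0..3, then φ = atan2(I3 − I1, I0 − I2). This generic sketch illustrates that recipe, not SYMPA's actual reduction pipeline:

```python
import numpy as np

# Four-bucket phase retrieval from quadrature interferograms.
phi_true = 0.7          # radians, an assumed Doppler-induced phase
A, B = 2.0, 0.5         # mean intensity and fringe amplitude (assumed)
I = [A + B * np.cos(phi_true + k * np.pi / 2) for k in range(4)]

# I0 - I2 = 2B cos(phi);  I3 - I1 = 2B sin(phi)
phi = np.arctan2(I[3] - I[1], I[0] - I[2])
assert abs(phi - phi_true) < 1e-12
```

Applied pixel by pixel, the recovered phase map tracks the Doppler shift of the planetary spectrum, from which the oscillation velocity field is derived.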

The author dedicates this book to readers who are concerned with finding out the status of concepts, statements and hypotheses, and with clarifying and rearranging them in a logical order. It is thus not intended to teach tools and techniques of the trade, but to discuss the foundations on which seismology — and in a larger sense, the theory of wave propagation in solids — is built. A key question is: why and to what degree can a theory developed for an elastic continuum be used to investigate the propagation of waves in the Earth, which is neither a continuum nor fully elastic. But the scrutiny of the foundations goes much deeper: material symmetry, effective tensors, equivalent media; the influence (or, rather, the lack thereof) of gravitational and thermal effects and the rotation of the Earth, are discussed ab initio. The variational principles of Fermat and Hamilton and their consequences for the propagation of elastic waves, causality, Noether's theorem and its consequences on conservation of energy...

Egalitarianism and justice are amongst the core attributes of a democratic regime and should also be secured in an e-democratic setting. As such, the rise of computer-related offenses poses a threat to the fundamental aspects of e-democracy and e-governance. Digital forensics is a key component for protecting and enabling the underlying (e-)democratic values, and therefore forensic readiness should be considered in an e-democratic setting. This position paper commences from the observation that the density of compliance and potential litigation activities is monotonically increasing in modern organizations, as rules, legislative regulations and policies are constantly added to the corporate environment. Forensic practices seem to be departing from the niche of law enforcement and are becoming a business function and infrastructural component, posing new challenges to security professionals. Having no a priori knowledge of whether a security-related event or corporate policy violation will lead to litigation, we advocate that computer forensics be applied to all investigatory, monitoring and auditing activities. This would result in an inflation of the responsibilities of the Information Security Officer. After exploring some commonalities and differences between IS audit and computer forensics, we present a list of strategic challenges that the organization and, in effect, the IS security and audit practitioner will face.

Compared to other sciences, computer forensics (digital forensics) is a relatively young discipline. It was established in 1999 and has been an irreplaceable tool in sanctioning cybercrime ever since. Good knowledge of computer forensics can be very helpful in uncovering a committed crime. Not adhering to the methodology of computer forensics, however, renders the obtained evidence invalid or irrelevant, so that it cannot be used in legal proceedings. This paper explains the methodolo...

Objectives: This paper provides an overview for general and forensic psychiatrists of the complexity and challenge of working in the civil medico-legal arena. It covers expert evidence, ethics, core concepts in civil forensic psychiatry and report writing. Conclusions: Civil forensic psychiatry is an important sub-speciality component of forensic psychiatry that requires specific skills, knowledge and the ability to assist legal bodies in determining the significance of psychiatric issues.

... and dental practitioners of the crucial role of the dentist in victim identification and ... role of forensic dental personnel in human identification following ... matrimonial, or financial reasons. The first and ... chief physician during the systematic extermination of the Jews at ... of police officers with forensic pathologist and forensic ...

Digital evidence is playing an increasingly important role in prosecuting crimes. The reasons are manifold: financially lucrative targets are now connected online, systems are so complex that vulnerabilities abound and strong digital identities are being adopted, making audit trails more useful. If the discoveries of forensic analysts are to hold up to scrutiny in court, they must meet the standard for scientific evidence. Software systems are currently developed without consideration of this fact. This paper argues for the development of a formal framework for constructing “digital artifacts” that can serve as proxies for physical evidence; a system so imbued would facilitate sound digital forensic inference. A case study involving a filesystem augmentation that provides transparent support for forensic inference is described.
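One simple form such a "digital artifact" could take is a tamper-evident, hash-chained audit log, where each entry's digest covers the previous entry's digest. This is a hypothetical illustration of integrity-checkable evidence, not the framework the paper proposes:

```python
import hashlib
import json

class ForensicLog:
    """Sketch of an append-only, hash-chained log: altering any past
    entry breaks every digest after it (illustrative names only)."""
    def __init__(self):
        self.entries = []          # list of (body, digest) pairs
        self._prev = "0" * 64      # genesis digest

    def append(self, event: dict):
        body = json.dumps(event, sort_keys=True)
        digest = hashlib.sha256((self._prev + body).encode()).hexdigest()
        self.entries.append((body, digest))
        self._prev = digest

    def verify(self) -> bool:
        prev = "0" * 64
        for body, digest in self.entries:
            if hashlib.sha256((prev + body).encode()).hexdigest() != digest:
                return False
            prev = digest
        return True

log = ForensicLog()
log.append({"op": "open", "path": "/tmp/x"})
log.append({"op": "write", "bytes": 42})
assert log.verify()

# Rewriting history without recomputing the chain is detected.
log.entries[0] = ('{"op": "delete", "path": "/tmp/x"}', log.entries[0][1])
assert not log.verify()
```

A filesystem augmented with such chained records would let an analyst demonstrate, rather than assert, that the recovered history is intact.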

Forensic odontology is the subdiscipline of dentistry which analyses dental evidence in the interest of justice. Oral pathology is the subdiscipline of dentistry that deals with pathology affecting the oral and maxillofacial regions. This subdiscipline is utilized for identification through oral and maxillofacial pathologies with associated syndromes, enamel rod patterns, sex determination using exfoliative cytology, identification from the occlusal morphology of teeth, and deoxyribonucleic acid profiling from teeth. It is also utilized for age estimation studies, which include Gustafson's method, incremental lines of Retzius, perikymata, natal line formation in teeth, the neonatal line, racemization of collagen in dentin, cemental incremental lines, thickness of the cementum, and translucency of dentin. Even though the expertise of an oral pathologist is seldom drawn upon in forensic investigations, this paper aims to discuss the role of oral pathology in forensic investigation.

Starting from Zermelo’s classical formal treatment of chess, we trace through history the analysis of two-player win/lose/draw games with perfect information and potentially infinite play. Such chess-like games have appeared in many different research communities, and methods for solving them, such as retrograde analysis, have been rediscovered independently. We then revisit Washburn’s deterministic graphical games (DGGs), a natural generalization of chess-like games to arbitrary zero-sum payoffs. We study the complexity of solving DGGs and obtain an almost-linear time comparison-based algorithm…

We revisit the bottomonium spectrum motivated by recent exciting experimental progress in the observation of new bottomonium states, both conventional and unconventional. Our framework is a nonrelativistic constituent quark model which has been applied to a wide range of hadronic observables from the light to the heavy quark sector, and thus the model parameters are completely constrained. Beyond the spectrum, we provide a large number of electromagnetic, strong and hadronic decays in order to discuss the quark content of the bottomonium states and to give more insight into how best to determine their properties experimentally.

We revisited the brachiopod fold hypothesis and investigated metamorphosis in the craniiform brachiopod Novocrania anomala. Larval development is lecithotrophic and the dorsal (brachial) valve is secreted by dorsal epithelia. We found that the juvenile ventral valve, which consists only of a thin … brachiopods during metamorphosis to cement their pedicle to the substrate. N. anomala is therefore not initially attached by a valve but by material corresponding to pedicle cuticle. This is different to previous descriptions, which had led to speculations about a folding event in the evolution of Brachiopoda…

With the increase in the number of digital crimes and in their sophistication, High Performance Computing (HPC) is becoming a must in Digital Forensics (DF). According to the FBI annual report, the size of data processed during the 2010 fiscal year reached 3,086 TB (compared to 2,334 TB in 2009), and the number of agencies that requested Regional Computer Forensics Laboratory assistance increased from 689 in 2009 to 722 in 2010. Since most investigation tools are both I/O and CPU bound, next-generation DF tools are required to be distributed and offer HPC capabilities. The need for HPC is even more evident when investigating crimes on clouds or when proactive DF analysis and on-site investigation, requiring semi-real-time processing, are performed. Although overcoming the performance challenge is a major goal in DF, as far as we know there is almost no research on HPC-DF except for a few papers. As such, in this work we extend our work on the need for a proactive system and present a high-performance automated proactive digital forensic system. The most expensive phase of the system, namely proactive analysis and detection, uses a parallel extension of the iterative z algorithm. It also implements new parallel information-based outlier detection algorithms to proactively and forensically handle suspicious activities. To analyse a large number of targets and events, and to do so continuously (to capture the dynamics of the system), we rely on a multi-resolution approach to explore the digital forensic space. A data set from the Honeynet Forensic Challenge in 2001 is used to evaluate the system from DF and HPC perspectives.
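The detection step can be illustrated with a plain iterative z-score outlier pass: flag points far from the mean, recompute the statistics on the survivors, and repeat. This is a simple stand-in for the paper's detection phase; the actual "iterative z algorithm" and its parallel extension are not shown here:

```python
import numpy as np

def zscore_outliers(x, threshold=3.0, max_iter=5):
    """Iteratively flag points more than `threshold` standard deviations
    from the mean, recomputing mean/std on the surviving points each
    pass (a generic sketch, not the paper's algorithm)."""
    x = np.asarray(x, dtype=float)
    keep = np.ones(len(x), dtype=bool)
    for _ in range(max_iter):
        mu, sigma = x[keep].mean(), x[keep].std()
        new_keep = np.abs(x - mu) <= threshold * sigma
        if np.array_equal(new_keep, keep):   # converged
            break
        keep = new_keep
    return np.flatnonzero(~keep)             # indices of flagged points

rng = np.random.default_rng(2)
data = rng.normal(0.0, 1.0, 1000)
data[123] = 15.0                             # an injected anomalous event
assert 123 in zscore_outliers(data)
```

Parallelizing such a pass is straightforward (partition the data, reduce the partial sums for mu/sigma), which is what makes it attractive for semi-real-time proactive analysis.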

Approximately 80 percent of the world's population now owns a cell phone, which can hold evidence or contain logs about communications concerning a crime. Cameras, PDAs, and GPS devices can also contain information related to corporate policy infractions and crimes. Aimed at preparing investigators in the public and private sectors, Digital Forensics for Handheld Devices examines both the theoretical and practical aspects of investigating handheld digital devices. The book touches on all areas of mobile device forensics, including topics from the legal, technical, academic, and social aspects o

Animal abuse is an important social issue, covering a wide range of human behaviors that are harmful to animals, from unintentional neglect to intentional cruelty. Types of animal abuse differ and can include physical, sexual or emotional abuse, or neglect. Training dogs for fights and dog fighting are also considered forms of animal abuse. Forensic veterinarians are now more often called to testify, presenting evidence that can help make a case regarding animal abuse. This study includes an explanation of the forensic veterinarian's role and of the different types of animal abuse.

This paper discusses the nature of four waves of technological innovation in forensic genetics, alongside the social, legal and ethical aspects of these innovations. It emphasises the way in which technological advances and their socio-legal frameworks are co-produced, shaping technology expectations, social identities, and legal institutions. It also considers how imagined and actual uses of forensic genetic technologies are entangled with assertions about social order, affirmations of common values and civil rights, and promises about security and justice. Our comments seek to encourage…

From 1997 on, when the first "Jornadas Venezolanas de Sismicidad Historica" took place, great interest arose in Venezuela in organizing the available information on historical earthquakes. At that time only one published historical earthquake catalogue existed, that of Centeno Grau, first published in 1949; it contained no references to its sources of information. Other catalogues existed, but they were internal reports for petroleum companies and therefore difficult to access. In 2000, Grases et al. re-edited the Centeno-Grau catalogue, producing a new, very complete catalogue with all sources well referenced and updated. The next step in organizing historical seismicity data was the creation, from 2004 to 2008, of the STSHV (Sistema de Teleinformacion de Sismologia Historica Venezolana, http://sismicidad.hacer.ula.ve). The idea was to bring together all information about destructive historical earthquakes in Venezuela in one place on the internet, so it could be accessed easily by a broad public. There are two ways to access the system: by selecting an earthquake or a list of earthquakes, or by selecting an information source or a list of sources. For each earthquake there is a summary of general information plus additional materials: a list of the source parameters published by different authors, a list of intensities assessed by different authors, a list of information sources, a short text summarizing the historical situation at the time of the earthquake, and a list of pictures if available. There are search facilities for the seismic events, and dynamic maps can be created. The information sources are classified as: books, handwritten documents, transcriptions of handwritten documents, documents published in books, journals and congress proceedings, newspapers, seismological catalogues, and electronic sources. There are facilities to find specific documents or lists of documents with common characteristics.

An effective earthquake prediction method is still a chimera. What we can do at the moment, after the occurrence of a seismic event, is provide the maximum available information as soon as possible. This can help reduce the impact of the quake on the population and better organize rescue operations. This study strives to improve the evaluation of earthquake parameters shortly after the occurrence of a major earthquake, and the characterization of regional dependencies in real-time seismology. The recent earthquake experience from Tohoku (M 9.0, 11.03.2011) showed how an efficient earthquake early warning (EEW) system can inform numerous people, and thus potentially reduce economic and human losses, by distributing warning messages several seconds before the arrival of seismic waves. In the case of devastating earthquakes, the common communication channels can be overloaded or broken in the first minutes to days after the main shock. In such cases, precise knowledge of the macroseismic intensity distribution represents a decisive contribution to aid management and loss assessment. In this work, I focused on improving the adaptability of EEW systems (chapters 1 and 2) and on deriving a global relationship for converting peak ground motion into macroseismic intensity and vice versa (chapter 3). For EEW applications, chapter 1 presents an evolutionary approach to magnitude estimation for earthquake early warning based on real-time inversion of displacement spectra. The Spectrum Inversion (SI) method estimates magnitude and its uncertainty by inferring the shape of the entire displacement spectral curve from the part of the spectrum constrained by the available data. The method can be applied in any region without the need for calibration. SI magnitude and uncertainty estimates are updated each second following the initial P detection, and typically stabilize within 10 seconds of the initial earthquake detection.
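The abstract does not give the spectral model behind SI; a toy offline sketch, assuming a Brune-type displacement spectrum Ω(f) = Ω₀ / (1 + (f/f_c)²) and a simple grid search (both assumptions, not the authors' real-time inversion), illustrates the idea of inferring the whole spectral curve from a few parameters:

```python
def brune_spectrum(f, omega0, fc):
    # Far-field displacement amplitude spectrum (Brune-type model):
    # flat plateau omega0 below the corner frequency fc, f^-2 decay above.
    return omega0 / (1.0 + (f / fc) ** 2)

def fit_spectrum(freqs, amps):
    """Grid-search fit of (omega0, fc) to an observed displacement
    spectrum; a stand-in for the paper's real-time inversion."""
    best = None
    for fc in [0.1 * i for i in range(1, 200)]:
        # With fc fixed, the least-squares omega0 is a simple ratio.
        basis = [1.0 / (1.0 + (f / fc) ** 2) for f in freqs]
        num = sum(a * b for a, b in zip(amps, basis))
        den = sum(b * b for b in basis)
        omega0 = num / den
        err = sum((a - omega0 * b) ** 2 for a, b in zip(amps, basis))
        if best is None or err < best[0]:
            best = (err, omega0, fc)
    return best[1], best[2]

# Synthetic check: recover the parameters of a noise-free spectrum.
freqs = [0.1 * i for i in range(1, 100)]
amps = [brune_spectrum(f, omega0=2.0, fc=1.5) for f in freqs]
omega0_est, fc_est = fit_spectrum(freqs, amps)
```

In the real method the low-frequency plateau Ω₀ is what carries the magnitude information, and the fit is updated every second as more of the spectrum becomes constrained by data.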

The goal of this EU-funded project is to develop means and tools to produce a homogeneous European-Mediterranean seismic bulletin that could serve as a reference. The three main objectives are: 1) the definition of a unified magnitude scale for M > 3; 2) improved location of events, especially in border regions; and 3) improved rapid and regular data exchange within the European-Mediterranean region. The first step is to define a homogeneous and accurate magnitude estimation for the whole region of interest. Experience shows that the magnitudes reported by different institutes for a given event may differ by up to 1.5 units. Three different magnitude computations are applied to a reference data set of well-known events: an Lg-wave coda magnitude, a Richter local magnitude, and a moment magnitude scale. A comparison of the results is currently being carried out. The algorithm associated with the selected magnitude will be implemented locally on a set of stations. New velocity models for border regions are being developed from the analysis of the residuals of events recorded by permanent and temporary networks. The robustness and reliability of the 3D models versus the 1D model have been evaluated. The EMSC gathers, via e-mail, manually picked seismic phase arrival times, with or without associated locations, from about 50 seismological institutes of the European-Mediterranean region into a database. These bulletins are automatically merged by a single piece of software. The number of processed events is about 2000 per month and should grow significantly with larger input from the Middle East and Northern Africa. Events are then submitted to an automatic analysis of location reliability and, for dubious events, to manual reprocessing. To improve data exchange, the installation of autoDRM systems is promoted. (authors)
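As an illustration of one of the three scales, a Richter-style local magnitude is a log-amplitude plus an empirical distance correction. The coefficients below follow a common Hutton-and-Boore-style parameterization and are an assumption for illustration, not the project's actual calibration:

```python
import math

def richter_ml(amplitude_mm, distance_km):
    """Toy Richter local magnitude: log10 of the peak Wood-Anderson
    amplitude (mm) plus an empirical distance correction. The
    correction coefficients are illustrative, not a regional fit."""
    return (math.log10(amplitude_mm)
            + 1.11 * math.log10(distance_km / 100.0)
            + 0.00189 * (distance_km - 100.0)
            + 3.0)

# By construction, a 1 mm amplitude at 100 km gives ML = 3.0,
# and a tenfold amplitude increase adds one magnitude unit.
```

Differences in such empirical coefficients between institutes are one source of the up-to-1.5-unit magnitude discrepancies the project aims to eliminate.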

Fracturing and changes in the englacial macroscopic water content change the elastic bulk properties of ice bodies. Small seismic velocity variations, resulting from such changes, can be measured using a technique called coda-wave interferometry. Here, coda refers to the later-arriving, multiply scattered waves. Often, this technique is applied to so-called virtual-source responses, which can be obtained using seismic interferometry (a simple crosscorrelation process). Compared to other media (e.g., the Earth's crust), however, ice bodies exhibit relatively little scattering. This complicates the application of coda-wave interferometry to the retrieved virtual-source responses. In this work, we therefore investigate the applicability of coda-wave interferometry to virtual-source responses obtained using two alternative seismic interferometric techniques, namely, seismic interferometry by multidimensional deconvolution (SI by MDD), and virtual-reflector seismology (VRS). To that end, we use synthetic data, as well as active-source glacier data acquired on Glacier de la Plaine Morte, Switzerland. Both SI by MDD and VRS allow the retrieval of more accurate virtual-source responses. In particular, the dependence of the retrieved virtual-source responses on the illumination pattern is reduced. We find that this results in more accurate glacial phase-velocity estimates. In addition, VRS introduces virtual reflections from a receiver contour (partly) enclosing the medium of interest. By acting as a sort of virtual reverberation, the coda resulting from the application of VRS significantly increases seismic monitoring capabilities, in particular in cases where natural scattering coda is not available.
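The "simple crosscorrelation process" at the heart of seismic interferometry can be sketched in a few lines; the traces below are synthetic impulses, and real processing would add tapering, spectral whitening, and stacking over many sources:

```python
def crosscorrelate(a, b):
    """Full cross-correlation of two equal-length traces; the peak lag
    approximates the travel time of the virtual-source response
    between the two receivers."""
    n = len(a)
    lags = list(range(-(n - 1), n))
    out = []
    for lag in lags:
        s = 0.0
        for i in range(n):
            j = i + lag
            if 0 <= j < n:
                s += a[i] * b[j]
        out.append(s)
    return lags, out

# Two receivers recording the same impulsive wavelet with a
# 3-sample travel-time delay: the correlation peaks at that lag.
trace1 = [0, 0, 1, 0, 0, 0, 0, 0]
trace2 = [0, 0, 0, 0, 0, 1, 0, 0]
lags, cc = crosscorrelate(trace1, trace2)
peak_lag = lags[cc.index(max(cc))]
# peak_lag == 3
```

MDD and VRS replace this plain correlation with a deconvolution or reflector-synthesis step, which is what reduces the dependence on the source illumination pattern.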

Analysis of GPS measurements with a controlled laboratory system, built to simulate the ground motions caused by tectonic earthquakes and other transient geophysical signals such as glacial earthquakes, enables us to assess the technique of high-rate GPS. The root-mean-square (rms) position error of this system when undergoing realistic simulated seismic motions is 0.05 mm, with maximum position errors of 0.1 mm, thus providing "ground truth" GPS displacements. We have acquired an extensive set of high-rate GPS measurements while inducing seismic motions on a GPS antenna mounted on this system with a temporal spectrum similar to real seismic events. We found that, for a particular 15-min-long test event, the rms error of the 1-Hz GPS position estimates was 2.5 mm, with maximum position errors of 10 mm, and the error spectrum of the GPS estimates was approximately flicker noise. These results may, however, represent a best-case scenario, since they were obtained over a short (~10 m) baseline, thereby greatly mitigating baseline-dependent errors, and when the number and distribution of satellites in the sky were good. For example, we have determined that the rms error can increase by a factor of 2-3 as the GPS constellation changes throughout the day, with an average value of 3.5 mm for eight identical, hourly spaced, consecutive test events. The rms error also increases with increasing baseline, as one would expect, with an average rms error of 9 mm for a ~1400 km baseline. We will present an assessment of the accuracy of high-rate GPS based on these measurements, discuss the implications of this study for seismology, and describe new applications in glaciology.
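The rms position error quoted above is a straightforward statistic; a minimal sketch (function name illustrative) comparing GPS position estimates against the simulator's ground-truth displacements:

```python
import math

def rms_error(estimates, truth):
    """Root-mean-square error of GPS position estimates against
    ground-truth displacements from the motion simulator."""
    residuals = [e - t for e, t in zip(estimates, truth)]
    return math.sqrt(sum(r * r for r in residuals) / len(residuals))

# Perfect tracking gives zero error; a single 2 mm miss over two
# epochs gives sqrt(4/2) = sqrt(2) mm.
```

Because the simulator's own error (0.05 mm rms) is ~50x smaller than the GPS error (2.5 mm rms), the truth series can be treated as exact in this comparison.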

Full Text Available Cloud computing is a novel computing paradigm that presents new research opportunities in the field of digital forensics. Cloud computing is based on the following principles: on-demand self-service, broad network access, resource pooling, rapid...

Forensic entomology is a branch of forensic medicine that applies the study of insects and other arthropods to obtaining evidence for court; it offers a particular advantage in estimating the postmortem interval (PMI) and in answering other questions of forensic relevance. The paper sets out its definition and scope and reviews progress in several aspects of the field in China, such as the composition and succession of the insect community on different cadavers, the application of morphological features of insects and of deoxyribonucleic acid (DNA) analysis in forensic entomology, and forensic entomological toxicology.
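One standard PMI-bounding tool in forensic entomology is accumulated degree-hours (or degree-days): insect development is modeled as thermal energy accumulated above a species-specific threshold. The sketch below assumes a base temperature of 10 °C for illustration; real casework uses published development data for the species recovered:

```python
def accumulated_degree_hours(hourly_temps_c, base_temp=10.0):
    """Sum the hourly temperature excess above a developmental
    threshold; comparing this against a species' known ADH
    requirement bounds the post-mortem interval."""
    return sum(max(0.0, t - base_temp) for t in hourly_temps_c)

# Three hours at 20, 15 and 5 degrees C contribute 10 + 5 + 0 = 15 ADH.
adh = accumulated_degree_hours([20.0, 15.0, 5.0])
```

If a recovered larva's instar requires, say, 1500 ADH for that species, the scene's temperature record is integrated backwards until that total is reached, giving a minimum PMI.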

Abstract Real-time seismology is a newly developing alternative approach in seismology to mitigate earthquake hazard. It exploits up-to-date advances in seismic instrument technology, data acquisition, digital communications and computer systems to quickly transform data into earthquake information in real time, reducing earthquake losses and their impact on social and economic life in earthquake-prone, densely populated urban and industrial areas. Real-time seismology systems are not o...

Full Text Available The Arab Journal of Forensic Sciences and Forensic Medicine (AJFSFM) is a peer-reviewed, open access (CC BY-NC), international journal for publishing original contributions in various fields of forensic science. These fields include, but are not limited to: forensic pathology and histochemistry; toxicology (drugs, alcohol, etc.); forensic biology (serology, human DNA profiling, entomology, population genetics); forensic chemistry (inks, paints, dyes, explosives, fire accelerants); psychiatry and hypnotics; forensic anthropology and archaeology; forensic odontology; fingerprints and impressions; firearms and tool marks; white-collar crimes (counterfeiting, forgery and questioned documents); digital forensics and cyber-crimes; and criminal justice and crime scene investigation, as well as many other disciplines where science and medicine interact with the law.

Full Text Available Computer crime and computer-related incidents continue to grow in prevalence and frequency, resulting in losses of billions of dollars. To fight these crimes and frauds, it is urgent to develop digital forensics education programs that train a workforce able to investigate them efficiently and effectively. However, there is no standard to guide the design of a digital forensics curriculum for an academic program. In this research, we review the literature on digital forensics curriculum design and existing education programs. Both digital forensics educators and practitioners were surveyed, and the results were analyzed to determine what industry and law enforcement need. Based on the survey results and on what the industry certificate programs cover, we identified topics that should be covered in digital forensics courses. Finally, we propose six digital forensics courses, and their topics, that can be offered in both undergraduate and graduate digital forensics programs.

This paper presents the outcome of the first international forensic radiology and imaging research summit, organized by the International Society of Forensic Radiology and Imaging, the International Association of Forensic Radiographers, the National Institute of Justice of the United States of America, and the Netherlands Forensic Institute. During this meeting, an international and multidisciplinary panel of forensic scientists discussed the current state of science in forensic radiology and drafted a research agenda to further advance the field. Four focus areas for further research were identified: big data and statistics, identification and biological profiling, multimodal imaging, and visualization and presentation. This paper describes each of these research topics and thereby hopes to contribute to the development of this exciting new field of forensic medical science.

DNA analysis is frequently used to acquire information from biological material to aid enquiries associated with criminal offences, disaster victim identification and missing persons investigations. As the relevance and value of DNA profiling to forensic investigations has increased, so…

This article outlines how to incorporate argumentation into a forensic science unit using a mock trial. Practical details of the mock trial include: (1) a method of scaffolding students' development of their argument for the trial, (2) a clearly outlined set of expectations for students during the planning and implementation of the mock…

This activity introduces the science of "forensic palynology": the use of microscopic pollen and spores (also called "palynomorphs") to solve criminal cases. Plants produce large amounts of pollen or spores during reproductive cycles. Because of their chemical resistance, small size, and morphology, pollen and spores can be…

Since the 1980s, advances in DNA technology have revolutionized the scope and practice of forensic medicine. From the days of restriction fragment length polymorphisms (RFLPs) to short tandem repeats (STRs), the current focus is on the next generation genome sequencing. It has been almost a decad...

In this report we consider the following question: does a forensic expert need to know exactly how the evidential material was selected? We set up a few simple models of situations in which the way evidence is selected may influence its value in court. Although reality is far from a probabilistic
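The evidential value discussed in this report is conventionally expressed as a likelihood ratio. The sketch below shows the naive computation (all numbers are illustrative), which is exactly what selection effects of the kind the report models can render misleading:

```python
def likelihood_ratio(p_e_given_hp, p_e_given_hd):
    """Evidential value of evidence E: the ratio of its probability
    under the prosecution hypothesis Hp to that under the defence
    hypothesis Hd."""
    return p_e_given_hp / p_e_given_hd

def posterior_odds(prior_odds, lr):
    # Bayes' rule in odds form: posterior odds = prior odds x LR.
    return prior_odds * lr

# Illustrative numbers: a matching trait certain under Hp and with
# 1% population frequency under Hd, against 1-in-1000 prior odds.
lr = likelihood_ratio(1.0, 0.01)
odds = posterior_odds(1 / 1000, lr)
```

The report's point is that this calculation assumes the evidence reached the expert unconditionally; if, for example, the item was reported only because it was the best match among many candidates examined, the naive LR can overstate (or understate) the true evidential value.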

There are few well-designed studies of corrections or prison nursing roles. This study seeks to describe the role of forensic nurses in the United States who provide care in secure corrections and prison environments. National data detailing the scope of practice in secure environments are limited. This pencil-and-paper survey describes the roles of 180 forensic nurses from 14 states who work in secure environments. Descriptive statistics are utilized, and a repeated-measures ANOVA with post hoc analyses was implemented. These nurses were older than average but had 10 years or less of experience in forensic nursing practice. Two significant roles emerged: to "promote and implement principles that underpin effective quality and practice" and to "assess, develop, implement, and improve programs of care for individuals." Significant roles varied with the security classification of the unit or institution in which the nurses were employed. Access to information about these nurses and their practice was difficult in these closed systems. Minimal data are available nationally, indicating a need to collect additional data over time to examine changes in the role. It is through such developments that forensic nursing provided in secure environments will define its specialization and attract the attention it deserves.

If you are a digital forensics examiner daily involved in the acquisition and analysis of mobile devices and want to have a complete overview of how to perform your work on iOS devices, this book is definitely for you.

Provenance systems may be offered by modern workflow engines to collect metadata about data transformations at runtime. If combined with effective visualisation and monitoring interfaces, these provenance recordings can speed up the validation process of an experiment, suggesting interactive or automated interventions with immediate effects on the lifecycle of a workflow run. For instance, in the field of computational seismology, if we consider research applications performing long-lasting cross-correlation analysis and high-resolution simulations, the immediate notification of logical errors and rapid access to intermediate results can produce reactions that foster more efficient progress of the research. These applications are often executed in secured and sophisticated HPC and HTC infrastructures, highlighting the need for a comprehensive framework that facilitates the extraction of fine-grained provenance and the development of provenance-aware components, leveraging the scalability characteristics of the adopted workflow engines, whose enactment can be mapped to different technologies (MPI, Storm clusters, etc.). This work looks at the adoption of W3C-PROV concepts and data model within a user-driven processing and validation framework for seismic data, supporting also computational and data management steering. Validation needs to balance automation with user intervention, considering the scientist as part of the archiving process. Therefore, the provenance data is enriched with community-specific metadata vocabularies and control messages, making an experiment reproducible and its description consistent with the community's understanding. Moreover, it can contain user-defined terms and annotations. The current implementation of the system is supported by the EU-funded VERCE project (http://verce.eu). It provides, as well as the provenance generation mechanisms, a prototypal browser-based user interface and a web API built on top of a NoSQL store.
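A minimal provenance recorder in the W3C-PROV spirit described above might look like the following; the class name, relation names, and entity identifiers are illustrative only and do not reflect the VERCE framework's actual API:

```python
from datetime import datetime, timezone

class ProvRecorder:
    """Minimal W3C-PROV-flavoured recorder: logs which activity
    used or generated which data entity, with optional
    community-specific metadata attached to each record."""
    def __init__(self):
        self.records = []

    def log(self, activity, relation, entity, **metadata):
        # PROV relations restricted to the two used here for brevity.
        assert relation in ("used", "wasGeneratedBy")
        self.records.append({
            "activity": activity,
            "relation": relation,
            "entity": entity,
            "time": datetime.now(timezone.utc).isoformat(),
            **metadata,
        })

prov = ProvRecorder()
prov.log("crosscorrelation_run_1", "used", "trace:CH.PLM.HHZ")
prov.log("crosscorrelation_run_1", "wasGeneratedBy", "ccf:PLM-SGT",
         station_pair=("PLM", "SGT"))
```

In a real deployment these records would be serialized to a PROV-compliant format and pushed to the NoSQL store behind the web API, where the monitoring interface can query them while the workflow is still running.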

Much of our knowledge of Earth's interior is based on seismic observations and measurements. Adjoint methods provide an efficient way of incorporating 3D full wave propagation in iterative seismic inversions to enhance tomographic images, and thus our understanding of processes taking place inside the Earth. Our aim is to take adjoint tomography, which has been successfully applied to regional and continental scale problems, further, to image the entire planet. This is one of the extreme imaging challenges in seismology, mainly due to the intense computational requirements and the vast amount of high-quality seismic data that can potentially be assimilated. We have started low-resolution inversions (T > 30 s and T > 60 s for body and surface waves, respectively) with a limited data set (253 carefully selected earthquakes and seismic data from permanent and temporary networks) on Oak Ridge National Laboratory's Cray XK7 "Titan" system. Recent improvements in our 3D global wave propagation solvers, such as a GPU version of the SPECFEM3D_GLOBE package, will enable us to perform higher-resolution (T > 9 s) and longer-duration (~180 m) simulations to take advantage of high-frequency body waves and major-arc surface waves, thereby improving the imbalanced ray coverage that results from the uneven global distribution of sources and receivers. Our ultimate goal is to use all earthquakes in the global CMT catalogue within the magnitude range of our interest, and data from all available seismic networks. To take full advantage of computational resources, we need a solid framework to manage big data sets during numerical simulations, pre-processing (i.e., data requests and quality checks, processing data, window selection, etc.) and post-processing (i.e., pre-conditioning and smoothing kernels, etc.). We address the bottlenecks in our global seismic workflow, which mainly come from heavy I/O traffic during simulations and the pre- and post-processing stages, by defining new data

The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. The potential uses of Twitter for earthquake response include broadcasting earthquake alerts, rapidly detecting widely felt events, qualitatively assessing earthquake damage effects, communicating with the public, and participating in post-event collaboration. Several seismic networks and agencies are currently distributing Twitter earthquake alerts, including the European-Mediterranean Seismological Centre (@LastQuake), Natural Resources Canada (@CANADAquakes), and the Indonesian meteorological agency (@infogempabmg); the USGS will soon distribute alerts via the @USGSted and @USGSbigquakes Twitter accounts. Beyond broadcasting alerts, the USGS is investigating how to use tweets that originate near the epicenter to detect and characterize shaking events. This is possible because people begin tweeting immediately after feeling an earthquake, and their short narratives and exclamations are available for analysis within tens of seconds of the origin time. Using five months of tweets that contain the word "earthquake" and its equivalent in other languages, we generate a tweet-frequency time series. The time series clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a simple Short-Term-Average / Long-Term-Average algorithm similar to that commonly used to detect seismic phases. As with most auto-detection algorithms, the parameters can be tuned to catch more or fewer events at the cost of more or fewer false triggers. When tuned to a moderate sensitivity, the detector found 48 globally distributed, confirmed seismic events with only 2 false triggers, caused by a space-shuttle landing and "The Great California ShakeOut". This number of
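The Short-Term-Average / Long-Term-Average detector described above can be sketched directly on a tweet-count series; the window lengths, threshold, and counts below are illustrative, not the USGS's tuned values:

```python
def sta_lta_triggers(series, sta_len=3, lta_len=10, threshold=3.0):
    """Classic STA/LTA detector on a tweet-frequency time series:
    flag indices where the short-term average (the last sta_len
    samples) exceeds `threshold` times the long-term average of the
    lta_len samples preceding the short-term window."""
    triggers = []
    for i in range(sta_len + lta_len - 1, len(series)):
        sta = sum(series[i - sta_len + 1:i + 1]) / sta_len
        lta = sum(series[i - sta_len - lta_len + 1:i - sta_len + 1]) / lta_len
        if lta > 0 and sta / lta > threshold:
            triggers.append(i)
    return triggers

# Quiet background of ~2 tweets per interval, then a burst after a
# widely felt event starting at index 10.
counts = [2, 1, 2, 2, 1, 2, 2, 1, 2, 2, 40, 55, 30, 5, 2]
hits = sta_lta_triggers(counts)
# hits == [12, 13]
```

Raising the threshold suppresses false triggers like the ShakeOut drill at the cost of missing weaker felt events, which is exactly the tuning trade-off the abstract describes.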

, soil) adhering to a meteorite are samples of the actual physical environment in which the meteorite rested. Adhesion may derive from chemical cementation (incl. rust from the meteorite), biologic activity (incl. desert varnish?), or impact processes [2]. Given the wide diversity of geological materials and processes on the Earth, adhering geological materials may be useful forensic tools. For instance, a fall in a volcanic terrane would be inconsistent with adhering sediments of clean quartz sand. Biologic matter on meteorites includes animal and vegetable matter mixed with the adhering geological materials, lichens and other plants growing in place, and purposefully attached animal matter (e.g. insect eggs). The most useful biological data may be provided by pollen, which can often be referred unambiguously to genera and species of plants. For example, sediments adhering to meteorites from the central Nullarbor Plain (W. Australia) are different from sediments from the Plain's margin in S. Australia. Sediment on meteorites from the central Nullarbor (e.g. Mundrabilla) lacks quartz sand and consists almost entirely of clay-sized particles, consistent with derivation from the local saprolitic soil. Sediment on meteorites from the eastern Nullarbor (e.g. Hughes and Cook, S.A.) contains a significant fraction of quartz sand, 1/4- to 1/2-mm grains, probably blown from the Great Victoria Desert to the north and northwest. However, sedimentologic data alone may be misleading. For instance, sediments adhering to Nuevo Mercurio stones (H5; Zacatecas, Mexico) are clay-sized and lack coarser material. But sediment on Nuevo Mercurio (b), a ureilite found in the Nuevo Mercurio strewn field, consists of quartz sand and clay pellets, 1/4 to 1/2 mm in diameter. Clearly, local environments may affect the character of sediment adhering to a meteorite, and careful detailed study may be required to determine whether a meteorite has been transported. I am grateful to R. Farrell and D. New for

Ferdinand de Montessus de Ballore was one of the founders of scientific seismology, a pioneer at the same level as Perrey, Mallet, Milne and Omori. He became familiar with earthquakes and volcanoes in Central America (1881-1885), and after his experience in El Salvador his interest in understanding earthquakes and volcanoes shaped the rest of his life. Back in France he compiled a most complete world catalogue of earthquakes, with 170,000 events (1885-1907), and finished his career as head of the Chilean Seismological Service (1907-1923). Many of his ideas anticipated later discoveries. He was an exceptional writer and published more than 30 books and hundreds of papers.

Glacier sliding plays a central role in ice dynamics. A number of remote sensing and deep drilling initiatives have therefore focused on the ice-bed interface. Although these techniques have provided valuable insights into bed properties, they do not supply theorists with data of sufficient temporal and spatial resolution to rigorously test mathematical sliding laws. As an alternative, passive seismic techniques have gained popularity in glacier monitoring. Analysis of glacier-related seismic sources ('icequakes') has become a useful technique for studying inaccessible regions of the cryosphere, including the ice-bed interface. Seismic monitoring networks on the polar ice sheets have shown that ice sliding is not only a smooth process involving viscous deformation and regelation of basal ice layers. Instead, ice streams exhibit sudden slip episodes over their beds and intermittent phases of partial or complete stagnation. Here we discuss new and recently published discoveries of basal seismic sources beneath various glacial bodies. We revisit the basal seismicity of hard-bedded Alpine glaciers, which is not the result of pure stick-slip motion. Sudden changes in seismicity suggest that the local configuration of the subglacial drainage system undergoes changes on sub-daily time scales. Accordingly, such observations place constraints on basal resistance and sliding of hard-bedded glaciers. In contrast, certain clusters of stick-slip dislocations associated with microseismicity beneath the Greenland ice sheet undergo diurnal variations in magnitudes and inter-event times. This is best explained by a soft till bed, which hosts the shear dislocations and whose strength varies in response to changes in subglacial water pressure. These results suggest that analysis of basal icequakes is well suited to characterizing glacier and ice sheet beds. Future studies should address the relative importance of "smooth" versus seismogenic sliding in different glacial environments.

Microorganisms have been used as weapons in criminal acts, most recently highlighted by the terrorist attack using anthrax in the fall of 2001. Although such "biocrimes" are few compared with other crimes, these acts raise questions about the ability to provide forensic evidence for criminal prosecution that can be used to identify the source of the microorganisms used as a weapon and, more importantly, the perpetrator of the crime. Microbiologists traditionally investigate the sources of microorganisms in epidemiological investigations, but have rarely been asked to assist in criminal investigations. A colloquium was convened by the American Academy of Microbiology in Burlington, Vermont, on June 7-9, 2002, in which 25 interdisciplinary expert scientists representing evolutionary microbiology, ecology, genomics, genetics, bioinformatics, forensics, chemistry, and clinical microbiology deliberated on issues in microbial forensics. The colloquium's purpose was to consider issues relating to microbial forensics, including detailed identification of a microorganism used in a bioattack and analysis of such a microorganism and related materials to identify its forensically meaningful source: the perpetrators of the bioattack. The colloquium examined the application of microbial forensics to assist in resolving biocrimes, with a focus on what research and education are needed to facilitate the use of microbial forensics in criminal investigations and the subsequent prosecution of biocrimes, including acts of bioterrorism. First responders must consider forensic issues, such as proper collection of samples to allow for optimal laboratory testing, along with maintaining a chain of custody that will support eventual prosecution. Because a biocrime may not be immediately apparent, a linkage must be made between routine diagnosis, epidemiological investigation, and criminal investigation. There is a need for establishing standard operating

The derivation of the life quality index (LQI) is revisited for a revision. This revision takes into account the unpaid but necessary work time needed to stay alive in clean and healthy conditions, to be fit for effective wealth-producing work, and to enjoy free time. Dimension analysis...... at birth should not vary between countries. Finally, the distributional assumptions are relaxed as compared to the assumptions made in an earlier work by the author. These assumptions concern the calculation of the life expectancy change due to the removal of an accident source. Moreover a simple public...... consistency problems with the standard power function expression of the LQI are pointed out. It is emphasized that the combination coefficient in the convex differential combination between the relative differential of the gross domestic product per capita and the relative differential of the expected life...

We revisit the quantum two-person duel. In this problem, Alice and Bob each possess a spin-1/2 particle which models the dead and alive states of each player. We review the Abbott and Flitney result, now considering non-zero α1 and α2 in order to decide whether it is better for Alice to shoot or not the second time, and we also consider a duel where players do not necessarily start alive. This simple assumption allows us to explore several interesting special cases, namely how a dead player can win the duel by shooting just once, how Bob can revive Alice after one shot, and the best strategy for Alice, being either alive or in a superposition of alive and dead states, when fighting a dead opponent. (paper)

In January 1994, the two geostationary satellites known as Anik-E1 and Anik-E2, operated by Telesat Canada, failed one after the other within 9 hours, leaving many northern Canadian communities without television and data services. The outage, which shut down much of the country's broadcast television for hours and cost Telesat Canada more than $15 million, generated significant media attention. Lam et al. used publicly available records to revisit the event; they looked at failure details, media coverage, recovery effort, and cost. They also used satellite and ground data to determine the precise causes of those satellite failures. The researchers traced the entire space weather event from conditions on the Sun through the interplanetary medium to the particle environment in geostationary orbit.

Purpose – The purpose of this paper is to learn more about logistics innovation processes and their implications for the focal organization as well as the supply chain, especially suppliers. Design/methodology/approach – The empirical basis of the study is a longitudinal action research project...... that was triggered by a hospital's practical need for new ways of handling material flows. This approach made it possible to revisit theory on the logistics innovation process. Findings – Apart from the tangible benefits reported to the case hospital, five findings can be extracted from this study: the logistics...... innovation process model may include not just customers but also suppliers; logistics innovation in buyer-supplier relations may serve as an alternative to outsourcing; logistics innovation processes are dynamic and may improve supplier partnerships; logistics innovations in the supply chain are as dependent...

Much effort and research has been invested into understanding and bridging the ‘gaps’ which many students experience in terms of contents and expectations as they begin university studies with a heavy component of mathematics, typically in the form of calculus courses. We have several studies...... of bridging measures, success rates and many other aspects of these “entrance transition” problems. In this paper, we consider the inverse transition, experienced by university students as they revisit core parts of high school mathematics (in particular, calculus) after completing the undergraduate...... mathematics courses which are mandatory to become a high school teacher of mathematics. To what extent does the “advanced” experience enable them to approach the high school calculus in a deeper and more autonomous way ? To what extent can “capstone” courses support such an approach ? How could it be hindered...

The successful practice of dentistry involves a good combination of technical skills and soft skills. Soft skills, or communication skills, are not taught extensively in dental schools, and they can be challenging to learn and to apply when treating dental patients. Guiding the child's behavior in the dental operatory is one of the preliminary steps to be taken by the pediatric dentist, and one who can successfully modify that behavior can pave the way for a lifetime of comprehensive oral care. This article is an attempt to revisit a simple behavior guidance technique, reframing, and to explain the possible psychological perspectives behind it for better use in clinical practice.

ObsPy (https://www.obspy.org) is a community-driven, open-source project dedicated to offering a bridge for seismology into the scientific Python ecosystem. Among other things, it provides: read and write support, through a unified interface, for essentially every commonly used data format in seismology, covering waveform data as well as station and event metadata; a signal-processing toolbox tuned to the specific needs of seismologists; integrated access to the largest data centers, web services, and databases; and wrappers around third-party codes such as libmseed and evalresp. Using ObsPy enables users to take advantage of the vast scientific ecosystem that has developed around Python. In contrast to many other programming languages and tools, Python is simple enough to enable the exploratory and interactive coding style desired by many scientists. At the same time it is a full-fledged programming language usable by software engineers to build complex and large programs. This combination makes it very suitable for use in seismology, where research code must often be translated into stable, production-ready environments, especially in the age of big data. ObsPy has seen constant development for more than six years and enjoys a high rate of adoption in the seismological community, with thousands of users. Successful applications include time-dependent and rotational seismology, big-data processing, event relocations, and synthetic studies of attenuation kernels and full-waveform inversions, to name a few examples. Additionally, it has sparked the development of several more specialized packages, slowly building a modern seismological ecosystem around it. We will present a short overview of the capabilities of ObsPy and point out several representative use cases and more specialized software built around ObsPy. Additionally we will discuss new and upcoming features, as well as the sustainability of open-source scientific software.
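
ObsPy's signal-processing toolbox includes event-detection routines such as the classic STA/LTA trigger (in `obspy.signal.trigger`). The sketch below is a minimal stand-alone re-implementation of that idea in plain NumPy, not ObsPy's own code, to illustrate the kind of processing the toolbox provides; the window lengths and the threshold of 4 are illustrative choices.

```python
import numpy as np

def classic_sta_lta(trace, nsta, nlta):
    """Classic STA/LTA characteristic function: ratio of the short-term to the
    long-term running mean of the squared amplitude (signal energy)."""
    sq = np.asarray(trace, dtype=float) ** 2
    csum = np.concatenate(([0.0], np.cumsum(sq)))
    sta = (csum[nsta:] - csum[:-nsta]) / nsta   # all short-window means
    lta = (csum[nlta:] - csum[:-nlta]) / nlta   # all long-window means
    cf = np.zeros_like(sq)
    # defined from the first sample where the long window fits completely
    cf[nlta - 1:] = sta[nlta - nsta:] / np.maximum(lta, 1e-20)
    return cf

# synthetic noise trace with a burst of energy standing in for an event
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 2000)
x[1200:1300] += rng.normal(0.0, 10.0, 100)

cf = classic_sta_lta(x, nsta=20, nlta=200)
onset = int(np.argmax(cf > 4.0))   # first sample exceeding the trigger threshold
print(onset)                       # close to the burst onset at sample 1200
```

In practice one would use ObsPy's tested implementation and tune the windows to the dominant signal period, but the cumulative-sum trick above is the core of the method.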

Two experiments were performed to test the relevance of bryophyte (Plantae, Bryophyta) material for forensic studies. The first experiment was conducted to reveal if, and how well, plant fragments attach to footwear in general. In the test, 16 persons walked outdoors wearing rubber boots or hiking boots. After 24 h of use outdoors the boots were carefully cleaned, and all plant fragments were collected. Afterwards, all plant material was examined to identify the species. In the second experiment, fresh material of nine bryophyte species was kept in a shed in adverse conditions for 18 months, after which DNA was extracted and subjected to genotyping to test the quality of the material. Both experiments support the usability of bryophyte material in forensic studies. The bryophyte fragments become attached to shoes, where they remain even after the wearer walks on a dry road for several hours. Bryophyte DNA stays intact, allowing DNA profiling after lengthy periods following detachment from the original plant source. Based on these experiments, and considering the fact that many bryophytes are clonal plants, we propose that bryophytes are among the most usable plants for providing botanical evidence in forensic investigations.

Environmental forensic microscopy investigations are based on the methods and procedures developed in the fields of criminal forensics, industrial hygiene and environmental monitoring. Using a variety of microscopes and techniques, the environmental forensic scientist attempts to reconstruct the sources and the extent of exposure based on the physical evidence left behind after particles are exchanged between an individual and the environments he or she passes through. This article describes how environmental forensic microscopy uses procedures developed for environmental monitoring, criminal forensics and industrial hygiene investigations. It provides key references to the interdisciplinary approach used in microscopic investigations. Case studies dealing with lead, asbestos, glass fibers and other particulate contaminants are used to illustrate how environmental forensic microscopy can be very useful in the initial stages of a variety of environmental exposure characterization efforts to eliminate some agents of concern and to narrow the field of possible sources of exposure.

Following the 2010 Nuclear Security Summit, Canada expanded its existing capability for nuclear forensics by establishing a national nuclear forensics laboratory network, which would include a capability to perform forensic analysis on nuclear and other radioactive material, as well as on traditional evidence contaminated with radioactive material. At the same time, the need for a national nuclear forensics library of signatures of nuclear and radioactive materials under Canadian regulatory control was recognized. The Canadian Safety and Security Program, administered by Defence Research and Development Canada's Centre for Security Science (DRDC CSS), funds science and technology initiatives to enhance Canada's preparedness for prevention of and response to potential threats. DRDC CSS, with assistance from Canadian Nuclear Laboratories, formerly Atomic Energy of Canada Limited, is leading the Canadian National Nuclear Forensics Capability Project to develop a coordinated, comprehensive, and timely national nuclear forensics capability. (author)

Forensic science, commonly referred to as forensics, is the examination of physical, biological, behavioural and documentary evidence. The goal of forensics is to discover linkages among people, places, things and events. A sub-discipline of forensic science, nuclear forensics is the analysis of intercepted illicit nuclear or radioactive material and any associated material, which can assist in law enforcement investigations as well as assessments of the potential vulnerabilities associated with the use, production and storage of these materials as part of a nuclear security infrastructure. The analysis of nuclear or other radioactive material seeks to identify what the materials are; how, when, and where the materials were made; and what their intended uses were. Nuclear forensics is an important tool in the fight against illicit trafficking in nuclear and radiological material.

This PhD thesis deals with statistical models intended for forensic genetics, the part of forensic medicine concerned with the analysis of DNA evidence from criminal cases, together with the calculation of alleged paternity and affinity in family reunification cases. The main focus of the thesis...... is on crime cases, as these differ from the other case types in that the biological material is often used for person identification rather than affinity. Common to all cases, however, is that the DNA is used as evidence in order to assess the probability of observing the biological material given different...... of the DNA evidence under competing hypotheses, the biological evidence may be used in the court's deliberation and trial on an equal footing with other evidence and expert statements. These probabilities are based on population genetic models whose assumptions must be validated. The thesis's first two articles...
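
The probabilities under competing hypotheses mentioned above are usually reported as a likelihood ratio, LR = P(E|Hp)/P(E|Hd). For a matching single-source profile under Hardy-Weinberg assumptions this reduces to the reciprocal of the multi-locus genotype frequency. A minimal sketch of that calculation follows; the allele frequencies are made-up illustrative numbers, and real forensic models add refinements such as subpopulation (theta) corrections:

```python
def genotype_freq(p, q=None):
    """Hardy-Weinberg genotype frequency: p**2 for a homozygote,
    2*p*q for a heterozygote."""
    return p * p if q is None else 2.0 * p * q

# hypothetical allele frequencies at three STR loci
# (p, q) for heterozygotes, (p, None) for a homozygote
loci = [(0.12, 0.08), (0.21, None), (0.05, 0.30)]

match_prob = 1.0
for p, q in loci:
    match_prob *= genotype_freq(p, q)   # product rule across independent loci

lr = 1.0 / match_prob   # P(E|Hp) = 1 for a matching single-source profile
print(f"random match probability: {match_prob:.3e}, LR: {lr:.3e}")
```

With more loci the match probability shrinks multiplicatively, which is why modern STR panels of 15-20 loci yield extremely large likelihood ratios.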

The emerging field of forensic herpetology is reviewed. This research focus, defined here as the application of science to studies of reptiles and amphibians when these animals become the subject of legal investigations, has gained increasing attention in recent years. A diverse range of experts contributes to methods in forensic herpetology, including forensic scientists, herpetologists, veterinarians, zookeepers, physicians, pathologists and toxicologists. The English language literature in ...

IT security and computer forensics are important components of information technology. In the present study, a client-side forensic analysis of Skype is performed. The study explains what kinds of user data are stored on a computer and which tools allow the extraction of those data for a forensic investigation. Both methods are described: a manual analysis, and an analysis with (mainly) open-source tools.
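
Legacy desktop Skype clients stored chat history in a SQLite database, commonly named `main.db`, which is why both manual inspection and open-source tools can recover it. The sketch below queries a mock database with that layout; the table and column names (`Messages`, `author`, `timestamp`, `body_xml`) follow the widely reported legacy schema but should be treated here as assumptions, not a definitive specification:

```python
import os
import sqlite3
import tempfile

def extract_messages(db_path):
    """Return (author, timestamp, body) rows from a Skype-style main.db."""
    con = sqlite3.connect(db_path)
    try:
        return con.execute(
            "SELECT author, timestamp, body_xml FROM Messages ORDER BY timestamp"
        ).fetchall()
    finally:
        con.close()

# Build a mock database so the sketch runs without touching real user data.
path = os.path.join(tempfile.mkdtemp(), "main.db")
con = sqlite3.connect(path)
con.execute("CREATE TABLE Messages (author TEXT, timestamp INTEGER, body_xml TEXT)")
con.executemany("INSERT INTO Messages VALUES (?, ?, ?)",
                [("alice", 1, "hello"), ("bob", 2, "hi")])
con.commit()
con.close()

print(extract_messages(path))
```

In a real investigation one would of course work on a write-protected forensic image of the file, never the live database.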

A comprehensive radiochemical isolation procedure and data analysis/interpretation method for the nuclear forensic investigation of Th has been developed. The protocol includes sample dissolution, chemical separation, nuclear counting techniques, consideration of isotopic parent-daughter equilibria, and data interpretation tactics. Practical application of the technology was demonstrated by analyses of a questioned specimen confiscated at an illegal drug synthesis laboratory by law enforcement authorities. (author)
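
The parent-daughter equilibria mentioned above are the basis of radiochronometry: if a daughter nuclide was completely removed at the last chemical purification, its subsequent ingrowth fixes a "model age" for the material. A minimal sketch of that calculation follows; the decay constants are illustrative (roughly the Th-232/Ra-228 pair), and a real analysis would propagate uncertainties and test the complete-purification assumption:

```python
import math

def daughter_parent_ratio(t, lam_p, lam_d):
    """Bateman ingrowth: daughter/parent atom ratio a time t (years) after a
    purification that removed all daughter atoms."""
    return (lam_p / (lam_d - lam_p)) * (1.0 - math.exp(-(lam_d - lam_p) * t))

def model_age(ratio, lam_p, lam_d, t_hi=200.0):
    """Invert the (monotonically increasing) ratio for elapsed time by bisection."""
    lo, hi = 0.0, t_hi
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if daughter_parent_ratio(mid, lam_p, lam_d) < ratio:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# illustrative decay constants (1/year), roughly Th-232 and Ra-228
lam_p = math.log(2) / 1.405e10
lam_d = math.log(2) / 5.75

r = daughter_parent_ratio(12.0, lam_p, lam_d)   # simulate a 12-year-old sample
print(round(model_age(r, lam_p, lam_d), 3))     # recovers the age in years
```

The measured isotope ratio thus dates the last purification event, which is exactly the kind of interpretive step a nuclear forensic protocol formalizes.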

Law practitioners are in an uninterrupted battle with criminals in the application of digital/computer technologies, and require the development of a proper methodology to systematically search digital devices for significant evidence. Computer fraud and digital crimes are growing day by day and, unfortunately, less than two percent of the reported cases result in conviction. This paper explores the development of the digital forensics process model, compares digital forensic methodologies, and fina...

We propose an architecture for an audit log system for forensic photography, which ensures that the chain of evidence of a photograph taken by a photographer at a crime scene is maintained from the point of image capture to its end application at trial. The requirements for such a system are specified and the results of experiments are presented which demonstrate the feasibility of the proposed approach.
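
One common way to maintain such a chain of evidence is a hash-chained audit log, where each entry's digest covers the previous entry's digest, so any retroactive edit invalidates every later record. The following is a generic sketch of that idea, not the specific architecture proposed in the paper:

```python
import hashlib
import json

def append_entry(log, event):
    """Append an event whose hash covers the previous entry's hash."""
    prev = log[-1]["hash"] if log else "0" * 64
    payload = json.dumps(event, sort_keys=True)
    h = hashlib.sha256((prev + payload).encode()).hexdigest()
    log.append({"event": event, "prev": prev, "hash": h})

def verify(log):
    """Recompute the chain; any tampering breaks it from that point on."""
    prev = "0" * 64
    for entry in log:
        payload = json.dumps(entry["event"], sort_keys=True)
        expected = hashlib.sha256((prev + payload).encode()).hexdigest()
        if entry["prev"] != prev or entry["hash"] != expected:
            return False
        prev = entry["hash"]
    return True

log = []
append_entry(log, {"action": "capture", "image": "IMG_0001", "who": "photographer"})
append_entry(log, {"action": "transfer", "image": "IMG_0001", "who": "lab"})
ok_before = verify(log)

log[0]["event"]["who"] = "someone-else"   # tamper with the first record
ok_after = verify(log)
print(ok_before, ok_after)
```

Production systems would additionally sign entries and anchor the chain head in external trusted storage, so the log itself cannot simply be regenerated by an attacker.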

Animal abuse is an important social issue that includes a wide range of human behaviors harmful to animals, ranging from unintentional neglect to intentional cruelty. Types of animal abuse differ and can include physical, sexual, or emotional abuse, or neglect. Training dogs for fights and dog fighting are also considered forms of animal abuse. Forensic veterinarians are now called on more often to testify and present evidence that can lead to making a case rega...

Forensic entomology is the science of collecting and analysing insect evidence to aid in forensic investigations. Its main application is in the determination of the minimum time since death in cases of suspicious death, either by estimating the age of the oldest necrophagous insects that developed on the corpse, or by analysing the insect species composition on the corpse. In addition, toxicological and molecular examinations of these insects may help reveal the cause of death or even the identity of a victim, by associating a larva with its last meal, for example, in cases where insect evidence is left at a scene after human remains have been deliberately removed. Some fly species can develop not only on corpses but on living bodies too, causing myiasis. Analysis of larvae in such cases can demonstrate the period of neglect of humans or animals. Without the appropriate professional collection of insect evidence, an accurate and convincing presentation of such evidence in court will be hampered or even impossible. The present paper describes the principles and methods of forensic entomology and the optimal techniques for collecting insect evidence.
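
Age estimation of necrophagous larvae is commonly done with accumulated degree days (ADD): development to a given stage requires a roughly fixed thermal sum above a species-specific base temperature. The sketch below walks backward from the day of discovery until that requirement is met; the 150 ADD requirement, the 10 °C base, and the temperature series are hypothetical numbers for illustration only:

```python
def accumulated_degree_days(temps, base=10.0):
    """Sum of daily mean temperatures above a developmental base temperature."""
    return sum(max(t - base, 0.0) for t in temps)

def min_pmi_days(required_add, daily_means, base=10.0):
    """Walk backward from discovery, accumulating degree days until the
    species' developmental requirement for the observed stage is met."""
    total, days = 0.0, 0
    for t in reversed(daily_means):
        total += max(t - base, 0.0)
        days += 1
        if total >= required_add:
            return days
    return None  # temperature record too short to reach the requirement

# hypothetical: species needs 150 ADD (base 10 C) to reach the observed stage;
# daily mean temperatures (C), oldest first, ending on the day of discovery
daily_means = [18, 20, 22, 21, 19, 17, 16, 20, 23, 25, 24, 22, 21, 20, 19]
print(min_pmi_days(150.0, daily_means))   # minimum PMI in days for these data
```

Real casework uses published developmental datasets for the identified species and corrects scene temperatures against nearby weather-station records.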

DNA fingerprinting, one of the great discoveries of the late 20th century, has revolutionized forensic investigations. This review briefly recapitulates 30 years of progress in forensic DNA analysis which helps to convict criminals, exonerate the wrongly accused, and identify victims of crime, disasters, and war. Current standard methods based on short tandem repeats (STRs) as well as lineage markers (Y chromosome, mitochondrial DNA) are covered and applications are illustrated by casework examples. Benefits and risks of expanding forensic DNA databases are discussed and we ask what the future holds for forensic DNA fingerprinting.

Works cited in six forensic psychology journals published 2008-2010 were counted to identify the most frequently cited journals. The sample of works cited (N = 21,776) was not a definitive ranked list of important journals in forensic psychology, but was large enough to indicate high-impact journals. The list of frequently cited publications included more general psychiatry and psychology journals than titles specific to forensic psychology. The implications of the proportion of general versus specific titles for collections supporting research in forensic psychology were discussed.

The Network of Research Infrastructures for European Seismology (NERIES) is a European Commission (EC) project whose focus is networking seismological observatories and research institutes into one integrated European infrastructure that provides access to data and data products for research. Seismological institutes and organizations in European and Mediterranean countries maintain large, geographically distributed data archives; this scenario therefore suggested a design approach bas...

The UK School Seismology Project started in 2007. King Edward VI High School for Girls was one of the fortunate schools to obtain a school seismometer system, free of charge, as an early adopter of the resource. This report outlines our experiences with the system over the past 10 years and describes our recent research on the relationship between…

In order to assess the problems which might arise from monitoring a comprehensive test ban treaty by seismological methods, an experimental monitoring operation is being conducted. This work has involved the establishment of a database on the Rutherford Laboratory 360/195 system computer. The database can be accessed in the UK over the public telephone network and in the USA via ARPANET. (author)

The SGRAPH program is a seismological application for maintaining seismic data. SGRAPH is unique in being able to read a wide range of data formats and in providing complementary tools for different seismological subjects in a stand-alone, Windows-based application. SGRAPH efficiently performs basic waveform analysis and solves advanced seismological problems. The graphical user interface (GUI) utilities and Windows facilities, such as dialog boxes, menus, and toolbars, simplify the user's interaction with the data. SGRAPH supports common data formats such as SAC, SEED, GSE, ASCII, and the Nanometrics Y-format, among others. It provides facilities for solving many seismological problems with its built-in inversion and modeling tools. In this paper, I discuss some of the inversion tools built into SGRAPH related to source parameters and hypocentral location estimation. Firstly, a description of the SGRAPH program is given, discussing some of its features. Secondly, the inversion tools are applied to selected events of the Dahshour earthquakes as an example of estimating the spectral and source parameters of local earthquakes. In addition, the hypocentral locations of these events are estimated using the Hypoinverse 2000 program operated by SGRAPH.
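
The spectral source parameters mentioned here rest on standard relations: a moment magnitude from the seismic moment (IASPEI convention, Mw = (2/3)(log10 M0 - 9.1) with M0 in N·m) and, in the Brune model, a source radius and stress drop from the spectral corner frequency. A small sketch with hypothetical input values, not SGRAPH's own code:

```python
import math

def moment_magnitude(m0):
    """IASPEI standard moment magnitude; m0 is the seismic moment in N*m."""
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)

def brune_source(m0, fc, beta=3500.0):
    """Brune model: source radius r = 2.34*beta / (2*pi*fc) and
    stress drop = (7/16) * M0 / r**3, all in SI units."""
    r = 2.34 * beta / (2.0 * math.pi * fc)
    stress_drop = (7.0 / 16.0) * m0 / r ** 3
    return r, stress_drop

m0 = 3.2e15                      # N*m, a hypothetical small local event
mw = moment_magnitude(m0)
r, dsigma = brune_source(m0, fc=2.5, beta=3500.0)
print(round(mw, 2), round(r), round(dsigma / 1e6, 2))   # Mw, radius (m), stress drop (MPa)
```

Estimating M0 and fc themselves requires fitting the observed displacement spectrum, which is where inversion tools of the kind described above come in.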

We have initiated a community platform (http://www.seismo-live.org) where Python-based Jupyter notebooks (https://jupyter.org) can be accessed and run without any downloads or local software installations. The increasingly popular Jupyter notebooks allow the combination of markup language, graphics, and equations with interactive, executable Python code examples. Jupyter notebooks are a powerful and easy-to-grasp tool for students to develop entire projects, for scientists to collaborate and efficiently interchange evolving workflows, and for trainers to develop practical material. Utilizing the tmpnb project (https://github.com/jupyter/tmpnb), we link the power of Jupyter notebooks with an underlying server, such that notebooks can be run from anywhere, even on smart phones. We demonstrate the potential with notebooks for 1) learning the programming language Python, 2) basic signal processing, 3) an introduction to the ObsPy library (https://obspy.org) for seismology, 4) seismic noise analysis, 5) an entire suite of notebooks for computational seismology (the finite-difference method, pseudospectral methods, finite/spectral element methods, the finite-volume and discontinuous Galerkin methods, Instaseis), 6) rotational seismology, 7) making results in papers fully reproducible, 8) a rate-and-state friction toolkit, and 9) glacial seismology. The platform is run as a community project using GitHub. Submission of complementary Jupyter notebooks is encouraged. Extensions in the near future include linear(-ized) and nonlinear inverse problems.
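
To give a flavour of the computational seismology material, the core of a finite-difference notebook fits in a few lines. This is a minimal, independent sketch of the 1D acoustic wave equation (u_tt = c² u_xx) with second-order central differences and fixed boundaries, not the seismo-live notebook code itself:

```python
import numpy as np

nx, nt = 400, 600
dx, c = 1.0, 1.0
dt = 0.5 * dx / c                 # CFL number 0.5, stable for CFL <= 1

u = np.zeros(nx)
u[nx // 2] = 1.0                  # initial displacement spike at the center
u_prev = u.copy()                 # zero initial velocity

for _ in range(nt):
    lap = np.zeros(nx)
    lap[1:-1] = u[2:] - 2.0 * u[1:-1] + u[:-2]    # discrete second derivative
    u_next = 2.0 * u - u_prev + (c * dt / dx) ** 2 * lap
    u_prev, u = u, u_next         # advance the two-level time stencil

print(float(np.max(np.abs(u))))   # field remains bounded: the scheme is stable
```

The interactive notebooks wrap exactly this kind of kernel with plots, exercises, and derivations of the stability condition.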

The Arab Society for Forensic Sciences and Forensic Medicine (ASFSFM) at Naif Arab University for Security Sciences seeks to present the latest developments in all fields of forensic sciences through holding specialized scientific events and academic activities. This is also achieved through its periodic, peer-reviewed scientific journal, the Arab Journal of Forensic Sciences and Forensic Medicine. It also seeks to promote scientific research in all fields of forensic science and forensic medicine, and actively contributes to holding scientific meetings in accordance with advanced scientific standards, including the 3rd International Arab Forensic Sciences & Forensic Medicine Conference. This important event was attended by scientists and experts from various fields of criminal and forensic sciences from both Arab and non-Arab countries. The conference was a significant scientific accomplishment that contributed to the advancement of forensic sciences and forensic medicine in the Arab world. It aimed, in accordance with the vision of Naif Arab University for Security Sciences, to enhance peace, security and justice in Arab societies. Naif Arab University for Security Sciences, represented by the Arab Society for Forensic Sciences and Forensic Medicine, held the 3rd International Arab Forensic Sciences & Forensic Medicine Conference on the University's campus from 21st to 23rd November 2017. The event included the participation of more than 720 experts in forensic sciences and forensic medicine from 33 countries all over the world. Experts discussed and presented the latest developments in their fields. The conference provided a creative environment for students from both local and international universities to benefit from experts and specialists, and to access the most recent research. On behalf of His Excellency the president of Naif Arab University for Security Sciences, and the Arab Society for

Forensic entomology is the study of insects/arthropods in criminal investigation. Insects are attracted to a decomposing body right from the early stages of decomposition and may lay eggs in it. By studying the insect population and the developing larval stages, forensic scientists can estimate the postmortem interval, any change in position of the corpse, as well as the cause of death. Forensic odontologists are called upon more frequently to collaborate in criminal investigations and hence should be aware of the possibilities that forensic entomology has to offer and use it as an adjunct to conventional means of forensic investigation.

Since 2014, the Seismology Division (SM) of the European Geosciences Union (EGU) has had an Early Career Scientist (ECS) representative to reach out to its numerous 'younger' members. In April 2016, a new team of representatives joined the Division. We are a vibrant team of early career scientists, comprising both PhD students and post-doctoral researchers working in different seismological disciplines and different countries. The initiatives of the SM ECS-rep team have various aims: (1) to motivate ECSs to get involved in activities and initiatives of the EGU and the Seismology Division, (2) to promote the research of ECSs, (3) to discuss issues concerning seismologists during this particular stage of their career, (4) to share ideas on how to promote equality between scientists and (5) to improve the public dissemination of scientific knowledge. In an effort to reach out to experienced and ECS seismologists more effectively and to continuously encourage them to voice their ideas by contributing to and following our initiatives, the team runs a blog and social media pages dedicated to seismology and earthquake trivia. Weekly posts are published on the blog and shared on social media regarding scientific and social aspects of seismology. One of the major contributions recently introduced to the blog is the "Paper of the Month" series, where experienced seismologists write about recent or classical must-read seismology articles. We also aim to organise and promote social and scientific events. During the EGU General Assembly 2016 a social event was held in Vienna allowing ECSs to network with peers in an informal environment. Given the success of this event, a similar event will be organized during the General Assembly 2017. Also, as in previous years, a short course on basic seismology for non-seismologists will be requested and offered to all ECSs attending the General Assembly. Finally, a workshop dedicated entirely to ECS seismologists

The central problem investigated in this thesis is nuclear forensic support in Sudan. The thesis comprises five chapters and begins with an introduction to forensic science, stating its importance in criminal investigations. Forensic science is defined and its underlying principles are stated, including the principle of individuality and the principle of exchange, and the divisions of the science are clarified. The thesis then discusses the crime scene and the collection of evidence, where forensic science work begins, clarifying the principles of crime scene investigation. Nuclear and other radioactive material is discussed: radioactivity is defined together with material sources, which are placed into three general categories: special nuclear materials, reactor fuel, and commercial radioactive sources, and each category and its characteristics are described. It is clarified that radiation is part of our environment, and its effects on organisms and populations are discussed. Nuclear forensics is presented, along with how the problem of safeguarding nuclear material began. The emerging nature of the problem is discussed, radiological crime scene management is explained, and the importance of securing the scene is illustrated with examples of equipment and instruments for on-scene radiation safety assessment, as well as how evidence is collected. Storage and forensic laboratory analysis are discussed, including how a designated nuclear forensic laboratory is set up, together with nuclear forensic interpretation and the chain of custody. The role of the regulatory authority in nuclear forensic support is discussed, specifically in Sudan, and international cooperation is also covered; a memorandum of understanding between SNRRA and the administration of forensic evidence is mentioned, one result of which is the radiological surveys unit in the forensic administration, including how the unit is configured and its role. Finally, the conclusion of the research was

One of the main objectives of Naif Arab University for Security Sciences (NAUSS) is to enhance peace, security, and justice in Arab societies through education, research, and advanced professional training in various disciplines of security and forensic sciences. NAUSS strives to improve the academic and professional skills of forensic scientists and security personnel to combat crime and terrorism by utilizing all the available tools of modern technology. NAUSS also realizes the importance of scientific research in the social, economic, and technological development of a society and is, therefore, committed to encouraging and supporting research at every level. NAUSS has given the fields of forensic sciences and forensic medicine a top priority and the attention they deserve. In pursuit of its objectives, and in cooperation with other Arab member organizations, NAUSS launched the Arab Society for Forensic Sciences and Forensic Medicine (ASFSFM) in 2013. The Society had the honour of being officially launched by His Royal Highness, Prince Mohammed bin Naif bin Abdul Aziz, Crown Prince, Deputy Prime Minister and Minister of the Interior, Honorary President of the Council of Arab Ministers of Interior and Chairman of the Supreme Council of NAUSS. The 2nd Arab Forensic Science & Forensic Medicine Meeting (ASFSFM Meeting 2016) was yet another part of the efforts and concern of NAUSS to advance the skills and knowledge of Arab specialists and to facilitate cooperation among forensic scientists and institutions engaged in the practice, education and research of forensic sciences and forensic medicine at various levels.

Digital forensics and multimedia forensics are rapidly growing disciplines whereby electronic information is extracted and interpreted for use in a court of law. These two fields are finding increasing importance in law enforcement and the investigation of cybercrime as the ubiquity of personal computing and the internet becomes ever-more apparent. Digital forensics involves investigating computer systems and digital artefacts in general, while multimedia forensics is a sub-topic of digital forensics focusing on evidence extracted from both normal computer systems and special multimedia devices, such as digital cameras. This book focuses on the interface between digital forensics and multimedia forensics, bringing two closely related fields of forensic expertise together to identify and understand the current state-of-the-art in digital forensic investigation. Both fields are expertly attended to by contributions from researchers and forensic practitioners specializing in diverse topics such as forensic aut...

Although numerous seismological programs are currently available, most of them suffer from the inability to manipulate different data formats and the lack of embedded seismological tools. SeismoGRAPHer, or simply SGRAPH, is a new system for maintaining and analyzing seismic waveform data in a stand-alone, Windows-based application that manipulates a wide range of data formats. SGRAPH was intended to be a tool sufficient for performing basic waveform analysis and solving advanced seismological problems. The graphical user interface (GUI) utilities and the Windows functionalities, such as dialog boxes, menus, and toolbars, simplify the user interaction with the data. SGRAPH supports common data formats, such as SAC, SEED, GSE, ASCII, and Nanometrics Y-format, and provides the ability to solve many seismological problems with built-in inversion tools. Loaded traces are maintained, processed, plotted, and saved in the SAC, ASCII, or PS (PostScript) file formats. SGRAPH includes Generalized Ray Theory (GRT), genetic algorithm (GA), least-squares fitting, auto-picking, fast Fourier transforms (FFT), and many additional tools. This program provides rapid estimation of earthquake source parameters, location, attenuation, and focal mechanisms. Advanced waveform modeling techniques are provided for crustal structure and focal mechanism estimation. SGRAPH has been employed in the Egyptian National Seismic Network (ENSN) as a tool assisting with routine work and data analysis. More than 30 users have been using previous versions of SGRAPH in their research for more than 3 years. The main features of this application are ease of use, speed, small disk space requirements, and the absence of third-party developed components. Because of its architectural structure, SGRAPH can be interfaced with newly developed methods or applications in seismology. A complete setup file, including the SGRAPH package with the online user guide, is available.

The Incorporated Research Institutions for Seismology's Education and Public Outreach (EPO) program is committed to advancing awareness and understanding of seismology and geophysics, while inspiring careers in the Earth sciences. To achieve this mission, IRIS EPO combines content and research expertise of consortium membership with educational and outreach expertise of IRIS staff to create a portfolio of programs, products, and services that target a range of audiences, including grades 6-12 students and teachers, undergraduate and graduate students, faculty, and the general public. IRIS also partners with UNAVCO and other organizations in support of EarthScope where the facilities are well-suited for sustained engagement of multiple audiences. Examples of research-related EPO products and services include the following resources. Tools developed in collaboration with IRIS Data Services provide public and educational access to data, and to a suite of data products. Teachers can stream seismic data from educational or research sensors into their classroom, and the Active Earth Monitor display, designed for visitor centers, universities and small museums, provides views of recent data along with animations that explain seismology concepts, and stories about recent research. Teachable Moment slide sets, created in collaboration with the University of Portland within 24 hours of major earthquakes, provide interpreted USGS tectonic maps and summaries, animations, visualizations, and other event-specific information so educators can explore newsworthy earthquakes with their students. Intro undergraduate classroom activities have been designed to introduce students to some grand challenges in seismological research, while our Research Experiences for Undergraduates program pairs students with seismology researchers throughout the Consortium and provides the opportunity for the students to present their research at a national meeting. EPO activities are evaluated via a

The European Geosciences Union is a bottom-up organisation, in which its members are represented by their respective scientific divisions, committees and council. In recent years, EGU has embarked on a mission to reach out to its numerous 'younger' members by giving awards to outstanding young scientists and by setting up Early Career Scientists (ECS) representatives. The division representative's role is to engage in discussions that concern students and early career scientists. Several meetings between all the division representatives are held throughout the year to discuss ideas and Union-wide issues. One important impact ECS representatives have had on EGU is the increased number of short courses and workshops run by ECS during the annual General Assembly. Another important contribution of ECS representatives was redefining 'Young Scientist' as 'Early Career Scientist', which avoids discrimination due to age. Since 2014, the Seismology Division has had its own ECS representative. In an effort to reach out more effectively to young seismologists, a blog and a social media page dedicated to seismology have been set up online. With this dedicated blog, we'd like to give more depth to the average browsing experience by enabling young researchers to explore various seismology topics in one place while making the field more exciting and accessible to the broader community. These pages are used to promote the latest research, especially of young seismologists, and to share interesting seismo-news. Over the months, the pages have proved popular, with hundreds of views every week and an increasing number of followers. An online survey was conducted to learn more about the activities and needs of early career seismologists. We present the results from this survey, and the work that has been carried out over the last two years, including details of what has been achieved so far, and what we would like the ECS representation for Seismology to achieve. Young seismologists are

The neutron population in a prototype model of nuclear reactor can be described in terms of a collection of particles confined in a box and undergoing three key random mechanisms: diffusion, reproduction due to fissions, and death due to absorption events. When the reactor is operated at the critical point, and fissions are exactly compensated by absorptions, the whole neutron population might in principle go to extinction because of the wild fluctuations induced by births and deaths. This phenomenon, which has been named the critical catastrophe, is nonetheless never observed in practice: feedback mechanisms acting on the total population, such as human intervention, have a stabilizing effect. In this work, we revisit the critical catastrophe by investigating the spatial behaviour of the fluctuations in a confined geometry. When the system is free to evolve, the neutrons may display a wild patchiness (clustering). On the contrary, imposing a control on the total population also acts against the local fluctuations, and may thus inhibit the spatial clustering. The effectiveness of population control in quenching spatial fluctuations will be shown to depend on the competition between the mixing time of the neutrons (i.e. the average time taken for a particle to explore the finite viable space) and the extinction time.
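The birth-death mechanism behind the critical catastrophe can be sketched as a critical branching (Galton-Watson) process. The toy simulation below ignores diffusion and geometry entirely (so it only illustrates the extinction effect, not the spatial clustering studied in the paper); each neutron either fissions into two neutrons or is absorbed, with probabilities chosen so the mean population is exactly conserved:

```python
import random

def run_critical_reactor(n0=10, generations=500, p_fission=0.5, nu=2, rng=None):
    """Critical branching process: each neutron fissions into `nu` neutrons
    (prob p_fission) or is absorbed (prob 1 - p_fission). With
    p_fission * nu = 1 the *mean* population is constant, yet random
    births/deaths drive most realizations to extinction."""
    rng = rng or random.Random(0)
    n = n0
    for _ in range(generations):
        if n == 0:
            break  # extinct: no neutrons left to regenerate the population
        n = sum(nu for _ in range(n) if rng.random() < p_fission)
    return n

rng = random.Random(42)
trials = 100
extinct = sum(run_critical_reactor(rng=rng) == 0 for _ in range(trials))
print(extinct / trials)  # close to 1: most trials go extinct despite the constant mean
```

A population-control feedback (e.g. renormalizing the total population each generation) would suppress this extinction, which is the regime the paper contrasts against.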

Consideration of core polarization, isobar currents and meson-exchange processes gives a satisfactory understanding of the ground-state magnetic moments in closed-shell-plus (or minus)-one nuclei, A = 3, 15, 17, 39 and 41. Ever since the earliest days of the nuclear shell model the understanding of magnetic moments of nuclear states of supposedly simple configurations, such as doubly closed LS shells ±1 nucleon, has been a challenge for theorists. The experimental moments, which in most cases are known with extraordinary precision, show a small yet significant departure from the single-particle Schmidt values. The departure, however, is difficult to evaluate precisely since, as will be seen, it results from a sensitive cancellation between several competing corrections, each of which can be as large as the observed discrepancy. This, then, is the continuing fascination of magnetic moments. In this contribution, we revisit the subject principally to identify the role played by isobar currents, which are of much concern at this conference. But in so doing we warn quite strongly of the dangers of considering just isobar currents in isolation; equal consideration must be given to competing processes, which in this context are the mundane nuclear structure effects, such as core polarization, and the more popular meson-exchange currents.
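For reference, the single-particle Schmidt values mentioned above follow from the standard textbook formulas; the short sketch below uses the free-nucleon g-factors (g_l = 1, g_s = 5.586 for a proton; g_l = 0, g_s = -3.826 for a neutron) and is only an illustration of the baseline the corrections are measured against:

```python
def schmidt_moment(l, j, proton=True):
    """Single-particle (Schmidt) magnetic moment in nuclear magnetons
    for an orbital with quantum numbers l and j = l +/- 1/2."""
    g_l, g_s = (1.0, 5.586) if proton else (0.0, -3.826)
    if abs(j - (l + 0.5)) < 1e-9:
        # stretched coupling, j = l + 1/2
        return g_l * l + 0.5 * g_s
    # anti-aligned coupling, j = l - 1/2
    return j / (j + 1.0) * (g_l * (l + 1.0) - 0.5 * g_s)

# 17O: odd neutron in d5/2 -> Schmidt value -1.913 (experiment: about -1.894)
print(schmidt_moment(l=2, j=2.5, proton=False))
# 15N: odd proton hole in p1/2 -> Schmidt value about -0.264 (experiment: about -0.283)
print(schmidt_moment(l=1, j=0.5, proton=True))
```

The small gap between these Schmidt values and the measured moments is exactly the quantity that core polarization, meson-exchange currents and isobar currents compete to explain.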

We revisit here the naturalness problem of Lorentz invariance violations in a simple toy model of a scalar field coupled to a fermion field via a Yukawa interaction. We first review some well-known results concerning the low-energy percolation of Lorentz violation from high energies, presenting some details of the analysis not explicitly discussed in the literature and discussing some previously unnoticed subtleties. We then show how a separation between the scale of validity of the effective field theory and that of Lorentz invariance violations can hinder this low-energy percolation. While such a protection mechanism was previously considered in the literature, we provide here a simple illustration of how it works and of its general features. Finally, we consider a case in which dissipation is present, showing that the dissipative behaviour does not generically percolate to lower mass dimension operators, although dispersion does. Moreover, we show that a scale separation can protect from unsuppressed low-energy percolation in this case as well.

The aim of this entry is to describe and explain the main forensic uses of fingermarks and fingerprints. It defines the concepts and provides the nomenclature related to forensic dactyloscopy. It describes the structure of the papillary ridges, the organization of the information in three levels,

Forensics (forensic--before the Forum) means the application of knowledge from different scientific fields in order to establish facts in judicial and/or administrative procedures. Nowadays forensics also finds application in various economic processes; for example, it has entered the commercial areas of business intelligence and of different security domains. The European Commission recognized the importance of forensics and underscored the importance of developing its scientific infrastructure in member states. We are witnessing the rise of various tragedies in economic and other kinds of processes. Undoubtedly, the world is increasingly exposed to various forms of threats whose occurrences regularly involve people. In this paper we propose the development of a new approach to the forensic assessment of the state of human resources. We suggest that the focus should be on a forensic approach to the psychological assessment of the awareness of the individual and of the critical infrastructure sector operator (CISO), determining the level of actual practical, rather than merely formal, knowledge of an individual in a particular field of expertise or scientific discipline, and its possible forensic implications.

This article reports on a study carried out on the role constructs of forensic and nonforensic Learning Disability Nursing in relation to six binary themes. The aims were to identify whether there were differences in the perceptions of forensic learning disability nurses and nonforensic learning disability nurses in relation to the six binary themes of the…

Animal forensic DNA analysis is being used for human criminal investigations (e.g. traces from cats and dogs), wildlife management, breeding and food safety. The most common DNA markers used for such forensic casework are short tandem repeats (STR). Rules and guidelines concerning quality assurance

The DNA Commission of the International Society for Forensic Genetics (ISFG) is reviewing factors that need to be considered ahead of the adoption by the forensic community of short tandem repeat (STR) genotyping by massively parallel sequencing (MPS) technologies. MPS produces sequence data that...

Objective: The Accreditation Council for Graduate Medical Education (ACGME) requires that general psychiatry residency training programs provide trainees with exposure to forensic psychiatry. Limited information is available on how to develop a core curriculum in forensic psychiatry for general psychiatry residents and few articles have been…

In business today, one of the most important factors that enables any business to gain competitive advantage over others is the appropriate, effective adoption of information technology into the business, and then managing and governing it at will. To govern IT, organizations need to recognize the value of acquiring the services of forensic firms to counter cyber criminals. Digital forensic firms follow different mechanisms to perform investigations. Over time, forensic firms have been supplied with different models for investigation, containing phases for different purposes across the entire process. Along with forensic firms, enterprises also need to build a secure and supportive platform to make a successful investigation process possible. We have outlined different elements of organizations in Pakistan that need to be addressed to provide support to forensic firms.

The goal of the paper is to revisit and analyze key contributions to the understanding of leadership and management. As part of the discussion, a role perspective that allows for additional and/or integrated leader dimensions, including a change-centered one, will be outlined. Seemingly, a major...

We revisit the idea of ``inter-genre similarity'' (IGS) for machine learning in general, and music genre recognition in particular. We show analytically that the probability of error for IGS is higher than for naive Bayes classification with zero-one loss (NB). We show empirically that IGS does not perform well, even for data that satisfies all its assumptions.

Over the past twenty years, DNA analysis has revolutionized forensic science, and has become a dominant tool in law enforcement. Today, DNA evidence is key to the conviction or exoneration of suspects of various types of crime, from theft to rape and murder. However, the disturbing possibility that DNA evidence can be faked has been overlooked. It turns out that standard molecular biology techniques such as PCR, molecular cloning, and recently developed whole genome amplification (WGA), enable anyone with basic equipment and know-how to produce practically unlimited amounts of in vitro synthesized (artificial) DNA with any desired genetic profile. This artificial DNA can then be applied to surfaces of objects or incorporated into genuine human tissues and planted in crime scenes. Here we show that the current forensic procedure fails to distinguish between such samples of blood, saliva, and touched surfaces with artificial DNA, and corresponding samples with in vivo generated (natural) DNA. Furthermore, genotyping of both artificial and natural samples with Profiler Plus® yielded full profiles with no anomalies. In order to effectively deal with this problem, we developed an authentication assay, which distinguishes between natural and artificial DNA based on methylation analysis of a set of genomic loci: in natural DNA, some loci are methylated and others are unmethylated, while in artificial DNA all loci are unmethylated. The assay was tested on natural and artificial samples of blood, saliva, and touched surfaces, with complete success. Adopting an authentication assay for casework samples as part of the forensic procedure is necessary for maintaining the high credibility of DNA evidence in the judiciary system.

Radiochemistry has been used to study fission since its discovery. Radiochemical methods are used to determine cumulative mass yields. These measurements have led to the two-mode fission hypothesis to model the neutron energy dependence of fission product yields. Fission product yields can be used for the nuclear forensics of nuclear explosions. The mass yield curve depends on both the fuel and the neutron spectrum of a device. Recent studies have shown that the nuclear structure of the compound nucleus can affect the mass yield distribution.

This paper examines the problem of suicide among patients discharged from a Regional Secure Unit. The stereotype that emerges is a young man with anti-social personality traits, suffering from an affective psychosis, with a history of substance abuse and impulsive violence directed both towards himself and others, who is alienated from care staff and social supports because of his provocative and uncooperative behaviour. In contrast with the general population, forensic patients are more likely to commit suicide using a violent method and are more likely to have a suicide verdict recorded by the coroner. The implications of these findings for treatment and preventive interventions are discussed.

The aim of this article is to shed light on contemporary forensic psychiatric care through a philosophical examination of the empirical results from two lifeworld phenomenological studies from the perspective of patients and carers, by using the French philosopher Michel Foucault's historical-philosophical work. Both empirical studies were conducted in a forensic psychiatric setting. The essential results of the two empirical studies were reexamined in a phenomenological meaning analysis to form a new general structure in accordance with the methodological principles of Reflective Lifeworld Research. This general structure shows how the caring on the forensic psychiatric wards appears to be contradictory, in that it is characterized by an unreflective (non-)caring attitude and contributes to an inconsistent and insecure existence. The caring appears to have a corrective approach and thus lacks a clear caring structure, a basic caring approach that patients in forensic psychiatric services have a great need of. To gain a greater understanding of forensic psychiatric caring, the new empirical results were further examined in the light of Foucault's historical-philosophical work. The philosophical examination is presented in terms of the three meaning constituents: Caring as correction and discipline, The existence of power, and Structures and culture in care. The philosophical examination illustrates new meaning nuances of the corrective and disciplinary nature of forensic psychiatric care, its power, and how this is materialized in caring, and what this does to the patients. The examination reveals embedded difficulties in forensic psychiatric care and highlights a need to revisit the aim of such care.

Producing qualified forensic pathological practitioners is a common difficulty around the world. In China, forensic pathology is one of the required major subspecialties for undergraduates majoring in forensic medicine, in contrast to forensic education in Western countries where forensic pathology is often optional. The enduring predicament is that the professional qualities and abilities of forensic students from different institutions vary due to the lack of an efficient forensic pedagogical model. The purpose of this article is to describe the new pedagogical model of forensic pathology at Zhongshan School of Medicine, Sun Yat-sen University, which is characterised by: (a) imparting a broad view of forensic pathology and basic knowledge of duties and tasks in future careers to students; (b) educating students in primary skills on legal and medical issues, as well as advanced forensic pathological techniques; (c) providing students with resources to broaden their professional minds, and opportunities to improve their professional qualities and abilities; and (d) mentoring students on occupational preparation and further forensic education. In the past few years, this model has resulted in numerous notable forensic students accomplishing achievements in forensic practice and forensic scientific research. We therefore expect this pedagogical model to establish the foundation for forensic pathological education and other subspecialties of forensic medicine in China and abroad.

Clinical forensic medicine is a developing branch. In Indonesia and Malaysia, there is inadequate information regarding this practice, and the job scopes and practitioners involved in this field remain unclear. The study outlined in this article aims to explore current clinical forensic medicine practice compared with existing systematic practice globally, and to analyze differences in this practice between the two countries. A qualitative study was conducted with forensic experts in Indonesia and Malaysia from September to November 2015. In-depth interviews were carried out to obtain data, which were then validated against literature and legal documents in Indonesia and Malaysia using the triangulation validation method. Data are presented in narrative form. In Indonesia, forensic pathology and clinical forensic medicine are approached as one discipline, whereas in Malaysia they are treated separately. In Indonesia this practice is conducted by a general practitioner in collaboration with other specialists if needed; in Malaysia, it is conducted by forensic pathologists or, in their absence, by medical officers. Both Indonesia and Malaysia follow the continental regimen in practicing clinical forensic medicine. There is still a lack of involvement of doctors in this field due to limited understanding of clinical forensic medicine. Current clinical forensic medicine practice has not developed much and differs little between the two countries. The gap between current practice and systematic practice cannot be assessed due to the absence of a single standardized code of practice.

We present a novel approach to the separability problem for Gaussian quantum states of bosonic continuous variable systems. We derive a simplified necessary and sufficient separability criterion for arbitrary Gaussian states of m versus n modes, which relies on convex optimisation over marginal covariance matrices on one subsystem only. We further revisit the currently known results stating the equivalence between separability and positive partial transposition (PPT) for specific classes of Gaussian states. Using techniques based on matrix analysis, such as Schur complements and matrix means, we then provide a unified treatment and compact proofs of all these results. In particular, we recover the PPT-separability equivalence for: (i) Gaussian states of 1 versus n modes; and (ii) isotropic Gaussian states. In passing, we also retrieve (iii) the recently established equivalence between separability of a Gaussian state and its complete Gaussian extendability. Our techniques are then applied to progress beyond the state of the art. We prove that: (iv) Gaussian states that are invariant under partial transposition are necessarily separable; (v) the PPT criterion is necessary and sufficient for separability for Gaussian states of m versus n modes that are symmetric under the exchange of any two modes belonging to one of the parties; and (vi) Gaussian states which remain PPT under passive optical operations cannot be entangled by them either. This is not a foregone conclusion per se (since Gaussian bound entangled states do exist) and settles a question that had been left unanswered in the existing literature on the subject. This paper, enjoyable by both the quantum optics and the matrix analysis communities, overall delivers technical and conceptual advances which are likely to be useful for further applications in continuous variable quantum information theory, beyond the separability problem.
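As a concrete illustration of the PPT criterion for Gaussian states (a minimal two-mode sketch, not the paper's machinery; it uses the convention where the vacuum covariance matrix is the identity), one checks whether the symplectic eigenvalues of the partially transposed covariance matrix all remain at or above 1:

```python
import numpy as np

# Symplectic form for 2 modes, quadrature ordering (x1, p1, x2, p2).
Omega = np.array([[ 0, 1,  0, 0],
                  [-1, 0,  0, 0],
                  [ 0, 0,  0, 1],
                  [ 0, 0, -1, 0]])

def symplectic_eigenvalues(V):
    # The eigenvalues of i*Omega*V come in pairs +/-nu_k; take |.| and deduplicate.
    ev = np.abs(np.linalg.eigvals(1j * Omega @ V))
    return np.sort(ev)[::2]

def is_ppt(V):
    # Partial transposition on mode 2 flips the sign of p2.
    P = np.diag([1.0, 1.0, 1.0, -1.0])
    return bool(np.all(symplectic_eigenvalues(P @ V @ P) >= 1.0 - 1e-9))

# Two-mode squeezed vacuum with squeezing r: entangled for any r > 0.
r = 0.5
c, s = np.cosh(2 * r), np.sinh(2 * r)
V = np.array([[c, 0,  s,  0],
              [0, c,  0, -s],
              [s, 0,  c,  0],
              [0, -s, 0,  c]])
print(is_ppt(V))  # → False (NPT, hence entangled)
```

For 1-versus-n-mode states such as this one, result (i) above says PPT failure is not just sufficient but also necessary for entanglement.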

Much of what we know about the initiation of earthquakes comes from the temporal and spatial relationship of foreshocks to the initiation point of the mainshock. The 1999 Mw 7.6 Izmit, Turkey, earthquake was preceded by a 44-minute-long foreshock sequence. Bouchon et al. (Science, 2011) analyzed the foreshocks using a single seismic station, UCG, located to the north of the east-west fault, and concluded on the basis of waveform similarity that the foreshocks repeatedly re-ruptured the same fault patch, driven by slow slip at the base of the crust. We revisit the foreshock sequence using seismograms from 9 additional stations that recorded the four largest foreshocks (Mw 2.0 to 2.8) to better characterize the spatial and temporal evolution of the foreshock sequence and its relationship to the mainshock hypocenter. Cross-correlation timing and hypocentroid location with hypoDD reveal a systematic west-to-east propagation of the four largest foreshocks toward the mainshock hypocenter. Foreshock rupture dimensions estimated using spectral ratios imply no major overlap for the first three foreshocks. The centroid of the 4th and largest foreshock continues the eastward migration, but lies within the circular source area of the 3rd. The 3rd, however, has a low stress drop and strong directivity to the west. The mainshock hypocenter locates on the eastern edge of foreshock 4. We also re-analyzed waveform similarity of all 18 foreshocks recorded at UCG by removing the common-mode signal and clustering the residual seismograms using the correlation coefficient as the distance metric. The smaller foreshocks cluster with the larger events in time order, sometimes as foreshocks and more commonly as aftershocks. These observations show that the Izmit foreshock sequence is consistent with a stress-transfer-driven cascade, moving systematically to the east along the fault, and that there is no observational requirement for creep as a driving mechanism.
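The clustering step described here, using the correlation coefficient as a distance metric, can be sketched with SciPy's hierarchical clustering. Synthetic traces stand in for the real UCG residual seismograms, and zero-lag correlation replaces whatever alignment the authors actually used:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def correlation_distance_matrix(traces):
    """Pairwise distance 1 - corrcoef between already-aligned traces."""
    n = len(traces)
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            cc = np.corrcoef(traces[i], traces[j])[0, 1]
            D[i, j] = D[j, i] = 1.0 - cc
    return D

# Two synthetic "event families": similar waveforms within a family,
# dissimilar across families (different dominant frequencies plus noise).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
family_a = [np.sin(2 * np.pi * 5 * t) + 0.1 * rng.standard_normal(200) for _ in range(3)]
family_b = [np.sin(2 * np.pi * 9 * t) + 0.1 * rng.standard_normal(200) for _ in range(3)]
traces = family_a + family_b

D = correlation_distance_matrix(traces)
Z = linkage(squareform(D, checks=False), method="average")
labels = fcluster(Z, t=0.5, criterion="distance")
print(labels)  # traces in the same family receive the same cluster label
```

Cutting the dendrogram at a correlation-distance threshold (here 0.5) is the analogue of grouping foreshocks whose waveforms exceed a chosen similarity level.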

Electron microanalysis in forensic practice ranks among the basic applications used in the investigation of traces (latents, stains, etc.) from crime scenes. An electron microscope allows rapid screening and provides initial information for a wide range of traces. SEM with EDS/WDS makes it possible to observe surface topography and sample morphology and to examine chemical composition. The physical laboratory of the Institute of Criminalistics Prague uses SEM especially for the examination of inorganic samples, more rarely for biological and other materials. Recently, the possibilities of electron microscopy have been extended considerably by dual systems with a focused ion beam. These systems are applied mainly in the study of inner micro- and nanoparticles, thin layers (intersecting lines in graphical forensic examinations, analysis of layers of functional glass, etc.), the study of alloy microdefects, the creation of 3D models of particles and aggregates, etc. Automated mineralogical analyses are a great asset in the analysis of mineral phases, particularly soils; the same holds for cathodoluminescence, predominantly colour cathodoluminescence and precise quantitative measurement of its spectral characteristics. Among the latest innovations beginning to appear in ordinary laboratories are TOF-SIMS systems and micro-Raman spectroscopy with a lateral resolution comparable to that of EDS/WDS analysis.

Uncertainty propagation formulae are presented for age dating in support of nuclear forensics. The age of radioactive material in this context refers to the time elapsed since a particular radionuclide was chemically separated from its decay product(s). The decay of the parent radionuclide and ingrowth of the daughter nuclide are governed by statistical decay laws. Mathematical equations allow calculation of the age of specific nuclear material through the atom ratio between parent and daughter nuclides, or through the activity ratio provided that the daughter nuclide is also unstable. The derivation of the uncertainty formulae of the age may present some difficulty to the user community and so the exact solutions, some approximations, a graphical representation and their interpretation are presented in this work. Typical nuclides of interest are actinides in the context of non-proliferation commitments. The uncertainty analysis is applied to a set of important parent–daughter pairs and the need for more precise half-life data is examined. - Highlights: • Uncertainty propagation formulae for age dating with nuclear chronometers. • Applied to parent–daughter pairs used in nuclear forensics. • Investigated need for better half-life data
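A minimal sketch of such an age calculation with first-order uncertainty propagation, assuming the simplest chronometer geometry (a stable daughter, completely removed at separation) and using hypothetical numbers rather than any actinide pair from the paper:

```python
import numpy as np

def age_from_atom_ratio(R, lam_p, sigma_R=0.0, sigma_lam=0.0):
    """Age since separation from the daughter/parent atom ratio R = N_d/N_p,
    assuming a stable daughter fully removed at t = 0:
        R = exp(lam_p * t) - 1   =>   t = ln(1 + R) / lam_p
    The uncertainty follows from first-order propagation over R and lam_p.
    """
    t = np.log1p(R) / lam_p
    dt_dR = 1.0 / (lam_p * (1.0 + R))   # sensitivity to the measured ratio
    dt_dlam = -t / lam_p                # sensitivity to the decay constant
    sigma_t = np.hypot(dt_dR * sigma_R, dt_dlam * sigma_lam)
    return t, sigma_t

# Hypothetical parent with a 30-year half-life; measured ratio R = 0.100(1).
lam = np.log(2.0) / 30.0
t, sigma_t = age_from_atom_ratio(0.10, lam, sigma_R=0.001)
print(t, sigma_t)  # t ≈ 4.1 years
```

The `dt_dlam` term makes explicit why the paper flags half-life precision: any relative uncertainty in the decay constant propagates directly into the age.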

Describes various scientific techniques used to analyze physical evidence, ten areas of specialization in forensic science, courses needed by forensic scientists, and the future of forensic science. (DS)

This paper explores the issues facing digital forensics in South Africa. It examines particular cyber threats and cyber threat levels for South Africa and the challenges in addressing the cybercrimes in the country through digital forensics. The paper paints a picture of the cybercrime threats facing South Africa and argues for the need to develop a skill base in digital forensics in order to counter the threats through detection of cybercrime, by analyzing cybercrime reports, consideration of current legislation, and an analysis of computer forensics course provision in South African universities. The paper argues that there is a need to develop digital forensics skills in South Africa through university programs, in addition to associated training courses. The intention in this paper is to promote debate and discussion in order to identify the cyber threats to South Africa and to encourage the development of a framework to counter the threats – through legislation, high tech law enforcement structures and protocols, digital forensics education, digital forensics skills development, and a public and business awareness of cybercrime threats.

Over the last several decades, forensic science---the application of science to civil and criminal legal matters---has become increasingly popular with the public. The range of disciplines within the field is immense, offering individuals the potential for a unique career, regardless of their specific interests or expertise. In response to this growth, many organizations, both public and private, have recognized the need to create forensic science programs that strive to maintain and enhance the quality of forensic science education. Unfortunately, most of the emphasis placed on developing these materials relates to post-secondary education, leaving a significant lack of forensic science educational materials available in the U.S., especially in Oklahoma. The purpose of this project was to create a high school curriculum that provides the foundation for building a broad, yet comprehensive, overview of the field of forensic science and its associated disciplines. The overall goal was to create and provide course materials to high school teachers in order to increase their knowledge of forensic science such that they are able to teach its disciplines effectively and with accuracy. The Forensic Science Curriculum for High School Students includes sample lesson plans, PowerPoint presentations, and lab activities with step-by-step instructions.

The National Academy of Sciences (2009) published a review charting several key recommendations on strengthening the forensic sciences as an entity as part of an initiative put forth by the USA Congress to streamline and improve the quality of the forensic sciences and their impact on the judiciary process. Although the review was not totally inclusive, many of its sentiments have permeated into all the forensic sciences. The following paper is designed to determine who is practicing the science of forensic entomology, and in what capacity, by questioning practicing forensic entomologists about the type of education obtained, their countries' standards and accreditation processes, as well as general demographic information such as age and gender. A 28-question survey was sent out to 300 forensic entomologists worldwide in 2009. Of the 70 respondents, 80% had a formal education (either Masters or PhD), and 66% published their research. Approximately 50% of respondents were involved in the delivery of expert evidence and writing up case reports, and countries were actively involved with accrediting personnel, facilities, and entomology kits. Many discrepancies within the reported practices and accreditation processes highlight the need for the adoption of a standard code of practice among forensic entomologists. PMID:24219583

BGR seismologists often set up monitoring stations for testing purposes. The engineers from the Central Seismological Observatory have now developed a new type of mobile monitoring station which can be remotely controlled.

Nuclear forensics is an emerging and highly specialized discipline that deals with the investigation and analysis of nuclear or radiological/radioactive materials. Nuclear forensic analysis includes various methodologies and analytical methods, along with morphological, physical, chemical, elemental and isotopic analyses, to characterize materials and develop a nuclear database for the identification of unknown nuclear or radiological/radioactive material. The origin, source history, pathway and attribution of unknown radioactive/nuclear material can be established with certainty through nuclear forensics. Establishing a nuclear forensic laboratory and developing expertise for nuclear investigation under one roof, by building the nuclear database and a laboratory network, is the need of the hour to ably address the problems of all law enforcement and nuclear agencies. The present study provides insight into nuclear forensics and focuses on an urgent need for a comprehensive plan to set up nuclear forensic laboratories across India. (author)

The seismological aspects of various proposed means of obscuring or hiding the seismic signatures of explosions from a surveillance network are discussed. These so-called evasion schemes are discussed from the points of view of both the evader and the monitor. The analysis will be conducted in terms of the USSR solely because that country is so vast and the geological/geophysical complexities of the country are so great that the complete spectrum of hypothesized evasion schemes requires discussion. Techniques appropriate for use when the seismic noise problem is interference due to codas of P and surface waves from earthquakes are described, and the capabilities of several seismological networks to restrain use of such codas for effective evasion are analyzed

The Research and Development (R and D) activities during 1984-1985 of the Seismology Section of the Bhabha Atomic Research Centre, Bombay are reported in the form of individual summaries. The R and D activities of the Section are directed towards the development of seismological instruments and methods of analysis of seismic field data, with the main objectives of detecting underground nuclear explosions and assessing the seismicity and seismic risk of sites considered for nuclear power stations. The Section has two field stations: one at Gauribidanur in the southern part of the country and another at Delhi in the northern part of the country. During the report period, a total of 62 of the detected events were identified as underground explosions. The expertise of the Section is also made available to outside organisations. (M.G.B.)

This progress report for 1983 is the fourth yearly report summarizing the activities of the Division of Applied Seismology of the National Defence Research Institute (FOA) in Sweden. This division of the Institute is mainly involved in seismic discrimination and nuclear explosion monitoring. Special attention is paid in this report to the development of International Data Centers as a component of a global monitoring system. The division is also conducting a project on seismic risk estimation at nuclear power plants in Sweden. This project includes operating a network of local seismic stations in Sweden. Two seismic exploration projects are also currently being conducted: one involves the further development of seismic methods for oil exploration, and the other the investigation of crystalline rock using seismic cross-hole measurements. Finally, the Division of Applied Seismology is conducting a project in which seismic sensors for military applications are studied.

1. Forensic nurses frequently work in violent settings without regard for self-preservation to save the lives of injured individuals or investigate the deaths of deceased individuals. 2. Cases involving children and victims with disfiguring injuries, and incidents when their personal safety was compromised are most disturbing to forensic nurses. 3. Providing means for health care professionals to cope appropriately encourages healthy healing. 4. Forensic nurses must learn to self-assess and recognize the signs and symptoms associated with unhealthy coping, depression, or posttraumatic stress disorder.

Forensic entomology is the study of insects/arthropods in criminal investigation. Right from the early stages insects are attracted to the decomposing body and may lay eggs in it. By studying the insect population and the developing larval stages, forensic scientists can estimate the postmortem interval, any change in position of the corpse, as well as the cause of death. Forensic odontologists are called upon more frequently to collaborate in criminal investigations and hence should be aware of ...

Within the framework of international cooperation in the field of the CTBT, this paper describes the first seismological station established in Morocco in 1934, and the 15 further stations installed in the sixties and seventies after the earthquake in Agadir. In 1982, a seismic detection system was installed with the following main objectives: coordination and correlation of activities concerned with the evaluation of seismic risks in the Mediterranean region, and integration of the geophysical data needed for seismic risk assessment

Scientific understanding of earthquakes and their attendant hazards is vital for the development of effective earthquake risk reduction strategies. Within the global disaster reduction policy framework (the Hyogo Framework for Action, overseen by the UN International Strategy for Disaster Reduction), the anticipated role of science and scientists is clear, with respect to risk assessment, loss estimation, space-based observation, early warning and forecasting. The importance of information sharing and cooperation, cross-disciplinary networks and developing technical and institutional capacity for effective disaster management is also highlighted. In practice, the degree to which seismological information is successfully delivered to and applied by individuals, groups or organisations working to manage or reduce the risk from earthquakes is variable. The challenge for scientists is to provide fit-for-purpose information that can be integrated simply into decision-making and risk reduction activities at all levels of governance and at different geographic scales, often by a non-technical audience (i.e. people without any seismological/earthquake engineering training). The interface between seismological research and earthquake risk reduction (defined here in terms of both the relationship between the science and its application, and the scientist and other risk stakeholders) is complex. This complexity is a function of a range of issues relating to communication, multidisciplinary working, politics, organisational practices, inter-organisational collaboration, working practices, sectoral cultures, individual and organisational values, worldviews and expectations. These factors can present significant obstacles to scientific information being incorporated into the decision-making process. The purpose of this paper is to present some personal reflections on the nature of the interface between the worlds of seismological research and risk reduction, and the

Dynamic Topography Revisited Dynamic topography is usually considered to be one of the trinity of contributing causes to the Earth's non-hydrostatic topography, along with the long-term elastic strength of the lithosphere and isostatic responses to density anomalies within the lithosphere. Dynamic topography, thought of this way, is what is left over when other sources of support have been eliminated. An alternate and explicit definition of dynamic topography is that deflection of the surface which is attributable to creeping viscous flow. The problem with the first definition of dynamic topography is 1) that the lithosphere is almost certainly a visco-elastic / brittle layer with no absolute boundary between flowing and static regions, and 2) that the lithosphere is a thermal / compositional boundary layer in which some buoyancy is attributable to immutable, intrinsic density variations and some is due to thermal anomalies which are coupled to the flow. In each case, it is difficult to draw a sharp line between each contribution to the overall topography. The second definition of dynamic topography does seem cleaner / more precise, but it suffers from the problem that it is not measurable in practice. On the other hand, this approach has resulted in a rich literature concerning the analysis of large-scale geoid and topography and their relation to the buoyancy and mechanical properties of the Earth [e.g. refs 1,2,3]. In convection models with viscous, elastic, brittle rheology and compositional buoyancy, however, it is possible to examine how the surface topography (and geoid) are supported, and how different ways of interpreting the "observable" fields introduce different biases. This is what we will do. References (a.k.a. homework) [1] Hager, B. H., R. W. Clayton, M. A. Richards, R. P. Comer, and A. M. Dziewonski (1985), Lower mantle heterogeneity, dynamic topography and the geoid, Nature, 313(6003), 541-545, doi:10.1038/313541a0. [2] Parsons, B., and S. Daly (1983), The
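The second, flow-based definition above can be written down explicitly. As a sketch only (the symbols and sign convention here are ours, not the abstract's): the dynamically supported surface deflection is the radial normal stress exerted by the viscous flow at the surface, balanced against the density contrast across that surface,

```latex
h_{\mathrm{dyn}} \;=\; \frac{\sigma_{rr}}{\Delta\rho\, g}
```

where \(\sigma_{rr}\) is the flow-induced radial normal stress evaluated at the surface, \(\Delta\rho\) is the density contrast between rock and the overlying medium (air or water), and \(g\) is gravitational acceleration. This is essentially the quantity computed in the geoid/topography kernel analyses cited above (e.g. ref 1).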

Full text: The Exhibit Handling System, operated by the Anti-Terrorist Branch, has evolved from experiences whilst dealing with long term domestic terrorism and the subsequent prosecution of the offenders. Stringent U.K. criminal law in regard to exhibits and forensic evidence required a strict system in order to provide continuity and integrity to every item that came into possession of the Police. This system also applies to items that are eventually deemed 'unused', as nearly all evidence is disclosed to the defence. I believe that if a system can withstand the close examination that British Criminal Law provides, it will probably be suitable in most countries. The system relies on each item being supplied with a documented trail of all persons who have had possession of it and who have opened the security packaging for examination purposes. In contaminated environments the initial process within the system has to be adapted in order that strict monitoring of the items can be carried out during the packaging process. It is also recognized that access to many exhibits will be heavily restricted and therefore protocols are in place to interrogate the evidence at the packaging stage in order to avoid unnecessary spread of contamination. The protocols are similar for both radiological and nuclear incidents as well as chemical and biological. Regardless of the type of incident the system can be adapted on the advice of the relevant scientific authority. In the U.K. for radiological and nuclear incidents that authority would be the A.W.E. Aldermaston. The integrity and continuity regime should be continued within laboratories which are conducting examinations of exhibits recovered. It is also important that Nuclear Forensic Laboratories do not overlook possibilities of traditional evidence, such as DNA, Fingerprints and fibre traces. Good record photography of items which are unlikely to be released by the laboratory is essential. Finally, cross-contamination has in

Tourism is an experience-intensive sector in which customers seek and pay for experiences above everything else. Remembering past tourism experiences is also crucial for an understanding of the present, including the predicted behaviours of visitors to tourist destinations. We adopt a longitudinal approach to memory data collection from psychological science, which has the potential to contribute to our understanding of tourist behaviour. In this study, we examine the impact of remembered tourist experiences in a safari park. In particular, using matched survey data collected longitudinally and PLS path modelling, we examine the impact of positive affect tourist experiences on the development of revisit intentions. We find that longer-term remembered experiences have the strongest impact on revisit intentions, more so than predicted or immediate memory after an event. We also find that remembered...

The purpose of law is to protect society from harm by declaring what conduct is criminal and prescribing the punishment to be imposed for such conduct. The pervasiveness of the internet and its anonymous nature make cyberspace a lawless frontier where anarchy prevails. Historically, economic value has been assigned to visible and tangible assets. With the increasing appreciation that intangible data disseminated through an intangible medium can possess economic value, cybercrime is also being recognized as a threat to economic assets. Cybercrime, Digital Forensics and Jurisdiction disseminates knowledge for everyone involved with understanding and preventing cybercrime: business entities, private citizens, and government agencies. The book is firmly rooted in the law, demonstrating that a viable strategy to confront cybercrime must be international in scope.

or Illicit Trafficking of Radioactive Material (IAEA-TECDOC-1313). It was quickly recognized that much can be learned from the analysis of reported cases of illicit trafficking. For example, what specifically could the material have been used for? Where was the material obtained: in stock, scrap or waste? Was the amount seized only a sample of a much more significant quantity? These and many other questions can be answered through detailed technical characterization of seized material samples. The combination of scientific methods used for this purpose is normally referred to as 'nuclear forensics', which has become an indispensable tool for use in law enforcement investigations of nuclear trafficking. This publication is based on a document entitled Model Action Plan for Nuclear Forensics and Nuclear Attribution (UCRL-TR-202675). The document is unique in that it brings together, for the first time, a concise but comprehensive description of the various tools and procedures of nuclear forensic investigations that were earlier available only in different areas of the scientific literature. It also has the merit of incorporating experience accumulated over the past decade by law enforcement agencies and nuclear forensics laboratories confronted with cases of illicit events involving nuclear or other radioactive material

Cloud computing is a relatively new technology aimed at the efficient use of datacenter resources, which are offered to users on a pay-per-use model. In this equation we need to know exactly where and how a piece of information is stored or processed. In today's cloud deployments this is becoming more and more a necessity, because we need a way to monitor user activity and, in case of legal action, we must be able to present digital evidence in a form in which it is accepted. In this paper we present a modular and distributed architecture that can be used to implement a cloud digital forensics framework on top of new or existing datacenters.

Traditionally, law enforcement agencies have relied on basic measurement and imaging tools, such as tape measures and cameras, in recording a crime scene. A disadvantage of these methods is that they are slow and cumbersome. The development of a portable system that can rapidly record a crime scene with current camera imaging and 3D geometric surface maps, and contribute quantitative measurements such as accurate relative positioning of crime scene objects, would be an asset to law enforcement agents in collecting and recording significant forensic data. The purpose of this project is to develop a feasible prototype of a fast, accurate, 3D measurement and imaging system that would help law enforcement agents quickly document and accurately record a crime scene

Mutual fund manager excess performance should be measured relative to their self-reported benchmark rather than the return of a passive portfolio with the same risk characteristics. Ignoring the self-reported benchmark introduces biases in the measurement of stock selection and timing components of excess performance. We revisit baseline empirical evidence in mutual fund performance evaluation utilizing stock selection and timing measures that address these biases. We introduce a new factor e...

mental health professionals burdened with large numbers of patients.6. In most ... by the courts for forensic assessments and testimony in the region. .... A comparison of risk factors for habitual violence in pre- ... Homicide and suicide in Benin-.

In any mass disaster, identification of the person is most important. For this purpose, forensic investigators use different methods for identifying the dead, considering skeletal remains as the initial step in identification. Radiographs carry great evidentiary weight, acting as antemortem records and assisting in identifying the person's age, gender, race, etc. Forensic dentistry is also emerging as a new branch of forensics, so the forensic dentist must be aware of the different techniques, developments, and resources available in order to incorporate the technology and achieve success in human identification. The aim of the present review is therefore to focus on the different radiological techniques and new developments available for successful identification of the dead.

Forensic dentistry has become an integral part of forensic science over the past 100 years, utilizing dental and oro-facial findings to serve the judicial system. This has been due to the dedication to this field of people like Gustafson, Keiser-Nielsen, and Suzuki, who established the essential role which forensic dentistry plays, mainly in the identification of human remains. Teeth have been used as weapons and, under certain circumstances, may leave information about the identity of the biter. Dental professionals have a major role to play in keeping accurate dental records and providing all necessary information so that legal authorities may recognize malpractice, negligence, fraud or abuse, and the identity of unknown individuals. This paper summarizes the various roles of dental experts in forensic medicine. PMID:25298709

findings of the study show that forensic accounting is significant in the face of the increasing ... and the more sophisticated Information Communication Technology (ICT), .... (9) studied the impact ... financial reporting and internal control.

Full text: The University of Surrey has, for the past four years, collaborated with police institutions from across Europe and the rest of the world to scope potential applications of ion beam analysis (IBA) in forensic science. In doing this we have consulted practitioners across a range of forensic disciplines and critically compared IBA with conventional characterisation techniques to investigate the areas in which IBA can add evidential value. In this talk, the results of this feasibility study will be presented, showing the types of sample for which IBA shows considerable promise. We will show how a combination of PIXE with other IBA techniques (EBS, PIGE, MeV-SIMS) can be used to give unprecedented characterisation of forensic samples, and comment on the significance of these results for forensic casework. We will also show cases where IBA does not appear to add any significant improvement over conventional techniques. (author)

The prevalence and profile of adults with a history of traumatic brain injury (TBI) has not been studied in large North American forensic mental health populations. This study investigated how adults with a documented history of TBI differed from the non-TBI forensic population with respect to demographics, psychiatric diagnoses and history of offences. A retrospective chart review of all consecutive admissions to a forensic psychiatry programme in Toronto, Canada was conducted. Information on history of TBI, psychiatric diagnoses, living environments and types of criminal offences was obtained from medical records. A history of TBI was ascertained in 23% of 394 eligible patient records. Compared to those without a documented history of TBI, persons with this history were less likely to be diagnosed with schizophrenia but more likely to have an alcohol/substance abuse disorder. There were also differences observed with respect to offence profiles. This study provides evidence to support routine screening for a history of TBI in forensic psychiatry.

SeisCode is a community repository for software used in seismological and related fields. The repository is intended to increase discoverability of such software and to provide a long-term home for software projects. Other places exist where seismological software may be found, but none meet the requirements necessary for an always current, easy to search, well documented, and citable resource for projects. Organizations such as IRIS, ORFEUS, and the USGS have websites with lists of available or contributed seismological software. Since the authors themselves often do not maintain these lists, the documentation often consists of a sentence or paragraph, and the available software may be outdated. Repositories such as GoogleCode and SourceForge, which are directly maintained by the authors, provide version control and issue tracking but do not provide a unified way of locating geophysical software scattered in and among countless unrelated projects. Additionally, projects are hosted at language-specific sites such as Mathworks and PyPI, in FTP directories, and in websites strewn across the Web. Search engines are only partially effective discovery tools, as the desired software is often hidden deep within the results. SeisCode provides software authors a place to present their software, codes, scripts, tutorials, and examples to the seismological community. Authors can choose their own level of involvement. At one end of the spectrum, the author might simply create a web page that points to an existing site. At the other extreme, an author may choose to leverage the many tools provided by SeisCode, such as a source code management tool with integrated issue tracking, forums, news feeds, downloads, wikis, and more. For software development projects with multiple authors, SeisCode can also be used as a central site for collaboration. SeisCode provides the community with an easy way to discover software, while providing authors a way to build a community around their

Multimedia Security: Watermarking, Steganography, and Forensics outlines essential principles, technical information, and expert insights on multimedia security technology used to prove that content is authentic and has not been altered. Illustrating the need for improved content security as the Internet and digital multimedia applications rapidly evolve, this book presents a wealth of everyday protection application examples in fields including multimedia mining and classification, digital watermarking, steganography, and digital forensics. Giving readers an in-depth overview of different asp

Over the past several years, the Livermore Forensic Science Center has conducted analyses of nuclear-related samples in conjunction with domestic and international criminal investigations. Law enforcement officials have sought conventional and nuclear-forensic analyses of questioned specimens that have typically consisted of miscellaneous metal species or actinide salts. The investigated activities have included nuclear smuggling and the proliferation of alleged fissionable materials, nonradioactive hoaxes such as 'Red Mercury', and the interdiction of illegal laboratories engaged in methamphetamine synthesis. (author)

PBX Security and Forensics begins with an introduction to PBXs (Private Branch Exchanges) and the scene, statistics and involved actors. This book discusses confidentiality, integrity and availability threats in PBXs. The author examines the threats and the technical background as well as security and Forensics involving PBXs. The purpose of this book is to raise user awareness in regards to security and privacy threats present in PBXs, helping both users and administrators safeguard their systems.

All living beings inevitably die, and the ways of dying vary, although in essence death is a manifestation of the absence of oxygen in the brain. After death, biological remains undergo proteolysis and decomposition. The aim of this article is to discuss clinical death, cerebral or medicolegal death, social death, the phases of cerebral death, and the biological processes after death, which are important for forensic medicine and forensic anthropology. How long a person has been dead, if the time elaps...

This article reviews modern analytical instrumentation in forensic toxicology for identification and quantification of drugs and toxins in biological fluids and tissues. A brief description of the theory and inherent strengths and limitations of each methodology is included. The focus is on new technologies that address current analytical limitations. A goal of this review is to encourage innovations to improve our technological capabilities and to encourage use of these analytical techniques in forensic toxicology practice. PMID:17579968

Seismology is a geoscience often perceived by uninstructed broad audiences as unreliable or inconsistent, since it cannot predict future earthquakes or warn about them effectively; this criticism disregards important achievements that seismology has offered during its more than 100 years of history, such as evidence of Earth's inner structure, knowledge regarding plate tectonics, mineral resource identification, contributions to risk mitigation, monitoring of explosions, etc. Moreover, seismology is a field of study with significant advances, which make (or could make) living much safer in areas with high seismic hazard. We say "could make" since people often fail to understand an important aspect: seismology offers consistent knowledge regarding how to prepare, construct or behave, but it is up to people and authorities to implement the effective measures. In all this story, effective communication between scientists and the general public plays a major role, making the leap from misconception to relevant impact. As scientists, we wanted to show the true meaning and purpose of seismology to all categories of people. We are in the final stage of implementing the MOBEE (MOBile Earthquake Exhibition) Project, an innovative initiative in a highly seismic country (Romania), where the major Vrancea intermediate-depth earthquake source has the potential to generate a significant amount of damage over large areas; however, unlike countries like Japan, the medium to long period between felt or significant events (20-40 years) is long enough to make the newer generation in Romania dismissive of the hazard, and older generations skeptical about the role of seismology. MOBEE intended to freshen up things, raise awareness and change the overall perception through new approaches involving a blend of digital content (interactive apps, responsive and continuously updated website), 3D models achieved through new technologies (3D printing, fiber optics), non

Serological and biochemical identification methods used in forensics have several major disadvantages, such as long processing times for biological samples and a lack of sensitivity and specificity. In the last 30 years, DNA molecular analysis has become an important tool in forensic investigations. DNA profiling is based on short tandem repeats (STRs) and aids human identification from biological samples. Forensic genetics can provide information on the events which occurred at the crime scene or supplement other methods of forensic identification. Currently, the methods used in identification are based on polymerase chain reaction (PCR) analyses of autosomal STRs, the Y-chromosome, and mitochondrial DNA. Correlation of biological samples present at the crime scene with identification, selection, and the probative value factor is therefore the first aspect to be taken into consideration in forensic genetic analysis. In the last decade, because of advances in the field of molecular biology, new biomarkers such as microRNAs (miRNA), messenger RNA (mRNA), and DNA methylation have been studied and proposed for use in the forensic identification of body fluids.
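The STR profiling mentioned above rests on counting tandem copies of a short repeat motif at a given locus. As an illustrative sketch only (the `AGAT` motif and the function name are ours; real genotyping infers repeat counts from capillary-electrophoresis fragment sizes, not from raw sequence scans like this):

```python
import re

def count_str_repeats(sequence: str, motif: str) -> int:
    """Return the number of repeat units in the longest
    uninterrupted tandem run of `motif` in `sequence`."""
    # Find all maximal runs of the motif, e.g. (AGAT)+
    runs = re.findall(f"(?:{motif})+", sequence)
    if not runs:
        return 0
    return max(len(run) // len(motif) for run in runs)

# Toy example: a sequence with a 4-repeat and a 2-repeat run of AGAT
seq = "GGCT" + "AGAT" * 4 + "TTCA" + "AGAT" * 2
print(count_str_repeats(seq, "AGAT"))  # longest run has 4 repeats
```

In an actual STR profile, repeat counts like this at a panel of loci (plus the two alleles per locus) form the numeric identifier compared between samples.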

Producing molecular imprinting-based materials has received increasing attention due to their recognition selectivity, stability, cost effectiveness, and ease of production in various forms for a wide range of applications. The molecular imprinting technique has a variety of applications in the food industry, environmental monitoring, and medicine, for diverse purposes like sample pretreatment, sensing, and separation/purification. Their versatility, stability and recognition capabilities also make them perfect candidates for use in forensic sciences. Forensic science is a demanding area and there is a growing interest in molecularly imprinted polymers (MIPs) in this field. In this review, recent molecular imprinting applications in the related areas of forensic sciences are discussed, considering the literature of the last two decades. Not only direct forensic applications but also studies of possible forensic value were taken into account, such as illicit drugs, banned sports drugs, potent toxins and chemical warfare agents, in a review of over 100 articles. The literature was classified according to targets, material shapes, production strategies, detection methods, and instrumentation. We aimed to summarize the current applications of MIPs in forensic science and put forth a projection of their potential uses as promising alternatives to benchmark competitors.

This research project was performed to assist the Faculty of Forensic and Legal Medicine (FFLM) with the development of a training programme for Principal Forensic Physicians (PFPs) to fulfil their role as educational supervisors. (Since this research was performed, the Metropolitan Police Service has dispensed with the services of the Principal Forensic Physicians, so currently (as of January 2009) there is no supervision of newly appointed FMEs, no development training of doctors working in London, and no audit or appraisal reviews.) PFPs working in London were surveyed by questionnaire to identify the extent of their knowledge with regard to their role in the development training of all forensic physicians (FPs) in their group, the induction of assistant FPs, and their perceptions of their own training needs with regard to their educational role. A focus group was held at the FFLM annual conference to discuss areas of interest that arose from the preliminary results of the questionnaire. There is a clear need for the FFLM to set up a training programme for educational supervisors in clinical forensic medicine, especially with regard to appraisal. 2009 Elsevier Ltd and Faculty of Forensic and Legal Medicine.

The place and function of forensic sciences personnel in American criminal law and court procedure, and the criteria used by criminal trial judges and lawyers to assess the value of forensic sciences personnel were investigated. Federal, state, Virgin Island, and Puerto Rican laws were examined, and a search of the medical and legal literature…

In this paper the insights and results are presented of a long term and ongoing improvement effort within the Netherlands Forensic Institute (NFI) to establish a valuable innovation programme. From the overall perspective of the role and use of forensic science in the criminal justice system, the

Digital forensics is seen not only as a counterproposal but as a solution to the rapid increase of cyber crime in WLANs. The key issue impacting WLAN digital forensics is that it is an enormous challenge to intercept and preserve all the communications...

Data from available literature point to an early beginning of Forensic Dentistry in Croatia relating to a post-mortem examination of a female patient after a dental procedure in the 1930s. Later on, there were several mass casualties due to collisions and airplane crashes and a railway accident at the Zagreb Main Railway Station wherein the identity of the victims was established based on dental features. Foreign experts in forensics helped identify those victims, particularly forensic dentists because this specialty was almost unknown in our region at the time. During the twenty-year period of the development of Forensic Dentistry at the University of Zagreb, the School of Dental Medicine, the city of Zagreb and Croatia have become internationally recognised on the forensic map of the world.

Sammons, John (2012). The Basics of Digital Forensics: The Primer for Getting Started in Digital Forensics. Waltham, MA: Syngress, 208 pages. Print book ISBN: 9781597496612. eBook ISBN: 9781597496629. Print: US $29.95. eBook: US $20.97. Includes exercises, case studies, references, and index. Reviewed by Stephen Larson, PhD, Assistant Professor, Slippery Rock University of PA. The Basics of Digital Forensics: The Primer for Getting Started in Digital Forensics is well named: it really is very basic. And it should be, as the book's intended audience includes entry-level digital forensics professionals and complementary fields such as law enforcement, legal, and general information security. Though the copyright is 2012, some of the data is from 2009, and there is mention of estimates for 2010. (See PDF for full review.)

To better understand the skills and competencies for forensic and non-forensic nursing of psychopathic and personality disordered patients. In the UK, there has been growing interest in service provision for this client group, but with little research to support the nursing skills required. A non-experimental design, using a postal survey of 990 forensic and 500 non-forensic nurses. An information-gathering schedule was used to generate data about the most desirable skills and competencies, and the least desirable weaknesses and nursing attributes, for nursing this group. For the forensic nurses, the main strengths and skills were being firm, setting limits and defining boundaries; the main weaknesses were inability to engage, inability to resolve conflict, and impatience; the main skills and competencies were being non-threatening, non-judgemental and able to expect anything; and the least desirable qualities were over-reacting, being judgemental and being over-confrontational. For the non-forensic nurses, the main strengths and skills were being non-judgemental, listening skills and good risk assessment; the main weaknesses were frustration with the system, a fear of aggression and lacking the skills to engage; the main skills and competencies were being open-minded, non-judgemental and forming relationships; and the least desirable qualities were a supercilious attitude, cynicism and being judgemental. The results highlight the importance of forming therapeutic relationships as the bedrock of both forensic and non-forensic nursing, and they also highlight the important differences with regard to the significance of therapeutic action and therapeutic verbal interaction. The provision of better care for this client group will rely on appropriate training for nurses. This research highlights the need for training that supports the development of engagement skills, communication skills and the ability to use reflection in action as a means of providing therapeutic care. It also highlights the different emphasis on the use of these skills

Forensic anthropology (in Lithuania, as elsewhere in Eastern Europe, traditionally considered a narrower field, forensic osteology) has a long history, with experience gained during exhumations of mass killings from the Second World War and the subsequent totalitarian regime, investigations of historical mass graves, identification of historical personalities, and routine forensic work. Experts in this field (usually a branch of forensic medicine) routinely solve "technical" questions of crime investigation, particularly the identification of (usually dead) individuals. Practical implementation of the mission of forensic anthropology is not an easy task due to the interdisciplinary character of the field. On the one hand, physical anthropology has at its disposal numerous scientifically tested methods; however, their practical value in particular legal processes is limited. Reasons for these discrepancies can be related both to insufficient understanding of the possibilities and limitations of forensic anthropology and archaeology by officials of the legal institutions that perform investigations, and to sometimes overly "academic" research conducted at anthropological laboratories, whose methods are not completely relevant to practical needs. Besides answering direct questions (number of individuals, sex, age, stature, population affinity, individual traits, evidence of violence), important humanitarian aspects, such as the individual's right to identity and the right of relatives to know the fate of their loved ones, should not be neglected. Practical use of other identification methods faces difficulties of its own (e.g., odontology: lack of a regular dental registration system and a compatible database). Two examples of forensic anthropological work on mass graves, even when the results were much influenced by the questions raised by investigators, serve as an illustration of the above-mentioned issues.

In the last few years, impressive achievements have been made in improving inferences about earthquake sources using InSAR (Interferometric Synthetic Aperture Radar) data. Several factors aided these developments. The open data basis of earthquake observations has expanded vastly with the two powerful Sentinel-1 SAR sensors up in space. Increasing computer power allows processing of large data sets for more detailed source models. Moreover, data inversion approaches for earthquake source inferences are becoming more advanced. By now, data error propagation is widely implemented, and the estimation of model uncertainties is a regular feature of reported optimum earthquake source models. InSAR-derived surface displacements and seismological waveforms are also combined more regularly, which requires finite rupture models instead of point-source approximations and layered medium models instead of homogeneous half-spaces. In other words, the disciplinary differences between geodetic and seismological earthquake source modelling shrink towards common source-medium descriptions and a source near-field/far-field data point of view. We explore and facilitate the combination of InSAR-derived near-field static surface displacement maps and dynamic far-field seismological waveform data for global earthquake source inferences. We join the community efforts with the particular goal of improving crustal earthquake source inferences in generally poorly instrumented areas, where often only the global backbone observations of earthquakes are available, provided by seismological broadband sensor networks and, since recently, by Sentinel-1 SAR acquisitions. We present our work on modelling standards for the combination of static and dynamic surface displacements in the source's near-field and far-field, e.g. on data and prediction error estimation as well as model uncertainty estimation. Rectangular dislocations and moment-tensor point sources are exchanged for simple planar finite

Using the full-length records of seismic events and background ambient noise, seismology today is going beyond still-life snapshots of the interior of the Earth to look into time-dependent changes of its properties. Data availability has grown dramatically with the expansion of seismographic networks and data centers, enabling much more detailed and accurate analyses. COST Action ES1401 TIDES (TIme DEpendent Seismology; http://tides-cost.eu) aims at structuring the EU seismological community to enable the development of data-intensive, time-dependent techniques for monitoring Earth's active processes (e.g., earthquakes, volcanic eruptions, landslides, glacial earthquakes) as well as oil/gas reservoirs. The main structure of TIDES is organised around working groups on: workflow integration of data and computing resources; seismic interferometry and ambient noise; forward problems and high-performance computing applications; seismic tomography, full waveform inversion and uncertainties; and applications in the natural environment and industry. TIDES is an open network of European laboratories with complementary skills, and is organising a series of events (workshops and advanced training schools) as well as supporting short-duration scientific stays. The first advanced training school, held in Bertinoro (Italy) in June 2015 with about 100 participants from 20 European countries, was devoted to managing and modelling seismic data with modern tools. The next school, devoted to ambient noise, will be held in 2016 in Portugal; the programme will be announced at the time of this conference. TIDES will strengthen Europe's role in a field critical for natural hazards and natural resource management.

Historical earthquakes are only known to us through written recollections, so seismologists have long experience of interpreting the reports of eyewitnesses, which probably explains why seismology has been a pioneer in crowdsourcing and citizen science. Today, the Internet is transforming this situation; it can be considered the digital nervous system, comprising digital veins and intertwined sensors that capture the pulse of our planet in near real time. How can both seismology and the public benefit from this new monitoring system? This paper presents the strategy implemented at the Euro-Mediterranean Seismological Centre (EMSC) to leverage this new nervous system to detect and diagnose the impact of earthquakes within minutes rather than hours, and how it has transformed information systems and interactions with the public. We show how social network monitoring and flashcrowds (massive traffic increases on the EMSC website) are used to automatically detect felt earthquakes before seismic detections, how damaged areas can be mapped through the concomitant loss of Internet sessions (visitors being disconnected), and the benefit of collecting felt reports and geolocated pictures to further constrain rapid impact assessment of global earthquakes. We also describe how public expectations within tens of seconds of ground shaking are the basis of improved, diversified information tools which integrate this user-generated content. Special attention is given to LastQuake, the most complex and sophisticated Twitter QuakeBot, smartphone application and browser add-on, which deals with the only earthquakes that matter to the public: felt and damaging earthquakes. In conclusion, we demonstrate that eyewitnesses are today real-time earthquake sensors and active actors in rapid earthquake information.
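The flashcrowd idea above (a surge of website visits from people who just felt shaking, arriving before any seismic detection) reduces to spike detection on a traffic time series. A minimal sketch of that idea; the function name, window and thresholds are illustrative assumptions, not EMSC's actual implementation:

```python
from collections import deque

def detect_flashcrowd(hit_counts, window=60, factor=5.0, min_hits=100):
    """Return indices where per-second website hits jump well above the
    recent baseline -- a crude stand-in for felt-earthquake detection
    from traffic surges (illustrative thresholds only)."""
    baseline = deque(maxlen=window)  # rolling window of recent traffic
    alerts = []
    for i, hits in enumerate(hit_counts):
        if len(baseline) == window:
            avg = sum(baseline) / window
            # flag only large absolute AND relative surges
            if hits >= min_hits and hits > factor * max(avg, 1.0):
                alerts.append(i)
        baseline.append(hits)
    return alerts

# Quiet background of ~10 hits/s, then a sudden surge
counts = [10] * 120 + [500, 600]
print(detect_flashcrowd(counts))  # -> [120, 121]
```

A production system would also have to separate earthquake-driven surges from other traffic spikes (news events, bots), which is where the felt reports and geolocation mentioned above come in.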

As a founding member of the CoopEUS initiative, IRIS Data Services has partnered with five data centers in Europe and UC Berkeley (NCEDC) in the US to implement internationally standardized web services that access seismological data using identical methodologies. The International Federation of Digital Seismograph Networks (FDSN) holds commission status within IASPEI/IUGG and as such is the international body that governs data exchange formats and access protocols within seismology. The CoopEUS project involves IRIS and UNAVCO as part of the EarthScope project, and the European collaborators are all members of the European Plate Observing System (EPOS). CoopEUS includes one work package that coordinates data access between EarthScope and EPOS facilities. IRIS has worked with its partners in the FDSN to develop and adopt three key international service standards within seismology: 1) fdsn-dataselect, a service that returns time series data in a variety of standard formats; 2) fdsn-station, a service that returns related metadata about a seismic station in StationXML format; and 3) fdsn-event, a service that returns information about earthquakes and other seismic events in QuakeML format. Currently the five European data centers supporting these services are the ORFEUS Data Centre in the Netherlands, the GFZ German Research Centre for Geosciences in Potsdam, Germany, ETH Zurich in Switzerland, INGV in Rome, Italy, and the RESIF Data Centre in Grenoble, France. All seven centers can now be accessed using standardized web services with identical service calls, returning results in standardized ways. IRIS is developing an IRIS Federator that will allow a client to seamlessly access information across the federated centers. Details and current status of the IRIS Federator will be presented.
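The three FDSN service standards share a common URL layout (`<host>/fdsnws/<service>/1/query?<params>`), which is what makes identical service calls across federated centers possible. A small sketch of building such calls; the host strings are examples and the helper function is hypothetical, not part of any IRIS or FDSN software:

```python
from urllib.parse import urlencode

# Example FDSN-service hosts (illustrative; each federated center
# exposes the same standardized paths under its own hostname).
CENTERS = {
    "IRIS": "http://service.iris.edu",
    "ORFEUS": "http://www.orfeus-eu.org",
}

def fdsn_url(center, service, **params):
    """Build a query URL for one of the standardized FDSN web services:
    'dataselect' (time series), 'station' (StationXML metadata),
    or 'event' (QuakeML event information)."""
    if service not in ("dataselect", "station", "event"):
        raise ValueError("unknown FDSN service: %s" % service)
    query = urlencode(sorted(params.items()))  # stable parameter order
    return "%s/fdsnws/%s/1/query?%s" % (CENTERS[center], service, query)

print(fdsn_url("IRIS", "station", network="IU", station="ANMO"))
```

Because the path and parameter names are fixed by the FDSN standard, the same call with a different `center` key queries a different data center, which is the premise of the federator described above.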

QuakeML is an XML-based data exchange format for seismology that is under development. Current collaborators are from ETH, GFZ, USC, USGS, IRIS DMC, EMSC, ORFEUS, and ISTI. QuakeML development was motivated by the lack of a widely accepted and well-documented data format applicable to a broad range of fields in seismology. The development team brings together expertise from communities dealing with the analysis and creation of earthquake catalogs, distribution of seismic bulletins, and real-time processing of seismic data. Efforts to merge QuakeML with existing XML dialects are under way. The first release of QuakeML will cover a basic description of seismic events including picks, arrivals, amplitudes, magnitudes, origins, focal mechanisms, and moment tensors. Further extensions are in progress or planned, e.g., for macroseismic information, location probability density functions, slip distributions, and ground motion information. The QuakeML language definition is supplemented by a concept to provide resource metadata and facilitate metadata exchange between distributed data providers. For that purpose, we introduce unique, location-independent identifiers of seismological resources. As an application of QuakeML, ETH Zurich is currently developing a Python-based seismicity analysis toolkit as a contribution to CSEP (Collaboratory for the Study of Earthquake Predictability). We follow a collaborative and transparent development approach along the lines of the procedures of the World Wide Web Consortium (W3C). QuakeML is currently in working draft status. The standard description will be subjected to a public Request for Comments (RFC) process and eventually reach the status of a recommendation. QuakeML can be found at http://www.quakeml.org.

QuakeML is an XML-based data exchange standard for seismology that is in its fourth year of active community-driven development. Its development was motivated by the need to consolidate existing data formats for applications in statistical seismology, as well as to set a cutting-edge, community-agreed standard to foster interoperability of distributed infrastructures. The current release (version 1.2) is based on a public Request for Comments process and accounts for suggestions and comments provided by a broad international user community. QuakeML is designed as an umbrella schema under which several sub-packages are collected. The present scope of QuakeML 1.2 covers a basic description of seismic events including picks, arrivals, amplitudes, magnitudes, origins, focal mechanisms, and moment tensors. Work on additional packages (macroseismic information, ground motion, seismic inventory, and resource metadata) has been started, but is at an early stage. Several applications based on the QuakeML data model have been created so far. Among these are earthquake catalog web services at the European-Mediterranean Seismological Centre (EMSC), GNS Science, and the Southern California Earthquake Data Center (SCEDC), and QuakePy, an open-source Python-based seismicity analysis toolkit. Furthermore, QuakeML is being used in the SeisComP3 system from GFZ Potsdam and in the Collaboratory for the Study of Earthquake Predictability (CSEP) testing center installations developed by the Southern California Earthquake Center (SCEC). QuakeML is still under active and dynamic development. Further contributions from the community are crucial to its success and are highly welcome.
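The event hierarchy covered by the schema (an event with nested origin, magnitude, and so on, with numeric quantities wrapped in value elements) can be illustrated with a heavily simplified, QuakeML-flavoured snippet. Real QuakeML 1.2 adds XML namespaces, resource identifiers and uncertainty fields, so this is a sketch of the shape only, not a schema-valid document:

```python
import xml.etree.ElementTree as ET

# Stripped-down, QuakeML-flavoured event description (illustrative).
SNIPPET = """<event publicID="smi:local/event/1">
  <origin>
    <latitude><value>35.2</value></latitude>
    <longitude><value>139.1</value></longitude>
    <depth><value>35000</value></depth>
  </origin>
  <magnitude>
    <mag><value>6.1</value></mag>
    <type>Mw</type>
  </magnitude>
</event>"""

def parse_event(xml_text):
    """Pull the basic event quantities out of the nested structure."""
    root = ET.fromstring(xml_text)
    return {
        "id": root.get("publicID"),
        "lat": float(root.findtext("origin/latitude/value")),
        "lon": float(root.findtext("origin/longitude/value")),
        "depth_m": float(root.findtext("origin/depth/value")),
        "mag": float(root.findtext("magnitude/mag/value")),
        "mag_type": root.findtext("magnitude/type"),
    }

print(parse_event(SNIPPET))
```

The value-wrapper nesting looks verbose but is what lets the full standard attach uncertainties and evaluation metadata to every quantity without changing the element layout.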

The positive impact of IRIS, through its programs (GSN, PASSCAL, DMS, EO) and its workshops, on seismological research and community building can hardly be overestimated. The Data Management System has been very successful in bringing data to users for research and education anywhere in the world; it enables routine, and in many cases real-time, analysis of massive amounts of waveform data for a spectacularly diverse range of studies. (I will give examples of surface wave tomography and inverse scattering studies of the core-mantle boundary.) The support that PASSCAL provides for the planning and execution of field campaigns allows seismologists to shift attention from operational issues to exciting science, and the required data dissemination through the DMS not only results in tremendously valuable data sets but also contributes to community building through (international) collaboration. Europe, Australia, and Asia also have rich histories of network and portable array seismometry, and in many areas the cumulative station density exceeds that of North America (even, perhaps, with USArray). Moreover, in some cases, such as the use of temporary, roving arrays of broadband seismometers, activities overseas may have preceded and inspired developments in the US. However, the absence of effective central systems for the management and dissemination of quality-controlled data has left many unique historical and regional data sets underutilized. This situation is changing, however. As an example I will mention the NERIES initiative to build a better infrastructure for seismological research and education in Europe. Apart from providing an example, through international collaboration IRIS can continue to play an important role in the improvement of the global seismological infrastructure.

Since seismicity in Japan is fairly high, Japanese interest in historical seismicity can be traced back to the ninth century, only a few centuries after the formation of the ancient ruling state. A thousand years later, in 1878, two years before the modern seismological society was founded, research on historical seismology started in Japan. Thanks to the accumulation of the past 140 years, present-day Japanese seismologists can read many historical materials without reading cursive scripts: we have convenient access to historical information related to earthquakes, transcribed into modern characters across 27,759 pages. We now have 214 epicenters of historical earthquakes from 599 AD to 1872. Among them, 134 events in the early modern period were assigned hypocentral depths and proper magnitudes, and intensity data at 8700 places were estimated for those events. These precise intensity data enabled us to compare the detailed source areas of pairs of repeated historical earthquakes, such as the 1703 Genroku earthquake with the 1923 Kanto earthquake, and the 1707 Hoei earthquake with the combination of the 1854 Ansei Tokai and Ansei Nankai earthquakes. It is revealed that the focal area of the former, larger event does not completely include those of the latter, smaller earthquakes, although these were believed to be typical sets of characteristic interplate earthquakes at the Sagami trough and at the Nankai trough. Research on historical earthquakes is very important for assessing future seismic hazard. One-fifth of the events of the early modern period remain to be analyzed in detail, and a compilation of places that experienced high intensities in modern events is also necessary. For the ancient and medieval periods, many equivocal events are still left. Further advances in interdisciplinary research on historical seismology are necessary.

The Javakheti Highland, located in the border region between Armenia and Georgia (and sharing a border with Turkey), is an area of young Holocene-Quaternary volcanism in the Southern Caucasus and a region where a number of active faults converge. Issues related to the geometry, kinematics and slip-rate of these faults and the assessment of their seismic hazard remain unclear, in part because studies have been carried out solely within the borders of each country rather than region-wide. In the frame of the ISTC A-1418 Project "Open network of scientific Centers for mitigation risk of natural hazards in the Southern Caucasus and Central Asia", the Javakheti Highland was selected as a trans-border test zone. This designation allowed for the expansion and upgrading of the seismological and geodynamic monitoring networks under the auspices of several international projects (ISTC CSP-053 Project "Development of Communication System for seismic hazard situations in the Southern Caucasus and Central Asia"; NATO SfP-983284 Project "Caucasus Seismic Emergency Response") as well as through joint research programs with the National Taiwan University and Institute of Earth Sciences (IES, Taiwan), Université Montpellier II (France) and Ecole et Observatoire des Sciences de la Terre, Université de Strasbourg (France). Studies of the geodynamic processes and seismicity of the region, and of their interaction, have been carried out using the newly established seismological and geodynamic monitoring networks, and have served as a basis for the study of the geologic and tectonic structure. Upgrading and expansion of the seismological and geodynamic networks required urgent solutions to the following tasks: introduction of efficient online systems for information acquisition, accumulation and transmission (including satellite systems) from permanent and temporarily installed stations; adoption of international standards for the organization and management of databases in GIS

We present an analytical, seismologically consistent expression for the surface area of the region within which most landslides triggered by an earthquake are located (landslide distribution area). This expression is based on scaling laws relating seismic moment, source depth, and focal mechanism with ground shaking and fault rupture length and assumes a globally constant threshold of acceleration for onset of systematic mass wasting. The seismological assumptions are identical to those recently used to propose a seismologically consistent expression for the total volume and area of landslides triggered by an earthquake. To test the accuracy of the model we gathered geophysical information and estimates of the landslide distribution area for 83 earthquakes. To reduce uncertainties and inconsistencies in the estimation of the landslide distribution area, we propose an objective definition based on the shortest distance from the seismic wave emission line containing 95 % of the total landslide area. Without any empirical calibration the model explains 56 % of the variance in our dataset, and predicts 35 to 49 out of 83 cases within a factor of 2, depending on how we account for uncertainties on the seismic source depth. For most cases with comprehensive landslide inventories we show that our prediction compares well with the smallest region around the fault containing 95 % of the total landslide area. Aspects ignored by the model that could explain the residuals include local variations of the threshold of acceleration and processes modulating the surface ground shaking, such as the distribution of seismic energy release on the fault plane, the dynamic stress drop, and rupture directivity. Nevertheless, its simplicity and first-order accuracy suggest that the model can yield plausible and useful estimates of the landslide distribution area in near-real time, with earthquake parameters issued by standard detection routines.

Discussions of nuclear forensics are often restricted to work performed by radio-chemists measuring nuclear material attributes in the laboratory. However, this represents only one portion of the work required to answer critical questions. Laboratory analysis results in measurements that need to be evaluated. The results of those evaluations must be put into their proper context in order for them to be useful to others and often require merging those results with additional information. This may contribute to attribution, by virtue of inclusion or exclusion. Finally, the end product must be presented such that appropriate actions can be taken. This could include prosecution by law enforcement, policy initiatives on the part of legislative bodies, or military action in the case of nuclear attack (whether that attack is preempted or not). Using the discovery of a sample of plutonium during cleanup activities at Hanford in 2004, we will step through the process of discovery (representing an interdiction), initial field analysis, laboratory analysis, data evaluation and merging with additional data (similar to law enforcement and/or all source), thereby providing an example of an integrated approach.

The Scientific Working Group on Forensic Analysis of Chemical Terrorism (SWGFACT) has developed the following guidelines for laboratories engaged in the forensic analysis of chemical evidence associated with terrorism. This document provides a baseline framework and guidance for...

Many in digital forensics seem to forget that the science part of digital forensics means experimentation, and that implies a whole lot of things that most practitioners never learned. (See PDF for full column.)

The most exciting initiative in recent polar studies was the International Polar Year (IPY) in 2007-2008. The IPY witnessed a growing community of seismologists who made considerable efforts to acquire high-quality data in polar regions. It also provided an excellent opportunity to make significant advances in seismic instrumentation of the polar regions to achieve scientific targets involving global issues. Taking these aspects into account, we organized and published a special issue in Polar Science on recent advances in polar seismology and cryoseismology as fruitful achievements of the IPY.

The EC research infrastructure project NERIES, an Integrated Infrastructure Initiative in seismology for 2006-2010, has passed its mid-term point. We present a short, concise overview of the current state of the project, the established cooperation with other European and global projects, and the planning for the last year of the project. Earthquake data archiving and access within Europe have improved dramatically during the last two years. This concerns earthquake parameters, digital broadband and acceleration waveforms, and historical data. The Virtual European Broadband Seismic Network (VEBSN) currently consists of more than 300 stations. A new distributed data archive concept, the European Integrated Waveform Data Archive (EIDA), has been implemented in Europe, connecting the larger European seismological waveform data archives. Global standards for earthquake parameter data (QuakeML) and tomography models have been developed and are being established. Web application technology has been and is being developed to jump-start the next generation of data services. A NERIES data portal provides a number of services testing the potential capacities of new open-source web technologies. Data application tools like shakemaps, lossmaps, site response estimation and tools for data processing and visualisation are currently available, although some of these tools are still in an alpha version. A European tomography reference model will be discussed at a special workshop in June 2009. Shakemaps, coherent with the NEIC application, are implemented in several countries, among them Turkey, Italy, Romania and Switzerland. The comprehensive site response software is being distributed and used both inside and outside the project. NERIES organises several workshops inviting both consortium and non-consortium participants and covering a wide range of subjects: 'Seismological observatory operation tools', 'Tomography', 'Ocean bottom observatories', 'Site response software training

Liquefaction has been a source of major damage during severe earthquakes. To evaluate this phenomenon there are several stress-, strain- and energy-based approaches. The energy method has attracted particular attention from researchers because of its advantages over the other approaches. The use of the energy concept to define liquefaction potential has been validated through laboratory element and centrifuge tests as well as field studies. This approach is based on the hypothesis that pore pressure buildup is directly related to the energy dissipated in sands, which is the accumulated area enclosed by the stress-strain loops. Numerous investigations have sought a relationship correlating the dissipated energy to soil parameters, but there are not sufficient studies relating this dissipated energy, known as the demand energy, concurrently to both seismological and soil parameters. The aim of this paper is to investigate the dependency of the demand energy in sands on seismological and soil parameters. To perform this task, an effective stress analysis was executed using the FLAC finite difference program, with the Finn model, a built-in constitutive model implemented in FLAC. Since an important stage in predicting liquefaction is predicting the excess pore water pressure at a given point, a simple numerical framework is presented to assess its generation during cyclic loading in a given centrifuge test. According to the results, predicted excess pore water pressures did not closely match the values measured in the centrifuge test, but they can be used in the numerical assessment of excess pore water pressure with an acceptable degree of precision. Subsequently, the centrifuge model was reanalyzed using several real earthquake acceleration records with different seismological parameters such as earthquake magnitude and hypocentral distance. The accumulated energies (demand energy) dissipated in
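The bookkeeping at the core of the energy approach, dissipated energy as the accumulated area enclosed by the stress-strain loops, amounts to a line integral of stress with respect to strain around each closed cycle. A generic sketch of that computation using the trapezoidal rule (an illustration of the concept, not the Finn model or the paper's FLAC analysis):

```python
def dissipated_energy(stress, strain):
    """Energy dissipated per unit volume over one closed stress-strain
    loop: the enclosed loop area, computed as the trapezoidal line
    integral of stress with respect to strain. The result carries the
    units of stress (e.g. kPa), since strain is dimensionless."""
    n = len(stress)
    total = 0.0
    for i in range(n):
        j = (i + 1) % n  # wrap around so the path closes
        total += 0.5 * (stress[i] + stress[j]) * (strain[j] - strain[i])
    return abs(total)

# A unit-square loop in stress-strain space encloses an area of 1
print(dissipated_energy([0.0, 0.0, 1.0, 1.0], [0.0, 1.0, 1.0, 0.0]))  # -> 1.0
```

Summing this quantity over all cycles of a loading record gives the accumulated demand energy that the approach correlates with pore pressure buildup; a purely elastic (non-hysteretic) path encloses zero area and dissipates nothing.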

The Hellenic arc and the adjacent areas of the Greek mainland are the most active in western Eurasia and among the most seismically active zones of the world. The seismicity of the South Aegean is extremely high and is characterised by the frequent occurrence of large shallow and intermediate-depth earthquakes. Until 2004, the installed seismological stations from several providers (NOA, GEOFON, MEDNET) provided an average interstation distance of around 130 km, resulting in catalogues with a minimum magnitude of completeness (Mc) of 3.7. To provide dense, state-of-the-art instrumental coverage of seismicity in the South Aegean, the HSNC began operation in 2004. Today it consists of twelve (12) permanent seismological stations equipped with short-period and broadband seismographs coupled with third-generation 24-bit data loggers, as well as two (2) accelerographs. The addition of the HSNC, along with the combined use of all the active networks in the South Aegean area (NOA, GEOFON, AUTH), decreases the average interstation distance to 60 km and provides catalogues with Mc ≥ 3.2. Data transmission and telemetry are implemented by a hybrid network consisting of dedicated wired ADSL links as well as VSAT links using a unique private satellite hub. Real-time data are shared with collaborating networks (AUTH) and laboratories (Department of Earth Sciences, UCL), while events are appended automatically and manually to the EMSC database. Additional value is provided by prototype systems deployed in situ for the purposes of: a) acquiring aftershock data in the minimum time after a main event, using a mobile seismological network called RaDeSeis (Rapid Deployment Seismological network), which consists of a central station acting also as the central communication hub and wifi-coupled mobile stations; and b) the development of dedicated hardware and software solutions for rapid installation times (around 1 hour for each station) leading to

The proliferation of mobile communication and computing devices, in particular smart mobile phones, is almost paralleled by the increasing number of mobile device forensics tools in the market. Each mobile forensics tool vendor, on one hand, claims to have a tool that is best in terms of performance, while on the other hand each tool vendor seems to use different standards for testing its tools, thereby defining what "support" means differently. To overcome this problem, a testing framework is proposed, based on a series of tests ranging from basic forensics tasks such as file system reconstruction up to more complex ones countering anti-forensic techniques. The framework, which is an extension of an existing effort from 2010, prescribes a method to clearly circumscribe the term "support" into precise levels. It also gives an idea of the standard to be developed and accepted by the forensic community, which will make it easier for forensic investigators to quickly select the most appropriate tool for a particular mobile device.

Forensic podiatry is a small, but potentially useful specialty using clinical podiatric knowledge for the purpose of person identification. The practice of forensic podiatry began in the early 1970s in Canada and the UK, although supportive research commenced later, in the 1990s. Techniques of forensic podiatry include identification from podiatry records, the human footprint, footwear, and the analysis of gait forms captured on closed-circuit television cameras. The most valuable techniques relate to the comparison of the foot impressions inside shoes. Tools to describe, measure and compare foot impressions with footwear wear marks have been developed through research, with potential for further development. The role of forensic podiatrists is of particular value when dealing with variable factors relating to the functioning foot and the shod foot. Case studies demonstrate the approach of podiatrists in footwear identification when comparing exemplar with questioned foot impressions. Forensic podiatry practice should be approached cautiously, and it is essential for podiatrists undertaking this type of work to understand the context within which the process of person identification takes place.

Several forensic sciences, especially of the pattern-matching kind, are increasingly seen to lack the scientific foundation needed to justify continuing admission as trial evidence. Indeed, several have been abolished in the recent past. A likely next candidate for elimination is bitemark identification. A number of DNA exonerations have occurred in recent years for individuals convicted based on erroneous bitemark identifications. Intense scientific and legal scrutiny has resulted. An important National Academies review found little scientific support for the field. The Texas Forensic Science Commission recently recommended a moratorium on the admission of bitemark expert testimony. The California Supreme Court has a case before it that could start a national dismantling of forensic odontology. This article describes the (legal) basis for the rise of bitemark identification and the (scientific) basis for its impending fall. The article explains the general logic of forensic identification, the claims of bitemark identification, and reviews relevant empirical research on bitemark identification—highlighting both the lack of research and the lack of support provided by what research does exist. The rise and possible fall of bitemark identification evidence has broader implications—highlighting the weak scientific culture of forensic science and the law's difficulty in evaluating and responding to unreliable and unscientific evidence. PMID:28852538

Apart from an early case report from China (13th century), the first observations on insects and other arthropods as forensic indicators were documented in Germany and France during mass exhumations in the 1880s by Reinhard, who is considered a co-founder of the discipline. After the French publication of Mégnin's popular book on the applied aspects of forensic entomology, the concept quickly spread to Canada and the United States. At that time, researchers recognized that the lack of systematic observations of insects of forensic importance jeopardized their use as indicators of postmortem interval. General advances in insect taxonomy and ecology helped to fill this gap over the following decades. After the World Wars, few forensic entomology cases were reported in the scientific literature. From the 1960s to the 1980s, Leclercq and Nuorteva were primarily responsible for maintaining the method in Central Europe, reporting isolated cases. Since then, basic research in the USA, Russia and Canada opened the way to the routine use of entomology in forensic investigations. Identifications of insects associated with human cadavers are relatively few in the literature of the Neotropical region and have received little attention in Brazil. This article brings an overview of historic developments in this field, the recent studies, and the main problems and challenges in South America, mainly in Brazil.

The need for computer intrusion forensics arises from the alarming increase in the number of computer crimes that are committed annually. After a computer system has been breached and an intrusion has been detected, a computer forensics investigation must follow. Computer forensics is used to bring to justice those responsible for conducting attacks on computer systems throughout the world. Because of this, the law must be followed precisely when conducting a forensics investi...

Forensic botany is the study of plant evidence in judicial matters. Recently, research on DNA labeling technology has become a mainstream of forensic botany. The article systematically reviews the various types of DNA labeling techniques used in forensic botany, with practical cases enumerated, as well as the potential forensic application of each technique. The advantages of DNA labeling technology over traditional morphological taxonomic methods are also summarized.

Digital forensic readiness enables an organization to prepare itself to perform digital forensic investigations in an efficient and effective manner. The benefits include enhancing the admissibility of digital evidence, better utilization of resources and greater incident awareness. However, a harmonized process model for digital forensic readiness does not currently exist and, thus, there is a lack of effective and standardized implementations...

Introduction: Forensic odontology or forensic dentistry is that aspect of forensic science that uses the application of dental science for the identification of unknown human remains and bite marks. Deaths resulting from mass disasters such as plane crashes or fire incidents have always been given mass burial in Nigeria.

Journal of the American Academy of Child & Adolescent Psychiatry, 2011

2011-01-01

This Parameter addresses the key concepts that differentiate the forensic evaluation of children and adolescents from a clinical assessment. There are ethical issues unique to the forensic evaluation, because the forensic evaluator's duty is to the person, court, or agency requesting the evaluation, rather than to the patient. The forensic…

The article provides a history of efforts to develop a credentialing or certification process for forensic interviewers and reviews the multitiered credentialing process offered by the National Association of Certified Child Forensic Interviewers. The authors argue the benefits of a credentialing process for forensic interviewers and respond to…

This paper was based on a survey of the knowledge of forensic odontology among professionals in medicine, dentistry, law and law enforcement. The results show a low level of knowledge of forensic odontology among these professionals. It is recommended that forensic odontology be introduced as a course in dental ...

Forensic Face Recognition (FFR) is the use of biometric face recognition for several applications in forensic science. Biometric face recognition uses the face modality as a means to discriminate between human beings; forensic science is the application of science and technology to law

This paper deals with forensically interesting features of the Sony PlayStation 3 game console. The construction and the internal structure are analyzed in detail. Interesting forensic features of the operating system and the file system are presented. Differences between a PS3 with and without a jailbreak are introduced, and possible forensic approaches when using an installed Linux are discussed.

The proper identification of the insect and arthropod species of forensic importance is the most crucial element in the field of forensic entomology. The main objective of this study was the identification of insect species of forensic importance in Urmia (37°33′ N, 45°4′45″ E) and the establishment of a preliminary ...

"Forensics," in its most universal sense, is defined as the use of science or technology in the investigation and establishment of facts or evidence for determining identity or relatedness. Most forensic reasoning is used for arguing legal matters. However, forensic studies are also used in agronomy, biology, chemistry, geology, and…

This study examines the relationship between intercollegiate forensics competitors' organizational identification and organizational culture. Through a survey analysis of 314 intercollegiate forensics students, this study reports three major findings. First, this study found male competitors identify with forensics programs more than female…

Schroedinger's original quantization procedure is revisited in the light of Nelson's stochastic framework of quantum mechanics. It is clarified why Schroedinger's proposal of a variational problem led to a true description of quantum mechanics.

In recent decades a new profession has developed--clinical criminology. The purpose of this article is to highlight its development. Criminology is defined as an interdisciplinary super-profession. We tend to view criminology as a basic profession with a number of specializations. Clinical criminology is one of these specializations. Forensic psychiatry and clinical criminology have common roots in psychiatry, law and the behavioural sciences. They overlap in some fields. Members of both professions work in the same setting and share some of the tasks, but the formal and professional responsibilities differ significantly. We perceive clinical criminology and forensic psychiatry as complementary professions belonging to medicine. The multidisciplinary educated clinical criminologist is the only professional in the forensic system who is qualified to mediate between mental health and legal experts.

This review explores the key factors that influence the optimization, routine use, and profile interpretation of the SNaPshot single-base extension (SBE) system applied to forensic single-nucleotide polymorphism (SNP) genotyping. Despite being a DNA genotyping technique mainly complementary to routine STR profiling, use of SNaPshot is an important part of the development of SNP sets for a wide range of forensic applications with these markers, from genotyping highly degraded DNA with very short amplicons to the introduction of SNPs to ascertain the ancestry and physical characteristics of an unidentified contact trace donor. However, this technology, as resourceful as it is, displays several features that depart from the usual STR genotyping far enough to demand a certain degree of expertise from the forensic analyst before tackling the complex casework on which SNaPshot application provides...

Documentation and evaluation of dental injuries in forensic medicine are rather problematic. A professional work-up is needed of why dental injuries are out of focus, and of how diagnosis, injury patterns and treatment are influenced by novel approaches in dentistry. The aims of the authors were to characterize dental injuries, to compare their own findings with literature data concerning the type and characteristics of injuries, and to propose a diagnostic workflow. Experts' reports between 2009 and 2013 at the Department of Forensic Medicine, University of Szeged were reviewed. Review of about 7000 reports revealed only 20 cases with dental injury, which is in contrast with literature data indicating a significantly higher frequency of dental injuries. Although the number of "dental cases" was low, there were several additional cases where the trauma probably affected the teeth but the injury was not documented. In the future, more attention is needed in the forensic evaluation of the mechanism, therapeutic strategy and prognosis of dental injuries.

Despite increased attention to internal controls and risk assessment, traditional audit approaches do not seem to be highly effective in uncovering the majority of frauds. Less than 20 percent of all occupational frauds are uncovered by auditors. Forensic accounting has recognized the need for automated approaches to fraud analysis, yet research has not examined the benefits of forensic continuous auditing as a method to detect and deter corporate fraud. The purpose of this paper is to show how such an approach is possible. A model is presented that supports the acceptance of forensic continuous auditing by auditors and management as an effective tool to support the audit function, meet management's regulatory objectives, and combat fraud. An approach to developing such a system is presented.

The interpretation of data from the nuclear forensic analysis of illicit nuclear material of unknown origin requires comparative data from samples of known origin. One way to provide such comparative data is to create a system of national nuclear forensics libraries, in which each participating country stores information about nuclear or other radioactive material that either resides in or was manufactured by that country. Such national libraries could provide an authoritative record of the material located in or produced by a particular country, and thus form an essential prerequisite for a government to investigate illicit uses of nuclear or other radioactive material within its borders. We describe the concept of the national nuclear forensic library, recommendations for content and structure, and suggested querying methods for utilizing the information for addressing nuclear smuggling.

The purpose of this article is to develop a comprehensive process for identifying and addressing primarily ethical issues related to the psychology profession in South Africa. In fulfilling this purpose, research was conducted on relevant ethical and, to a lesser extent, legal aspects pertaining to the psychology profession. In an attempt to prevent unprofessional conduct claims against psychologists from succeeding and to alert psychologists to the concurrent ethical problems that may lead to malpractice suits, this article offers material on some important issues - in the context of forensic psychology - such as ethical decision-making and principles, professional ethics, the regulation of psychology as a profession, the Ethical Code of Professional Conduct to which a psychologist should adhere, ethical aspects and issues pertaining to forensic psychology in general, some ethical issues pertaining to child forensic psychology, summary guidelines for ethical decision-making, and some steps to follow to ensure sound ethical decision-making.

The author's thoughts and opinions on where the field of forensic DNA testing is headed for the next decade are provided in the context of where the field has come over the past 30 years. Similar to the Olympic motto of ‘faster, higher, stronger’, forensic DNA protocols can be expected to become more rapid and sensitive and provide stronger investigative potential. New short tandem repeat (STR) loci have expanded the core set of genetic markers used for human identification in Europe and the USA. Rapid DNA testing is on the verge of enabling new applications. Next-generation sequencing has the potential to provide greater depth of coverage for information on STR alleles. Familial DNA searching has expanded capabilities of DNA databases in parts of the world where it is allowed. Challenges and opportunities that will impact the future of forensic DNA are explored including the need for education and training to improve interpretation of complex DNA profiles. PMID:26101278

Forensic radiology is a specialized area of medical imaging utilizing radiological techniques to assist physicians and pathologists in matters pertaining to the law. Postmortem dental radiographs are the most consistent part of the antemortem records that can be transmitted during forensic examination procedures. Pathologists regularly use radiographic images during the course of autopsy to assist them in the identification of foreign bodies or determination of death. Forensic radiology can be used in suspicious death or murder, in the analysis of adverse medical events, in solving legal matters, and to detect child abuse, drug trafficking, body identity and disease. Using the possibilities of radiology, special characteristics of the internal structures of the dentomaxillofacial region can be revealed. We can also detect endodontic treatments, healing extraction sockets, implants or even tooth-colored restorations. Therefore, we can give answers to problems dealing with identification procedures, mass disasters and dental age estimation.

Purpose - The overall purpose of this study is to explore tourists' perceptions and their intention to revisit Norway. The aim is to find out which factors drive the overall satisfaction, the willingness to recommend, and the revisit intention of international tourists who spend their holiday in Norway. Design-Method-Approach - The Theory of Planned Behavior (Ajzen 1991) is used as a framework to investigate tourists' intention and behavior towards Norway as a destination. The o...

With the proliferation of digital-based evidence, the need for the timely identification, analysis and interpretation of digital evidence is becoming more crucial. In many investigations critical information is required while at the scene or within a short period of time - measured in hours as opposed to days. The traditional cyber forensics approach of seizing a system(s)/media, transporting it to the lab, making a forensic image(s), and then searching the entire system for potential evidence, is no longer appropriate in some circumstances. In cases such as child abductions, pedophiles, missing or exploited persons, time is of the essence. In these types of cases, investigators dealing with the suspect or crime scene need investigative leads quickly; in some cases it is the difference between life and death for the victim(s). The Cyber Forensic Field Triage Process Model (CFFTPM) proposes an onsite or field approach for providing the identification, analysis and interpretation of digital evidence in a short time frame, without the requirement of having to take the system(s)/media back to the lab for an in-depth examination or acquiring a complete forensic image(s). The proposed model adheres to commonly held forensic principles, and does not negate the ability that once the initial field triage is concluded, the system(s)/storage media be transported back to a lab environment for a more thorough examination and analysis. The CFFTPM has been successfully used in various real world cases, and its investigative importance and pragmatic approach has been amply demonstrated. Furthermore, the derived evidence from these cases has not been challenged in the court proceedings where it has been introduced. The current article describes the CFFTPM in detail, discusses the model's forensic soundness, investigative support capabilities and practical considerations.

Age determination of unknown human bodies is important in the setting of a crime investigation or a mass disaster because the age at death, birth date, and year of death as well as gender can guide investigators to the correct identity among a large number of possible matches. Traditional morphological methods used by anthropologists to determine age are often imprecise, whereas chemical analysis of tooth dentin, such as aspartic acid racemization, has shown reproducible and more precise results. In this study, we analyzed teeth from Swedish individuals using both aspartic acid racemization and radiocarbon methodologies. The rationale behind using radiocarbon analysis is that aboveground testing of nuclear weapons during the Cold War (1955–1963) caused an extreme increase in global levels of carbon-14 (14C), which has been carefully recorded over time. Forty-four teeth from 41 individuals were analyzed using aspartic acid racemization analysis of tooth crown dentin or radiocarbon analysis of enamel, and 10 of these were split and subjected to both radiocarbon and racemization analysis. Combined analysis showed that the two methods correlated well (R2 = 0.66, p < …). Aspartic acid racemization also showed good precision, with an overall absolute error of 5.4 ± 4.2 years. Whereas radiocarbon analysis gives an estimated year of birth, racemization analysis indicates the chronological age of the individual at the time of death. We show how these methods in combination can also assist in the estimation of date of death of an unidentified victim. This strategy can be of significant assistance in forensic casework involving dead victim identification. PMID:19965905
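The racemization side of this approach can be sketched with the standard first-order reversible kinetics used for aspartic acid in dentin. The rate constant and intercept below are hypothetical placeholders, since real values must be calibrated against teeth of known age for the specific tooth type and laboratory protocol:

```python
import math

def racemization_age(d_l_ratio, k=0.000832, intercept=0.0232):
    """Estimate chronological age (years) from the D/L aspartic acid ratio
    in crown dentin, assuming first-order reversible kinetics:
        ln[(1 + D/L) / (1 - D/L)] = 2*k*age + intercept
    k and intercept are illustrative calibration constants only; they must
    be fitted per tooth type and laboratory protocol."""
    lhs = math.log((1.0 + d_l_ratio) / (1.0 - d_l_ratio))
    return (lhs - intercept) / (2.0 * k)

age = racemization_age(0.05)  # about 46 years with these example constants
```

Radiocarbon bomb-pulse dating of the enamel then pins down the year of formation, so the two estimates can be cross-checked as in the study.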

A seismological and geological investigation of earthquake hazard in the Greater Accra Metropolitan Area (GAMA) was undertaken. The research aimed at employing a mathematical model to estimate the seismic stress for the study area by generating a complete, unified and harmonized earthquake catalogue spanning 1615 to 2012. Seismic events were sourced from Leydecker, G. and P. Amponsah (1986), Ambraseys and Adams (1986), Amponsah (2008), the Geological Survey Department, Accra, Ghana, Amponsah (2002), the National Earthquake Information Service, United States Geological Survey, Denver, Colorado 80225, USA, the International Seismological Centre, and the National Data Centre of the Ghana Atomic Energy Commission. Events occurring in the study area were used to create an epicentral intensity map and a seismicity map of the study area after interpolation of missing seismic magnitudes. The least squares method and the maximum likelihood estimation method were employed to evaluate b-values of 0.6 and 0.9, respectively, for the study area. A thematic map of epicentral intensity superimposed on the geology of the study area was also developed to help understand the relationship between the fractured, jointed and sheared geology and the seismic events. The results indicate that the stress level of GAMA has a telling effect on its seismicity and that events are prevalent at fractured, jointed and sheared zones.

The CTBTO Link to the database of the International Seismological Centre (ISC) is a project to provide access to seismological data sets maintained by the ISC using specially designed interactive tools. The Link is open to National Data Centres and to the CTBTO. By means of graphical interfaces and database queries tailored to the needs of the monitoring community, the users are given access to a multitude of products. These include the ISC and ISS bulletins, covering the seismicity of the Earth since 1904; nuclear and chemical explosions; the EHB bulletin; the IASPEI Reference Event list (ground truth database); and the IDC Reviewed Event Bulletin. The searches are divided into three main categories: the Area Based Search (a spatio-temporal search based on the ISC Bulletin), the REB Search (a spatio-temporal search based on specific events in the REB) and the IMS Station Based Search (a search for historical patterns in the reports of seismic stations close to a particular IMS seismic station). The outputs are HTML-based web pages with a simplified version of the ISC Bulletin showing the most relevant parameters, with access to the ISC, GT, EHB and REB bulletins in IMS1.0 format for single or multiple events. The CTBTO Link offers a tool to view REB events in the context of historical seismicity, look at observations reported by non-IMS networks, and investigate station histories and residual patterns for stations registered in the International Seismographic Station Registry.

The GINGERino ring laser gyroscope (RLG) is a new large observatory-class RLG located in the Gran Sasso underground laboratory (LNGS), a national laboratory of the INFN (Istituto Nazionale di Fisica Nucleare). The GINGERino apparatus, funded by INFN in the context of a larger project of fundamental physics, is intended as a pathfinder instrument to reach the high sensitivity needed to observe general relativity effects; more details are found at https://web2.infn.it/GINGER/index.php/it/. The sensitivity reached by our instrument in the first year after setup permitted us to acquire important seismological data of ground rotations during the transit of seismic waves generated by earthquakes at different epicentral distances. RLGs are in fact the best sensors for capturing the rotational motions associated with the transit of seismic waves; thanks to the optical measurement principle, these instruments are insensitive to translations. Ground translations are recorded by two seismometers, a Nanometrics Trillium 240 s and a Guralp CMG 3T 360 s; the first instrument is part of the national earthquake monitoring program of the Istituto Nazionale di Geofisica e Vulcanologia (INGV) and provides the ground translation data to be compared with the RLG rotational data. We report the waveforms and the seismological analysis of some seismic events recorded during our first year of activity inside the LNGS laboratory.
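The rotational sensitivity of such an instrument comes from the Sagnac effect: Earth's rotation (plus any seismic rotation) produces a beat frequency between the counter-propagating beams. A back-of-the-envelope sketch follows; the square-ring geometry, HeNe wavelength and site latitude are assumptions for illustration, not measured parameters from the paper:

```python
import math

def sagnac_frequency(side_m, wavelength_m, omega_rad_s, latitude_deg):
    """Sagnac beat frequency of a horizontal square ring laser:
        f = (4*A / (lambda*P)) * Omega * sin(latitude)
    i.e. Earth's rotation vector projected on the local vertical
    (the normal of a horizontally mounted ring)."""
    area = side_m ** 2
    perimeter = 4.0 * side_m
    scale = 4.0 * area / (wavelength_m * perimeter)
    return scale * omega_rad_s * math.sin(math.radians(latitude_deg))

# Illustrative numbers: 3.6 m square ring, HeNe at 632.8 nm,
# Earth rotation rate, Gran Sasso latitude ~42.4 deg
f = sagnac_frequency(3.6, 632.8e-9, 7.292115e-5, 42.4)  # roughly 280 Hz
```

Seismic rotations appear as small modulations of this carrier frequency, which is why translations leave the measurement unaffected.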

We present obspyDMT, a free, open-source software toolbox for the query, retrieval, processing and management of seismological data sets, including very large, heterogeneous and/or dynamically growing ones. ObspyDMT simplifies and speeds up user interaction with data centers, in more versatile ways than existing tools. The user is shielded from the complexities of interacting with different data centers and data exchange protocols and is provided with powerful diagnostic and plotting tools to check the retrieved data and metadata. While primarily a productivity tool for research seismologists and observatories, easy-to-use syntax and plotting functionality also make obspyDMT an effective teaching aid. Written in the Python programming language, it can be used as a stand-alone command-line tool (requiring no knowledge of Python) or can be integrated as a module with other Python codes. It facilitates data archiving, preprocessing, instrument correction and quality control - routine but nontrivial tasks that can consume much user time. We describe obspyDMT's functionality, design and technical implementation, accompanied by an overview of its use cases. As an example of a typical problem encountered in seismogram preprocessing, we show how to check for inconsistencies in response files of two example stations. We also demonstrate the fully automated request, remote computation and retrieval of synthetic seismograms from the Synthetics Engine (Syngine) web service of the Data Management Center (DMC) at the Incorporated Research Institutions for Seismology (IRIS).

Urban seismology has become an active research field in recent years, both with seismological objectives, such as obtaining better microzonation maps in highly populated areas, and with engineering objectives, such as the monitoring of traffic or the surveying of historical buildings. We analyze here the seismic records obtained by a broad-band seismic station installed in the ICTJA-CSIC institute, located near the center of Barcelona. Although this station was installed to introduce visitors to earth science during science fairs and other dissemination events, the analysis of the data has allowed us to infer results of interest for the scientific community. The main results include the evidence that urban seismometers can be used as an easy-to-use, robust monitoring tool for road traffic and subway activity inside the city. Seismic signals generated by different cultural activities, including rock concerts, fireworks or football games, can be detected and discriminated by their seismic properties. Besides the interest in understanding the propagation of seismic waves generated by those rather particular sources, those earth-shaking records provide a powerful tool to gain visibility in the mass media and hence an opportunity to present earth sciences to a wider audience.
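Discriminating traffic, subway or concert signals by their seismic properties amounts, in the simplest case, to comparing spectral power across frequency bands. A minimal numpy sketch follows; the synthetic record and band limits are illustrative assumptions, not the station's actual processing:

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Mean power of `signal` in the band [f_lo, f_hi] Hz,
    from a one-sided periodogram (numpy FFT)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return psd[mask].mean()

# Synthetic "urban record": a 5 Hz cultural-noise tone buried in white noise
rng = np.random.default_rng(0)
fs = 100.0
t = np.arange(0, 60, 1.0 / fs)
x = np.sin(2 * np.pi * 5.0 * t) + 0.1 * rng.standard_normal(t.size)

# The 4-6 Hz band clearly dominates the 20-30 Hz band for this signal
low, high = band_power(x, fs, 4.0, 6.0), band_power(x, fs, 20.0, 30.0)
```

Real classification would compare such band powers (or full spectrograms) against the characteristic signatures of each source type and time of day.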

Flare-excited longitudinal intensity oscillations in hot flaring loops have recently been detected by SDO/AIA in the 94 and 131 Å bandpasses. Based on the interpretation in terms of a slow-mode wave, quantitative evidence of thermal conduction suppression in hot (>9 MK) loops has been obtained for the first time from measurements of the polytropic index and the phase shift between the temperature and density perturbations (Wang et al. 2015, ApJL, 811, L13). This result has significant implications in two aspects. One is that the thermal conduction suppression suggests the need for greatly enhanced compressive viscosity to interpret the observed strong wave damping. The other is that the conduction suppression provides a reasonable mechanism for explaining long-duration events, where the thermal plasma is sustained well beyond the duration of the impulsive hard X-ray bursts in many flares, for a time much longer than expected from classical Spitzer conductive cooling. In this study, we model the observed standing slow-mode wave of Wang et al. (2015) using a 1D nonlinear MHD code. With the seismology-derived transport coefficients for thermal conduction and compressive viscosity, we successfully reproduce the oscillation period and damping time of the observed waves. Based on a parametric study of the effect of thermal conduction suppression and viscosity enhancement on the observables, we discuss an inversion scheme for determining the energy transport coefficients by coronal seismology.
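The polytropic-index diagnostic mentioned above rests on a simple relation between the temperature and density perturbations of a slow-mode wave; a hedged sketch of the relations involved (standard notation, not taken from the paper itself) is:

```latex
% Polytropic relation between the perturbation amplitudes:
\frac{\delta T}{T_0} = \left(\gamma_{\mathrm{eff}} - 1\right)\frac{\delta \rho}{\rho_0}
% gamma_eff -> 5/3 when thermal conduction is negligible (adiabatic limit),
% gamma_eff -> 1 when conduction is very efficient (isothermal limit).
% Classical field-aligned (Spitzer) conductivity, whose suppression is inferred:
\kappa_\parallel \approx \kappa_0\, T^{5/2}, \qquad
\kappa_0 \sim 10^{-11}\ \mathrm{W\,m^{-1}\,K^{-7/2}}
```

A measured effective index close to 5/3, together with a near-zero phase shift between the temperature and density perturbations, is what signals suppressed conduction in these hot loops.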

Citizen seismology encourages public involvement in data collection, analysis, and reporting, and has the potential to greatly improve the emergency response to seismic hazard. The resulting dense network is, of course, also important for scientific achievement. Believing in the value of citizen seismology, we started by distributing Quake-Catcher Network (QCN) sensors at schools in Taiwan. While working with teachers, we hoped to motivate the learning of how to read seismograms, what to see in the data, and what to teach in class. Through many workshops and activities, including a near-real-time earthquake game competition and a board game (Quake-nopoly) developed along the way, we came to realize the huge gap between what people need and what we do. To bridge that gap, a new generation of citizen seismic network is needed. Imagine that at work you receive an alarm from sensors at home telling you the location, size, and type of anomalous shaking events in the neighborhood. Can this future "warning" system happen, allowing citizens to perform emergency response? This is a story about facing the challenge, transforming the doubt of "why do I care" into a future IoT world.

This survey of the educational offerings in the Forensic Sciences was initiated to identify institutions and agencies offering educational courses and/or programs in the forensic sciences and to evaluate the availability of these programs. The information gathered by surveying members of the American Academy of Forensic Sciences reveals that…

This paper describes the main areas of civil forensic psychiatry (FP) and the skills required by psychiatric experts. Some specific areas of civil FP are discussed, including tort law reform, reliability of psychiatric evidence, contentious psychiatric disorders, and the many domains of civil FP. Civil FP is an important sub-specialty component of forensic psychiatry that requires greater emphasis in the training and continuing education of psychiatrists. A process of accrediting psychiatrists as having competency in advanced civil FP may be of value.

Managing aggression in mental health hospitals is an important and challenging task for clinical nursing staff. A majority of studies focus on the perspective of clinicians, and research mainly depicts aggression by referring to patient-related factors. This qualitative study investigates how aggression is communicated in forensic mental health nursing records. The aim of the study was to gain insight into the discursive practices used by forensic mental health nursing staff when they record observed aggressive incidents. Textual accounts were extracted from the Staff Observation Aggression Scale…

The spread of navigation devices has increased significantly over the last 10 years. With the current development of ever smaller receiver units, it is now possible to navigate with almost any current smartphone. Modern navigation systems are no longer limited to satellite navigation but use additional techniques, e.g. WLAN localization. With the increased use of navigation devices, their relevance to forensic investigations has risen rapidly: because navigation with dedicated devices and smartphones has become commonplace, the amount of saved navigation data has grown equally fast. These developments make a forensic analysis of such devices necessary, yet there are very few current procedures for investigating navigation devices. Navigation data is forensically interesting because, from the positions recorded by a device, the location and traveled path of its owner can in most cases be reconstructed. In this work, practices for the forensic analysis of navigation devices are developed. Different devices are analyzed, and an attempt is made, by means of forensic procedures, to restore the traveled path of the mobile device. Different software and hardware are used for the analysis of the various devices. Common procedures for securing and examining mobile devices are presented, along with the particulars of investigating each device. The device classes considered are GPS handhelds, mobile navigation devices, and smartphones. Wherever possible, an attempt is made to read all data from the device, with the aim of restoring complete histories of the navigation data and of forensically studying and analyzing these data. This is realized using current forensic software, e.g. TomTology or Oxygen Forensic Suite; free software is used whenever possible. Further, alternative methods (e.g. rooting) are used to access locked data on the unit. To limit the practical work the

Children can be credible witnesses in court proceedings, given an adequately conducted forensic interview. This paper presents the most important features of a child's development (cognitive and socioemotional development and the development of language and communication) and derives from these features specific guidelines for forensic interviews of children. Due to the frequent belief that children can be led into false testimony and that they do not differentiate between reality and fantasy, the topics of lying and suggestibility are also discussed. Finally, some practical suggestions are given, with recommendations for the training of all professionals working with children who are potential witnesses.

An accurate and precise documentation of injuries is fundamental in a forensic pathological context. Photographs and manual measurements are taken of all injuries during autopsies, but ordinary photography projects a 3D wound onto a 2D space. Using technologies such as photogrammetry, it is possible… methods (p > 0.05). The results of intra- and inter-observer tests indicated perfect agreement between the observers, with mean value differences of ≤ 0.02 cm. This study demonstrated the validity of using photogrammetry for documentation of injuries in a forensic pathological context. Importantly…

Forensic toxicology has to provide evidence of substances that could have been involved, directly or indirectly, in a cause of death or that could have influenced a person's behaviour. The increase in the consumption of illegal and legal drugs in modern societies during the last decades gave a boost to forensic toxicology. Moreover, improvements in analytical technology provided tools with high degrees of sensitivity and specificity for the screening and quantification of a large number of substances in various biological specimens, even at the very low concentrations resulting from a single dose of medication.

Memory forensics is a branch of computer forensics. It does not depend on the operating-system API but instead analyzes operating-system information directly from binary memory data. Based on the 64-bit Linux operating system, this work analyzes system process and thread information from physical memory data. Using ELF file debugging information, we propose a method for locating kernel structure member variables that can be applied to different versions of the Linux operating system. The experimental results show that the method can successfully obtain system process information from physical memory data and is compatible with multiple versions of the Linux kernel.
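
The core idea of version-independent member lookup can be sketched in a few lines of Python. This is an illustrative toy, not the paper's implementation: the offset table stands in for values that would be recovered from ELF/DWARF debug info, and the "memory image" is a synthetic buffer.

```python
import struct

# Hypothetical member offsets, as they might be recovered from ELF/DWARF
# debug info for one particular kernel build (values are illustrative only).
TASK_STRUCT_OFFSETS = {"pid": 0x0, "comm": 0x8}

def read_task(memory: bytes, base: int, offsets: dict) -> dict:
    """Decode selected task_struct-like members from a raw memory image."""
    pid = struct.unpack_from("<I", memory, base + offsets["pid"])[0]
    raw = struct.unpack_from("16s", memory, base + offsets["comm"])[0]
    comm = raw.split(b"\x00", 1)[0].decode("ascii", "replace")
    return {"pid": pid, "comm": comm}

# A tiny synthetic "memory image" containing one fake task record:
# 4-byte pid, 4 bytes of padding, then a 16-byte NUL-padded name.
image = struct.pack("<I4x16s", 1234, b"bash")
print(read_task(image, 0, TASK_STRUCT_OFFSETS))  # {'pid': 1234, 'comm': 'bash'}
```

Swapping in a different offset table per kernel version is what makes the approach portable across builds.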

A brief survey of the history of the invention and development of super-large laser gyroscopes (SLLGs) is presented. The basic results achieved using SLLGs in geodesy, seismology, fundamental physics and other fields are summarised. The concept of SLLG design, specific features of construction and implementation are considered, as well as the prospects of applying the present-day optical technologies to laser gyroscope engineering. The possibilities of using fibre-optical gyroscopes in seismologic studies are analysed and the results of preliminary experimental studies are presented. (laser gyroscopes)

Earthquake monitoring and early warning systems for hazard assessment and mitigation have traditionally been based on seismic instruments. However, for large seismic events it is difficult for traditional seismic instruments to produce accurate and reliable displacements, because of the saturation of broadband seismometers and the problematic integration of strong-motion data. Compared with traditional seismic instruments, GPS can measure arbitrarily large dynamic displacements without saturation, making it particularly valuable in the case of large earthquakes and tsunamis. A GPS relative positioning approach is usually adopted to estimate seismic displacements, since centimeter-level accuracy can be achieved in real time by processing double-differenced carrier-phase observables. However, relative positioning requires a local reference station, which might itself be displaced during a large seismic event, resulting in misleading GPS analysis results. Moreover, the relative/network approach is time-consuming, making the simultaneous real-time analysis of GPS data from hundreds or thousands of ground stations particularly difficult. In recent years, several single-receiver approaches for real-time GPS seismology, which can overcome the reference-station problem of relative positioning, have been successfully developed and applied to GPS seismology. One available method is real-time precise point positioning (PPP), which relies on precise satellite orbit and clock products. However, real-time PPP needs a long (re)convergence period, of about thirty minutes, to resolve integer phase ambiguities and achieve centimeter-level accuracy. In comparison with PPP, Colosimo et al. (2011) proposed a variometric approach to determine the change of position between two adjacent epochs; displacements are then obtained by a single integration of the delta positions. This approach does not suffer from a convergence process, but the single integration from delta positions to
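
The "single integration" step of the variometric idea (after Colosimo et al., 2011) amounts to a cumulative sum of the per-epoch position changes. A minimal sketch with synthetic numbers (not real GPS output):

```python
def integrate_deltas(deltas):
    """Single integration: cumulative sum of per-epoch delta positions (m)."""
    displacement, total = [], 0.0
    for d in deltas:
        total += d
        displacement.append(total)
    return displacement

# Synthetic 1 Hz delta positions for one coordinate component (metres/epoch).
deltas = [0.00, 0.01, 0.03, 0.02, -0.01, 0.00]
print(integrate_deltas(deltas))  # cumulative displacement per epoch
```

In practice any small bias in the delta positions accumulates linearly under this integration, which is why drift removal is an important part of the real method.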

Nuclear approaches to compositional characterization have promising application prospects in a forensic perspective, towards assessment of the nature and origin of seized material. The macro- and micro-physical properties of nuclear materials can be specifically associated with a process or type of nuclear activity. Under the jurisdiction of nuclear analytical chemistry as well as nuclear forensics, thrust areas of scientific endeavour, such as the determination of radioisotopes, isotopic and mass ratios, analysis of impurity contents, and establishing chemical forms/species and physical parameters, provide supporting evidence in forensic investigations. The analytical methods developed for these purposes can be used in international safeguards as well as in nuclear forensics. Nuclear material seized in nuclear trafficking can be identified, and a profile of the material can be created.

This paper introduces the Digital Records Forensics project, a research endeavour located at the University of British Columbia in Canada and aimed at the development of a new science resulting from the integration of digital forensics with diplomatics, archival science, information science and the law of evidence, and of an interdisciplinary graduate degree program, called Digital Records Forensics Studies, directed to professionals working for law enforcement agencies, legal firms, courts, and all kinds of institutions and businesses that require their services. The program anticipates the need for organizations to become "forensically ready," defined by John Tan as "maximizing the ability of an environment to collect credible digital evidence while minimizing the cost of an incident response" (Tan, 2001). The paper argues the need for such a program, describes its nature and content, and proposes ways of delivering it.

Cloud computing is a new computing paradigm that presents fresh research issues in the field of digital forensics. Cloud computing builds upon virtualisation technologies and is distributed in nature. Depending on its implementation, the cloud can...

Forensic odontology is a branch that connects dentistry and the legal profession. One of the members of a forensic investigation team is a dentist. Dentists play an important and significant role in various aspects of the identification of persons in various forensic circumstances. However, several dentists and legal professionals are quite ignorant of this fascinating aspect of forensic odontology. A need was felt to fill this gap. The dental record is a legal document possessed by the dentist, and it contains subjective and objective information about the patient. A PubMed search and a Google search were done for articles highlighting the importance of dental records in the forensic sciences, using the key words "forensic odontology, forensic dentistry, forensic dentists, identification, dental records, and dental chart". A total of 42 articles relevant to the title of the article were found and reviewed. The present article highlights the role of dentists in the forensic sciences, their possible contributions to forensics, and the various aspects of forensic dentistry, thus bridging the gap of knowledge between the legal and dental fraternities.

Forensic psychiatry was officially recognized as a subspecialty by the American Board of Medical Specialties in the 1990s. In 1994, the American Board of Psychiatry and Neurology (ABPN) gave its first written examination to certify forensic psychiatrists. In 1996, the Accreditation Council for Graduate Medical Education (ACGME) began to officially accredit one-year residency experiences in forensic psychiatry, which follow a 4-year residency in general psychiatry. The extra year of training, colloquially known as a fellowship, is required for candidates who wish to receive certification in the subspecialty of forensic psychiatry; since 2001, completion of a year of training in a program accredited by the ACGME has been required for candidates wishing to take the ABPN forensic psychiatry subspecialty examination. With the formal recognition of the subspecialty of forensic psychiatry comes the need to examine special issues of cultural importance which apply specifically to forensic psychiatry training. This paper examines the current literature on cross-cultural issues in forensic psychiatry, sets out several of the societal reasons for the importance of emphasizing those issues in forensic psychiatric training, and discusses how those issues are addressed in the curriculum of one forensic psychiatry fellowship at the Medical College of Wisconsin (MCW). While much has been written about cross-cultural issues in general psychiatry, very little has appeared in the literature on the topic of cross-cultural issues in forensic psychiatry.

Computer forensics involves the investigation of digital sources to acquire evidence that can be used in a court of law. It can also be used to identify and respond to threats to hosts and systems. Accountants use computer forensics to investigate computer crime or misuse, theft of trade secrets, theft or destruction of intellectual property, and fraud. Education of accountants in the use of forensic tools is a goal of the AICPA (American Institute of Certified Public Accountants). Accounting students, however, may not view information technology as vital to their career paths and need motivation to acquire forensic knowledge and skills. This paper presents a curriculum design methodology for teaching graduate accounting students computer forensics. The methodology is tested using the students' perceptions of its success and of their acquisition of forensic knowledge and skills. An important component of the pedagogical approach is the use of an annotated list of over 50 forensic web-based tools.

The aim of digital forensics is to extract information to answer the 5Ws (Why, When, Where, What, and Who) from the data extracted from the evidence. In order to achieve this, most digital forensic processes assume absolute control of the digital evidence. However, in a cloud-environment forensic investigation this is not always possible. Additionally, the unique characteristics of cloud computing create new technical, legal and architectural challenges when conducting a forensic investigation. We propose a hypothetical scenario to uncover and explain the challenges forensic practitioners face during cloud investigations, and we also provide solutions to address these challenges. Our hypothetical case scenario has shown that, in the long run, better live forensic tools, the development of new methods tailored for cloud investigations, and new procedures and standards are indeed needed. Furthermore, we have come to the conclusion that the biggest challenge in forensic investigations is not technical but legal.

A forensic entomological investigation can benefit from a variety of widely practiced molecular genotyping methods. The most commonly used is DNA-based specimen identification. Other applications include the identification of insect gut contents and the characterization of the population genetic structure of a forensically important insect species. The proper application of these procedures demands that the analyst be technically expert. However, one must also be aware of the extensive list of standards and expectations that many legal systems have developed for forensic DNA analysis. We summarize the DNA techniques that are currently used in, or have been proposed for, forensic entomology and review established genetic analyses from other scientific fields that address questions similar to those in forensic entomology. We describe how accepted standards for forensic DNA practice and method validation are likely to apply to insect evidence used in a death or other forensic entomological investigation.

A number of initiatives are underway in the United States in response to the 2009 critique of forensic science by a National Academy of Sciences committee. This article provides a broad review of activities including efforts of the White House National Science and Technology Council Subcommittee on Forensic Science and a partnership between the Department of Justice (DOJ) and the National Institute of Standards and Technology (NIST) to create the National Commission on Forensic Science and the Organization of Scientific Area Committees. These initiatives are seeking to improve policies and practices of forensic science. Efforts to fund research activities and aid technology transition and training in forensic science are also covered. The second portion of the article reviews standards in place or in development around the world for forensic DNA. Documentary standards are used to help define written procedures to perform testing. Physical standards serve as reference materials for calibration and traceability purposes when testing is performed. Both documentary and physical standards enable reliable data comparison, and standard data formats and common markers or testing regions are crucial for effective data sharing. Core DNA markers provide a common framework and currency for constructing DNA databases with compatible data. Recent developments in expanding core DNA markers in Europe and the United States are discussed.

Bloodstains at crime scenes are among the most important types of evidence for forensic investigators. They can be used for DNA-profiling for verifying the suspect's identity or for pattern analysis in order to reconstruct the crime. However, until now, using bloodstains to determine the time

The technological capability of mobile devices, in particular smartphones, makes them valuable to the criminal community as data terminals in the facilitation of organised crime or terrorism. Effective targeting of these devices from criminal- and security-intelligence perspectives, and subsequent detailed forensic examination of the targeted device, will significantly enhance the evidence available to the law enforcement community. When phone devices are involved in crimes, forensic examiners require tools that allow the proper retrieval and prompt examination of information present on these devices. Smartphones that comply with Global System for Mobile Communication (GSM) standards maintain their identity and the user's personal information on the Subscriber Identity Module (SIM). Besides SIM cards, a substantial amount of information is stored in the device's internal memory and on external memory modules. The aim of this paper is to give an overview of the forensic software tools currently available for investigating mobile devices and to point out current weaknesses in this process.

This review gives an overview of developments in the field of microchip analysis for clinical diagnostic and forensic applications. The approach chosen to review the literature is different from that in most microchip reviews to date, in that the information is presented in terms of analytes tested

Signature extraction is a key part of forensic log analysis. It involves recognizing patterns in log lines such that log lines that originated from the same line of code are grouped together. A log signature consists of immutable parts and mutable parts. The immutable parts define the signature, and
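
The immutable/mutable split described above can be illustrated with a toy Python sketch: mask out the mutable tokens (here, assumed to be numbers and hex IDs, a simplification of real signature-extraction algorithms) and group log lines by the remaining template.

```python
import re

# Mutable parts of a log line, modeled here as hex IDs and decimal numbers.
MUTABLE = re.compile(r"0x[0-9a-f]+|\d+")

def signature(line: str) -> str:
    """Mask mutable tokens, keeping the immutable parts that define the signature."""
    return MUTABLE.sub("<*>", line)

logs = [
    "connection from 10.0.0.1 port 22",
    "connection from 10.0.0.7 port 443",
    "disk error at 0x1f3a",
]
groups = {}
for line in logs:
    groups.setdefault(signature(line), []).append(line)
print(sorted(groups))
# ['connection from <*>.<*>.<*>.<*> port <*>', 'disk error at <*>']
```

The two "connection from" lines collapse onto one signature, as if they originated from the same line of code; real systems infer the mutable positions from the data rather than from a fixed regex.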

Forensic image recognition tools are used by law enforcement agencies all over the world to automatically detect illegal images on confiscated equipment. This detection is commonly done with the help of a strictly confidential database consisting of hash values of known illegal images. To detect and
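
The detection scheme described above can be sketched as a simple set lookup. This is a minimal illustration with placeholder data, modeling the confidential database as a set of SHA-256 digests; it is not any agency's actual tooling.

```python
import hashlib

def sha256(data: bytes) -> str:
    """Hex digest of a file's contents."""
    return hashlib.sha256(data).hexdigest()

# The confidential database: digests of known illegal images (placeholder bytes).
known_illegal = {sha256(b"known-image-bytes")}

def flag_files(files: dict) -> list:
    """Return names of files whose digest appears in the hash database."""
    return [name for name, data in files.items() if sha256(data) in known_illegal]

seized = {"a.jpg": b"known-image-bytes", "b.jpg": b"harmless"}
print(flag_files(seized))  # ['a.jpg']
```

Note that an exact cryptographic hash fails on even trivially altered images, which is why robust or perceptual hashing is an active topic in this area.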

This article presents an experiment designed to provide students, in a classroom laboratory setting, a hands-on demonstration of the steps used in DNA forensic analysis by performing DNA extraction, DNA fingerprinting, and statistical analysis of the data. This experiment demonstrates how DNA fingerprinting is performed and how long it takes. It…

acquisition should take place to ensure forensic soundness. At the time of writing, this Liforac model is the first document of this nature that could be found for analysis. It serves as a foundation for future models that can refine the current processes....

Face recognition is a challenging problem for surveillance view images commonly encountered in a forensic face recognition case. One approach to deal with a non-frontal test image is to synthesize the corresponding frontal view image and compare it with frontal view reference images. However, it is

An open forensic rehabilitation ward provides an important link bridging the gap between secure and community provisions. This paper provides an audit of such a service by examining the records of an open forensic rehabilitation ward over a five-year period from 1 June 2000 until 31 May 2005. During the audit period there were 51 admissions, involving 45 different patients, and 50 discharges. The majority of the patients came from secure unit facilities, acute psychiatric wards or home. Thirty-nine patients were discharged either into hostels (66%) or their home (12%). The majority of patients (80%) had on admission a primary diagnosis of either schizophrenia or schizoaffective disorder. Most had an extensive forensic history. The focus of their admission was to assess and treat their mental illness/disorder and offending behaviour and this was successful as the majority of patients were transferred to a community placement after a mean of 15 months. It is essential that there is a well-integrated care pathway for forensic patients, involving constructive liaison with generic services and a well-structured treatment programme which integrates the key principles of the 'recovery model' approach to care.

Presents an activity for students to determine the sex and age of an individual from a collection of bones. Simulates some of the actual procedures conducted in a forensic anthropologist's lab, examining and identifying bones through a series of lab activities. (Author/ASK)

PIXE measurements were performed on various calcareous materials, including identified bone residues, human cremains, and samples of disputed origin. In a forensic application, the elemental analysis suggests that a sample purportedly classified as human cremains can tentatively be identified as a mixture of sandy soil and dolomitic limestone.

Wearables are an increasingly big item in mobile forensics, in large part due to the ever increasing popularity of social media. A device that falls into this category is Google Glass. A big part of the Google Glass interface is dedicated to social media functions. A side-effect of these functions

Radiation detection is necessary for isotope identification and assay in nuclear forensic applications. The principles of operation of gas proportional counters, scintillation counters, germanium and silicon semiconductor counters will be presented. Methods for calibration and potential pitfalls in isotope quantification will be described.

File carvers are forensic software tools used to recover data from storage devices in order to find evidence. Every legal case requires different trade-offs between precision and runtime performance. The resulting required changes to the software tools are performed manually and under

Presently, lawyers, law enforcement agencies, and judges in courts use speech and other biometric features to recognize suspects. In general, speaker recognition is used for discriminating between people based on their voices. The process of determining whether a suspected speaker is the source of a trace is called forensic speaker recognition. In such applications, the voice samples are most probably noisy, the recording sessions might mismatch each other, the sessions might not contain sufficient recording for recognition purposes, and the suspect's voice may be recorded through a mobile channel. The identification of a person through his voice within a forensic-quality context is challenging. In this paper, we propose a method for forensic speaker recognition for the Arabic language; the King Saud University Arabic Speech Database is used for obtaining experimental results. The advantage of this database is that each speaker's voice is recorded in both clean and noisy environments, through a microphone and a mobile channel. This diversity facilitates its usage in forensic experimentation. Mel-frequency cepstral coefficients are used for feature extraction, and the Gaussian mixture model-universal background model (GMM-UBM) is used for speaker modeling. Our approach has shown low equal error rates (EER) in noisy environments and with very short test samples.
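
The scoring idea behind GMM-UBM verification is a log-likelihood ratio between a speaker model and a universal background model. A deliberately minimal sketch, with single 1-D Gaussians standing in for the mixtures and all numbers synthetic:

```python
import math

def loglik(x, mean, var):
    """Log density of a 1-D Gaussian at x."""
    return -0.5 * (math.log(2 * math.pi * var) + (x - mean) ** 2 / var)

def llr(frames, spk, ubm):
    """Average per-frame log-likelihood ratio: speaker model vs. UBM."""
    return sum(loglik(x, *spk) - loglik(x, *ubm) for x in frames) / len(frames)

spk_model, ubm_model = (1.0, 0.5), (0.0, 1.0)   # (mean, variance), synthetic
genuine = [0.9, 1.1, 1.0]    # frames resembling the claimed speaker
impostor = [-0.2, 0.1, 0.0]  # frames resembling the background population
print(llr(genuine, spk_model, ubm_model) > llr(impostor, spk_model, ubm_model))  # True
```

In a real system the frames would be MFCC vectors and both models full Gaussian mixtures, with the UBM trained on a large background population and the speaker model adapted from it; the decision still reduces to thresholding this ratio.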

-jurisdictions. As such, service providers hosting the data that may be required for digital forensic investigation may be reluctant to comply with foreign law enforcement agencies. Even if they comply, this may be a costly and time-consuming exercise, given the amount...

In this article, the quality of life (QoL) of mentally disordered offenders was investigated. The data of 44 forensic psychiatric inpatients were analyzed using the Lancashire Quality of Life Profile (LQoLP), Rehabilitation Evaluation Hall and Baker (REHAB), and the Psychopathy Checklist-Revised

In many economically struggling societies, forensic psychiatry is still in its initial developmental stages, and forensic patients thus pose an ongoing challenge for the healthcare and juridical systems. In this article we present the various issues and problems that arose when establishing the first forensic psychiatric institute in Kosovo, a country whose population has consistently been reported as suffering from high psychiatric morbidity due to long-lasting traumatic experiences during the war of 1999. The implementation of a new forensic psychiatric institute in the developing mental healthcare system of Kosovo, still characterized by considerable shortages, required substantial effort on various levels. On the policy and financial level, it was made possible by the clear intent and coordinated commitment of all responsible national stakeholders and authorities, such as the Ministries of Health and Justice, and by the financial contribution of the European Commission. Most decisive for the success of the project was capacity building in human resources, i.e. the recruitment and training of motivated staff. Training included essential clinical and theoretical issues as well as clearly defined standard operating procedures, guidelines and checklists to aid daily routine work and the management of challenging situations.

When two biometric specimens are compared using an automatic biometric recognition system, a similarity metric called a "score" can be computed. In forensics, one of the biometric specimens is from an unknown source, for example from CCTV footage or a fingermark found at a crime scene, and the other

This paper presents a real case of digital forensic analysis in organizational fraud auditing process investigated using two different forensic tools, namely Tableau TD3 Touch Screen Forensic Imager and Access Data FTK Imager. Fraud auditing is more of a mindset than a methodology and has different approaches from financial auditing. Fraud auditors are mostly focused on exceptions, accounting irregularities, and patterns of their conduct. Financial auditors place special emphasis on the audit trail and material misstatements. A fraud case investigation of non-cash misappropriations committed by an employee, the warehouseman, will be presented herein in order to highlight the usefulness of fraud auditing, which can reveal many forms of financial crime and can be used in both private and public sector companies. Due to the computerized accounting environment, fraud investigation requires a combination of auditing, computer crime and digital forensic investigation skills, which can be achieved through joint efforts and cooperation of both digital investigator and fraud auditor as proposed herein.

In the years 1999-2000, two regional seismic refraction lines were acquired in close cooperation with German partners from the University of Karlsruhe. One of these lines is Vrancea 2001, 420 km in length, almost half of which was recorded in the Transylvanian Basin. The structure of the crust along the seismic line revealed a very complicated crustal architecture, beginning with the Eastern Carpathians and continuing into the Transylvanian Basin as far as Medias. As a result of the development of the National Seismic Network over the last ten years, more than 100 permanent broadband stations are now continuously operating in Romania. Complementary to this national dataset, maintained and developed at the National Institute for Earth Physics, new data have emerged from the temporary seismological networks established during joint projects with European partners in recent decades. The data gathered so far are valuable both for seismological purposes and for crustal structure studies, especially in the western part of the country, where such data were sparse until now. Between 2009 and 2011, a new reference model for the Earth's crust and mantle of the European Plate was defined through the NERIES project from existing data and models. The database gathered from different kinds of measurements in the Transylvanian Basin and the eastern Pannonian Basin was included in this NERIES model, and an improved and upgraded model of the Earth's crust emerged for the western part of Romania. Although the dataset has its origins in several periods over the last 50 years, the results are homogeneous, and they improve and strengthen our image of the depths of the principal boundaries in the crust. In the last chapter, two maps of these boundaries are constructed, one for the mid-crustal boundary and one for the Moho. They were built considering all the point information available from different sources in active seismics and seismology, which are introduced in the general maps from the NERIES project for

Exploration seismology deals with highly coherent wave fields generated by repeatable controlled sources and recorded by dense receiver arrays whose geometry is tailored to back-scattered energy normally neglected in earthquake seismology. Owing to these favorable conditions, stacking and coherence analysis are routinely employed to suppress incoherent noise and regularize the data, thereby contributing strongly to the success of subsequent processing steps, including migration for the imaging of back-scattering interfaces and waveform tomography for the inversion of velocity structure. Attempts have been made to exploit wave-field coherence on the length scales of passive-source seismology, e.g. for imaging transition-zone discontinuities or the core-mantle boundary using reflected precursors. The results, however, are often degraded by sparse station coverage and the interference of faint back-scattered phases with transmitted phases. USArray sampled wave fields generated by earthquake sources at an unprecedented density, and similar array deployments are ongoing or planned in Alaska, the Alps and Canada. This makes the local coherence of earthquake data an increasingly valuable resource to exploit. Building on experience from controlled-source surveys, we aim to extend the well-established concept of beam-forming to the richer toolbox nowadays used in seismic exploration. We suggest adapted strategies for local data coherence analysis, in which summation is performed with operators that extract the local slope and curvature of wave fronts emerging at the receiver array. Besides estimating wave-front properties, we demonstrate that the inherent data summation can also be used to generate virtual station responses at intermediate locations where no actual deployment was performed. Because stacking acts as a directional filter, interfering coherent wave fields can be efficiently separated from one another by means of coherent subtraction. We
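The basic operation the abstract builds on, delay-and-sum beam-forming, can be sketched in a few lines. This is a minimal illustration, not the authors' adapted operators: a plane-wave moveout is removed for each trial slowness, the traces are stacked, and the beam power peaks where the stack is coherent. All names and the synthetic setup are hypothetical.

```python
import numpy as np

def delay_and_sum(traces, dt, offsets, slowness):
    """Align traces for a trial horizontal slowness and stack them.

    traces  : (n_stations, n_samples) array
    dt      : sample interval [s]
    offsets : station offsets along the array [km]
    slowness: trial slowness [s/km]
    """
    beam = np.zeros(traces.shape[1])
    for trace, x in zip(traces, offsets):
        shift = int(round(slowness * x / dt))  # plane-wave moveout in samples
        beam += np.roll(trace, -shift)         # undo the moveout, then stack
    return beam / len(offsets)

# Synthetic plane wave: a Gaussian pulse crossing an 8-station line
# with true slowness 0.2 s/km.
dt, n_samples = 0.01, 500
t = np.arange(n_samples) * dt
offsets = np.arange(8, dtype=float)  # station positions 0..7 km
true_slowness = 0.2
traces = np.array([np.exp(-((t - 1.0 - true_slowness * x) / 0.05) ** 2)
                   for x in offsets])

# Scan trial slownesses; beam power is maximal where the pulses
# stack coherently, i.e. at the true slowness (stacking acts as
# the directional filter mentioned above).
slownesses = np.linspace(0.0, 0.4, 41)
power = np.array([np.sum(delay_and_sum(traces, dt, offsets, s) ** 2)
                  for s in slownesses])
best = slownesses[np.argmax(power)]
```

In this toy setup `best` recovers the true slowness of 0.2 s/km; the curvature-aware operators proposed in the abstract generalize the straight-line moveout used here.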

In parallel with cooperative developments in seismology during the past 25 years, there have been phenomenal advances in mineral/rock physics, making laboratory-based interpretation of seismological models increasingly useful. However, assimilating diverse experimental data into a physically sound framework for seismological application is not without its challenges, as demonstrated by two examples. In the first example, that of equation-of-state and elasticity data, an appropriate, thermodynamically consistent framework involves a finite-strain expansion of the Helmholtz free energy incorporating the Debye approximation to the lattice vibrational energy, as advocated by Stixrude and Lithgow-Bertelloni. Within this framework, pressure, specific heat and entropy, thermal expansion, and the elastic constants and their adiabatic and isothermal pressure derivatives are all calculable, without further approximation, in an internally consistent manner. The opportunities and challenges of assimilating a wide range of sometimes marginally incompatible experimental data into a single model of this type are demonstrated with reference to MgO, unquestionably the most thoroughly studied mantle mineral. A neighbourhood-algorithm inversion has identified a broadly satisfactory model, but uncertainties in key parameters, associated particularly with pressure calibration, remain large enough to preclude definitive conclusions concerning lower-mantle chemical composition and departures from adiabaticity. The second example is the much less complete dataset concerning seismic-wave dispersion and attenuation emerging from low-frequency forced-oscillation experiments. Significant progress has been made during the past decade towards an understanding of high-temperature, micro-strain viscoelastic relaxation in upper-mantle materials, especially as regards the roles of oscillation period, temperature, grain size and melt fraction. However, the influence of other potentially important
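The structure of such a framework can be sketched as follows (the notation here is a generic quasi-harmonic Debye formulation, not taken from the model the abstract describes). The Helmholtz free energy splits into a cold compression part, expanded in Eulerian finite strain, and a Debye vibrational part:

```latex
\begin{align}
F(V,T) &= F_{\mathrm{cold}}(V) + F_{\mathrm{vib}}(V,T),
  \qquad f = \tfrac{1}{2}\!\left[(V_0/V)^{2/3} - 1\right],\\
F_{\mathrm{vib}} &= \tfrac{9}{8}\,nR\,\theta_D(V)
  + 3nRT\,\ln\!\left(1 - e^{-\theta_D/T}\right)
  - nRT\,D(\theta_D/T),\\
D(x) &= \frac{3}{x^3}\int_0^x \frac{t^3}{e^t - 1}\,dt ,
\end{align}
```

where $n$ is the number of atoms per formula unit and $\theta_D(V)$ the volume-dependent Debye temperature. All the quantities listed in the abstract then follow by differentiation of the single potential $F$, e.g. $P = -(\partial F/\partial V)_T$, $S = -(\partial F/\partial T)_V$, $C_V = -T(\partial^2 F/\partial T^2)_V$, which is what guarantees the internal consistency mentioned above.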

For this invited contribution, I was asked to give an overview of the application of helio- and astero-seismic techniques to the study of the interiors of giant planets, and specifically to present the recent observations of Neptune by Kepler K2. Seismology applied to giant planets could drastically change our understanding of their deep interiors, as has happened for the Earth, the Sun, and many main-sequence and evolved stars. The study of giant planets' composition is important for understanding both the mechanisms enabling their formation and the origins of planetary systems, in particular our own. Unfortunately, its determination is complicated by the fact that their interiors are thought not to be homogeneous, so that spectroscopic determinations of atmospheric abundances are probably not representative of the planet as a whole. Instead, the determination of their composition and structure must rely on indirect measurements and interior models. Giant planets are mostly fluid and convective, which makes their seismology much closer to that of solar-like stars than to that of terrestrial planets. Hence, helioseismology techniques transfer naturally to giant planets. In addition, two alternative methods can be used: photometry of the solar light reflected by planetary atmospheres, and ring seismology in the specific case of Saturn. The current decade has been promising thanks to the detection of Jupiter's acoustic oscillations with the ground-based imaging spectrometer SYMPA and the indirect detection of Saturn's f-modes in its rings by the NASA Cassini orbiter. This has motivated new projects for ground-based and space-borne instruments that are under development. The K2 observations represented the first opportunity to search for planetary oscillations with visible photometry. Despite the excellent quality of the K2 data, the noise level of the power spectrum of the light curve was not low enough to detect Neptune's oscillations. The main results from the