An offender who has recently handled bulk explosives would be expected to deposit latent fingermarks contaminated with explosive residues. However, fingermark detection techniques must be applied for these fingermarks to be detected and recorded, and little information is available on how routine fingermark detection methods affect the subsequent recovery and analysis of any explosive residues that may be present. If an identifiable fingermark is obtained and found to be contaminated with a particular explosive, that may be crucial evidence in a criminal investigation (including acts of terrorism involving improvised explosive devices). The principal aims of this project were to investigate: (i) the typical quantities of explosive material deposited in fingermarks by someone who has recently handled bulk explosives; and (ii) the effects of routine fingermark detection methods on the subsequent recovery and analysis of explosive residues in such fingermarks. Four common substrates were studied: paper, glass, plastic (polyethylene plastic bags), and metal (aluminium foil). The target explosive compounds were 2,4,6-trinitrotoluene (TNT), pentaerythritol tetranitrate (PETN), and hexahydro-1,3,5-trinitro-1,3,5-triazine (RDX), as well as chlorate and nitrate ions. Recommendations are provided on the application of fingermark detection methods to surfaces that may bear explosive residues.

Hand sanitizers have seen a rapid increase in popularity amongst the general population and this increased use has led to the belief that hand sanitizers may have an effect on subsequent fingermark detection. Based on this hypothesis, three alcoholic and two non-alcoholic hand sanitizers were evaluated to determine the effect they had on the detection of fingermarks deposited after their use. The following fingermark detection methods were applied: 1,2-indanedione-zinc, ninhydrin, physical developer (porous substrate); and cyanoacrylate, rhodamine 6G, magnetic powder (non-porous substrate). Comparison between hand sanitized fingermarks and non-hand sanitized fingermarks showed that the alcohol-based hand sanitizers did not result in any visible differences in fingermark quality. The non-alcoholic hand sanitizers, however, improved the quality of fingermarks developed with 1,2-indanedione-zinc and ninhydrin, and marginally improved those developed with magnetic powder. Different parameters, including time since hand sanitizer application prior to fingermark deposition and age of deposited mark, were tested to determine the longevity of increased development quality. The non-alcoholic hand sanitized marks showed no decrease in quality when aged for up to two weeks. The time since sanitizer application was determined to be an important factor that affected the quality of non-alcoholic hand sanitized fingermarks. It was hypothesized that the active ingredient in non-alcoholic hand sanitizers, benzalkonium chloride, is responsible for the increase in fingermark development quality observed with amino acid reagents, while the increased moisture content present on the ridges resulted in better powdered fingermarks.

Despite the proven capabilities of Matrix Assisted Laser Desorption Ionisation Mass Spectrometry (MALDI MS) in laboratory settings, research is still needed to integrate this technique into current forensic fingerprinting practice. Optimised protocols enabling the compatible application of MALDI to developed fingermarks will allow additional intelligence to be gathered around a suspect’s lifestyle and activities prior to the deposition of their fingermarks while committing a crime. The detection and mapping of illicit drugs and metabolites in latent fingermarks would provide intelligence that is beneficial for both police investigations and court cases. This study investigated MALDI MS detection and mapping capabilities for a large range of drugs of abuse and their metabolites in fingermarks; the detection and mapping of a mixture of these drugs in marks, with and without prior development with cyanoacrylate fuming or Vacuum Metal Deposition, was also examined. Our findings indicate the versatility of MALDI technology and its ability to retrieve chemical intelligence either by detecting the compounds investigated or by using their ion signals to reconstruct 2D maps of fingermark ridge details.

Over the past decade, the use of nanotechnology for fingermark detection has attracted considerable attention. A substantial number of nanoparticle types have thus been studied and applied with varying success. However, despite all efforts, few publications present clear supporting evidence of their superiority over standard, commonly used techniques. This paper focuses on a rarely studied type of nanoparticle that combines all the properties desired for effective fingermark detection: silicon oxide. These nanoparticles offer optical and surface properties that can be tuned to provide optimal detection. This study explores their potential as a new method for fingermark detection. Detection conditions, outer functionalisations and optical properties were optimised, and a first evaluation of the technique is presented. Dye-doped silicon oxide nanoparticles were assessed against a one-step luminescent cyanoacrylate. Both techniques were compared on natural fingermarks from three donors collected on four different non-porous substrates. On average, the two techniques performed similarly, but silicon oxide detected marks with better homogeneity and was less affected by donor inter-variability. Although the technique remains to be further optimised, silicon oxide nanoparticles already show great promise for effective fingermark detection.

This paper describes the application of a luminescent lipid stain, nile red, for the development of latent fingermarks on porous surfaces. An optimised formulation is presented that provides rapid development of latent fingermarks on porous surfaces that are or have been wet. A comparison with physical developer (PD), the method of choice to enhance such fingermarks, indicated that nile red was a simpler and more stable technique for the development of fingermarks. The nile red formulation showed similar performance to PD across a range of substrates and ageing conditions, although PD still showed greater sensitivity on five-year-old examination booklets used in a pseudo-operational study. The pseudo-operational trial also indicated that nile red consistently developed different fingermarks to those enhanced by PD, suggesting that it preferentially targets a different fraction of the latent fingermark deposit. Significantly, the compatibility of nile red in a detection sequence with indanedione-zinc, ninhydrin and PD is reported.

The effectiveness of latent fingerprint development techniques is heavily influenced by the physical and chemical properties of the deposition surface. The use of powder suspensions is increasing for the development of prints on a range of surfaces. We demonstrate that carbon powder suspension development on polymers is detrimentally affected by the presence of the common white pigment titanium dioxide. Scanning electron microscopy shows that patches of the compound are clearly associated with increased levels of powder adhesion. Substrates with non-localized titanium dioxide content also exhibit increased carbon powder staining on a surface-wide basis. Secondary ion mass spectrometry and complementary techniques demonstrate the importance of pigment levels within the top 30 nm. The association is independent of fingermark deposition and may be related to surface energy variation. The detrimental effect of the pigment is not observed with small-particle reagent (MoS2 SPR) or cyanoacrylate (superglue) fuming techniques, which exploit different development mechanisms.

Infrared-to-visible upconversion fluorescent nanoparticles of Gd2O3 codoped with Ho3+/Yb3+ ions were synthesized via a thermal decomposition process. X-ray diffraction analysis of the as-synthesized nanoparticles and of a sample annealed at 1000 °C showed the body-centered cubic phase of Gd2O3. The synthesized phosphor showed intense green emission upon 980-nm excitation. High-contrast latent fingermarks on some difficult semi-porous and non-porous surfaces were developed under 980-nm diode laser excitation using powder dusting and colloidal solution spraying techniques, and the results were compared with a commercial green luminescent fingermark powder. The latent fingermarks were developed on transparent (biological glass slides), single-color (aluminum foil) and multicolor (plywood, plastic bottle and book cover page) background surfaces. The present study shows that upconversion-based latent fingermark detection using the Gd2O3:Ho3+/Yb3+ phosphor material is preferable to conventional powders and has potential for practical applications in forensic science.

The most commonly found fingermarks at crime scenes are latent and, thus, an efficient method for detecting latent fingermarks is very important. However, traditional developing techniques have drawbacks such as low detection sensitivity, high background interference, complicated operation, and high toxicity. To tackle this challenge, we employed fluorescent NaYF4:Yb,Er upconversion nanoparticles (UCNPs), which can fluoresce visible light when excited by 980 nm human-safe near-infrared light, to stain the latent fingermarks on various substrate surfaces. The UCNPs were successfully used as a novel fluorescent label for the detection of latent fingermarks with high sensitivity, low background, high efficiency, and low toxicity on various substrates including non-infiltrating materials (glass, marble, aluminum alloy sheets, stainless steel sheets, aluminum foils, and plastic cards), semi-infiltrating materials (floor leathers, ceramic tiles, wood floor, and painted wood), and infiltrating materials such as various types of papers. This work shows that UCNPs are a versatile fluorescent label for the facile detection of fingermarks on virtually any material, enabling their practical applications in forensic sciences.

A new and original reagent based on the use of highly fluorescent cadmium telluride (CdTe) quantum dots (QDs) in aqueous solution is proposed to detect weak fingermarks in blood on non-porous surfaces. To assess the efficiency of this approach, comparisons were performed with one of the most efficient blood reagents on non-porous surfaces, Acid Yellow 7 (AY7). To this end, four non-porous surfaces were studied: glass, transparent polypropylene, black polyethylene, and aluminium foil. To evaluate the sensitivity of both reagents, sets of depleted fingermarks were prepared using the same finger, initially soaked with blood, which was then successively applied to the same surface without recharging it with blood or latent secretions. The successive marks were then cut in halves and the halves treated separately with each reagent. The results showed that QDs were as efficient as AY7 on glass, polyethylene and polypropylene surfaces, and superior to AY7 on aluminium. The use of QDs in new, sensitive and highly efficient latent and blood mark detection techniques appears highly promising. Health and safety issues related to the use of cadmium are also discussed; it is suggested that applying QDs in aqueous solution (rather than as a dry dusting powder) considerably lowers the toxicity risks.

The ability to link criminal activity and identity using validated analytical approaches can be of great value to forensic scientists. Herein, the factors affecting the recovery and detection of inorganic and organic energetic material residues within chemically or physically enhanced fingermarks on paper and glass substrates are presented using micro-bore anion exchange chromatography with suppressed conductivity detection. Fingermarks on both surfaces were enhanced using aluminium powder or ninhydrin after spiking with model test mixtures or through contact with black-powder substitutes. A quantitative study of the effects of environmental/method interferences, the sweat matrix, the surface and the enhancement technique on the relative anion recovery of forensically relevant species is presented. It is shown that the analytical method could detect target analytes at the nanogram level even within excesses of enhancement reagents and their reaction products when using solid phase extraction and/or microfiltration. To our knowledge, this work demonstrates for the first time that ion chromatography can detect anions in energetic materials within fingermarks on two very different surfaces, after operational enhancement techniques commonly used by forensic scientists and police have been applied.

In the examination of bloodstained fingermarks, discussion often occurs around whether to prioritise the fingerprint evidence or the biological evidence. Collecting a sample for genetic profiling may result in the loss of ridge detail that could have been used for fingerprint comparison; conversely, fingermark enhancement and recovery methods, along with sample collection methods, could compromise downstream genetic analysis. Previous forensic casework has highlighted circumstances where, after enhancement had been performed, it would have been extremely valuable to both identify the body fluid and generate a DNA profile from the same sample. We enhanced depletion series of fingermarks made in blood using single treatments consisting of aqueous amido black, methanol-based amido black, acid yellow and leucocrystal violet, and exposure to long-wave UV light. We then extracted the DNA and RNA for profiling to assess the recovery and detection of genetic material from the enhanced fingermarks. We have shown that genetic profiling of bloodstained fingermarks can be successful after chemical enhancement; however, it may still be necessary to prioritise evidence types in certain circumstances. From our results it appears that, even with visible bloodstained fingermarks, leucocrystal violet can reduce the effectiveness of subsequent messenger RNA profiling. Aqueous amido black and acid yellow also have adverse effects on messenger RNA profiling of depleted fingermarks with low levels of cellular material. These results help with forensic decision-making by expanding knowledge of the extent of the detrimental effects of blood-enhancement reagents on both DNA profiling and body fluid identification using messenger RNA profiling.

Criminal investigations would be considerably improved if DNA profiles could be routinely generated from single fingermarks. Here we report a direct DNA profiling method that generated interpretable profiles from 71% of 170 fingermarks. The data are based on fingermarks from all 5 digits of 34 individuals. DNA was obtained from the fingermarks using a swab moistened with Triton-X, and the fibers were added directly to one of two commercial DNA profiling kits. All profiles were obtained without increasing the number of amplification cycles; therefore, our method is ideally suited for adoption by the forensic science community. We illustrate the use of the technique in a criminal case in which a DNA profile was generated from a fingermark on tape that was wrapped around a drug seizure. Our direct DNA profiling approach is rapid and able to generate profiles from touched items when current forensic practices have little chance of success.

A new, highly fluorescent dye was synthesised using oleylamine combined with a perylene dianhydride compound. The new dye was characterised by 1H NMR, UV-vis spectroscopy and fluorescence spectroscopy, as well as quantum yield. The dye was adsorbed onto titanium dioxide nanoparticles for use as a fingerprint detection powder. The new fluorescent powder was applied to latent fingermarks deposited onto different non-porous surfaces and compared with commercial fluorescent powders. The powder exhibits strong fluorescence at 650-700 nm under excitation at 505 nm. On glass surfaces, the new powder gave images showing tertiary-level detail of the fingermark ridges with almost no background development. Compared with current magnetic fluorescent powders, the new powder was slightly weaker in fluorescence intensity but produced significantly less background development, resulting in good contrast between the fingermark and the substrate.

Distortions of the fingermark topography are usually considered when comparing latent and exemplar fingerprints. These alterations are characterized as caused by an extrinsic action, which affects entire areas of the deposition and alters the overall flow of a series of contiguous ridges. Here we introduce a novel visual phenomenon that does not follow these principles, named fingermark ridge drift. An experiment was designed that included variables such as type of secretion (eccrine and sebaceous), substrate (glass and polystyrene), and degrees of exposure to natural light (darkness, shade, and direct light) indoors. Fingermarks were sequentially visualized with titanium dioxide powder, photographed and analyzed. The comparison between fresh and aged depositions revealed that under certain environmental conditions an individual ridge could randomly change its original position regardless of its unaltered adjacent ridges. The causes of the drift phenomenon are not well understood. We believe it is exclusively associated with intrinsic natural aging processes of latent fingermarks. This discovery will help explain the detection of certain dissimilarities at the minutiae/ridge level; determine more accurate "hits"; identify potentially erroneous corresponding points; and rethink identification protocols, especially the criterion of "no single minutiae discrepancy" for a positive identification.

A preliminary study reveals that finely divided cuprorivaite powder may be used to efficiently develop and subsequently image latent fingermarks across a range of highly patterned, coloured non-porous and semi-porous substrates using near-infrared illumination and imaging. Problematic multi-coloured backgrounds provide very little interference under the illumination conditions used and the fluorescence invoked when using this material. This is the first reported example of a NIR-NIR fluorophore for use in latent fingermark visualisation, offering potential for application both at the scene and in the laboratory.

The chemical composition of a fingermark potentially holds a wealth of information about the fingermark donor, which can be extracted by immunolabeling. Immunolabeling can be used to detect specific components in fingermarks; however, to be applicable in the forensic field, it should be compatible with commonly used fingerprint visualization techniques. In this study, the compatibility of immunolabeling with two different fingerprint visualization techniques, magnetic powdering and ninhydrin staining, was investigated on fingermarks deposited on glass and on nitrocellulose membranes. With dermcidin as antigen of interest, immunolabeling was performed successfully on all developed fingermarks. We can conclude that immunolabeling is compatible with magnetic powdering and ninhydrin staining, which can be of great forensic value.

This paper describes a method for lifting cyanoacrylate (CNA)-developed latent fingermarks from a glass surface and the detection of five drugs in lifted marks from fingers that had been in contact with the drugs, using Surface Assisted Laser Desorption Ionisation Time of Flight Mass Spectrometry (SALDI-TOF-MS) or Matrix Assisted Laser Desorption Ionisation TOF-MS (MALDI-TOF-MS). Two drugs of abuse (cocaine and methadone) and three therapeutic drugs (aspirin, paracetamol and caffeine) were used as contact residues. Latent fingermarks spiked with the drugs were subjected to CNA fuming followed by dusting with ARRO SupraNano™ MS black magnetic powder (SALDI-TOF-MS) or 2,5-dihydroxybenzoic acid (DHB) (MALDI-TOF-MS). The dusted mark was then exposed to solvent vapour before lifting with a commercial fingerprint lifting tape following established procedures. The presence of the drugs was then confirmed by direct analysis on the tape, without further processing, using SALDI- or MALDI-TOF-MS. The black magnetic fingerprint powder provided visual enhancement of the CNA-fingermark, while no visual enhancement was observed for marks dusted with DHB powder. Similar [M + H]+ peaks for all the drug analytes were observed for both methods, along with some sodium and potassium adducts and some major fragment ions for SALDI-MS, but the SALDI signals were generally more intense. Simple exposure of the CNA-developed marks to acetone vapour enabled their effective transfer onto the tape, which was crucial for subsequent MS detection of the analytes.

The ability of two mass spectrometric methods, surface-assisted laser desorption/ionization-time of flight-mass spectrometry (SALDI-TOF-MS) and direct analysis in real time (DART-MS), to detect the presence of seven common explosives (six nitro-organic- and one peroxide-type) in spiked latent fingermarks has been examined. It was found that each explosive could be detected with nanogram sensitivity for marks resulting from direct finger contact with a glass probe by DART-MS or onto stainless steel target plates using SALDI-TOF-MS for marks pre-dusted with one type of commercial black magnetic powder. These explosives also could be detected in latent marks lifted from six common surfaces (paper, plastic bag, metal drinks can, wood laminate, adhesive tape and white ceramic tile) whereas no explosive could be detected in equivalent pre-dusted marks on the surface of a commercial lifting tape by the DART-MS method due to high background interference from the tape material. The presence of TNT and Tetryl could be detected in pre-dusted latent fingermarks on a commercial lifting tape for up to 29 days sealed and stored under ambient conditions.

Chemical analysis of latent fingermarks, "touch chemistry," has the potential of providing intelligence or forensically relevant information. Matrix-assisted laser desorption ionization/time-of-flight mass spectrometry (MALDI/TOF MS) was used as an analytical platform for obtaining mass spectra and chemical images of target drugs and explosives in fingermark residues following conventional fingerprint development methods and MALDI matrix processing. There were two main purposes of this research: (1) to develop effective laboratory methods for detecting drugs and explosives in fingermark residues and (2) to determine the feasibility of detecting drugs and explosives after casual contact with pills, powders, and residues. Further, synthetic latent print reference pads were evaluated as mimics of natural fingermark residue to determine if the pads could be used for method development and quality control. The results suggest that artificial amino acid and sebaceous oil residue pads are not suitable to adequately simulate natural fingermark chemistry for MALDI/TOF MS analysis. However, the pads were useful for designing experiments and setting instrumental parameters. Based on the natural fingermark residue experiments, handling whole or broken pills did not transfer sufficient quantities of drugs to allow for definitive detection. Transferring drugs or explosives in the form of powders and residues was successful for preparing analytes for detection after contact with fingers and deposition of fingermark residue. One drawback of handling powders was that the analyte particles were easily spread beyond the original fingermark during development; analyte particles were confined to the original fingermark when using transfer residues. MALDI/TOF MS was able to detect procaine, pseudoephedrine, TNT, and RDX from contact residue under laboratory conditions with the integration of conventional fingerprint development methods and MALDI matrix. MALDI/TOF MS is a nondestructive technique.

There are a number of studies discussing recent developments of one-step fluorescent cyanoacrylate processes. This study is a pseudo-operational trial comparing an example of a one-step fluorescent cyanoacrylate product, Lumicyano™, with the two recommended techniques for plastic carrier bags: cyanoacrylate fuming followed by basic yellow 40 (BY40) dyeing, and powder suspensions. 100 plastic carrier bags were collected from the place of work and the items were treated as found, without any additional fingermark deposition. The bags were split into three groups, and after treatment with the three techniques a comparable number of fingermarks was detected by each technique (an average of 300 fingermarks). The items treated with Lumicyano™ were sequentially processed with BY40, and an additional 43 new fingermarks were detected. Lumicyano™ appears to be a suitable technique for the development of fingermarks on plastic carrier bags, and it can save laboratory space and time as it does not require dyeing or drying procedures. Furthermore, contrary to other one-step cyanoacrylate products, existing cyanoacrylate cabinets do not require any modification for the treatment of articles with Lumicyano™. To date, there are few peer-reviewed articles on trials related to Lumicyano™, and this study aims to help fill this gap.

During the UK riots in August 2011, large numbers of bricks and stones were used as weapons or projectiles in acts of violence or to gain illegal entry to properties. As a result, it has become necessary to determine suitable chemical treatment(s) that enable the development of fingermarks on such items in order to identify those involved. This study attempted to develop latent fingermarks on common house bricks, limestone and sandstone using current techniques, including ninhydrin and fluorescence. The results show that it is now possible to enhance fingermarks that would previously have been left undeveloped, with fluorescent fingerprint powder, silver nitrate and superglue providing the best results. In addition, Isomark T-1 Rapid Grey High Resolution Forensic Impression Material proved extremely effective as an alternative method of recovering fingermarks developed with fluorescent fingerprint powder.

Optically acquired fingermarks are widely used as evidence by law enforcement agencies as well as in courts of law. A common technique for visualizing latent fingermarks on non-porous surfaces consists of cyanoacrylate fuming of the fingerprint material, followed by impregnation with a fluorescent dye, which under ultraviolet (UV) illumination makes the fingermarks visible and thus accessible to digital recording. However, there are circumstances in which image quality is compromised by high background scattering, high auto-fluorescence of the substrate material, or other detrimental photo-physical and photo-chemical effects such as light-induced damage to the sample. Here we present a novel near-infrared (NIR), two-photon induced fluorescence imaging modality, which significantly enhances the quality of fingermark images, especially those obtained from highly reflective and/or scattering surfaces, while at the same time reducing photo-damage to sensitive forensic samples.

For a century, fingermark analysis has been one of the most important and common methods in forensic investigations. Modern chemical analysis technologies have added the potential to determine the molecular composition of fingermarks and possibly identify chemicals a suspect may have come into contact with. Improvements in the analytical detection of the molecular composition of fingermarks are therefore of great importance. In this regard, matrix-assisted laser desorption ionization (MALDI) and laser desorption ionization (LDI) imaging mass spectrometry (IMS) have proven to be useful technologies for fingermark analysis. In these analyses, the choice of ionizing agent and its mode of deposition are critical for the identification of molecular markers. Here we propose two novel and complementary IMS approaches for endogenous and exogenous substance detection in fingermarks: sublimation of a 2-mercaptobenzothiazole (2-MBT) matrix and silver sputtering.

Infrared laser ablation coupled to vacuum capture was employed to collect material from fingermarks deposited on surfaces of different porosity and roughness. Laser ablation at 3 μm was performed in reflection mode with subsequent capture of the ejecta on a filter connected to vacuum. Ablation and capture of standards from fingermarks was demonstrated on glass, plastic, aluminum, and cardboard surfaces. Using matrix assisted laser desorption ionization (MALDI), it was possible to detect caffeine after spiking with amounts as low as 1 ng. MALDI detection of condom lubricants and of antibacterial peptides from an antiseptic cream was demonstrated. Detection of explosives from fingermarks left on plastic surfaces, as well as from direct deposition on the same surface, was shown using gas chromatography mass spectrometry (GC-MS).

In the search for new or better methods to visualise fingermarks or to analyse their chemical content, fingermark inter-variability may hinder the assessment of a method's effectiveness. Variability arises from changes in the chemical composition of fingermarks between different donors and within the same donor, as well as from differences in contact time, pressure and angle. When validating a method or comparing it with existing ones, it is not always possible to account for this type of variability. One way to compensate for these issues is to employ, in the early stages of method development, a device generating reproducible fingermarks. Here the authors present their take on such a device and quantitatively describe its performance and benefits against the manual production of marks. Finally, a short application of the device is illustrated, at the method development stage, in an emerging area of fingerprinting research concerning the retrieval of chemical intelligence from fingermarks.

Fingermarks, owing to their specificity, stability, and the fact that they are left on whatever is touched, have long been regarded as foremost among forms of evidence and are widely used for identification. The recovery rate and quality of latent fingermarks at crime scenes directly determine the effectiveness of fingermark searching and comparison, and hence the evidential value of fingermarks; the exploration of latent fingermark detection techniques has therefore long been a research focus in fingerprint science. This paper first reviews the history and current state of methods for detecting latent sweat fingermarks, tracing the origins, content and achievements of each stage of their development. Surface phenomena and environmental conditions play an important role in the choice of fingerprint detection technique. In general, all surfaces bearing latent fingermarks can be divided into three types: porous, semi-porous and non-porous. The properties of an unknown surface have to be considered before any attempt is made to develop a latent fingermark, and a preliminary test should be conducted on a similar surface before proceeding with any treatment of an evidential item. As fingermark detection technology has improved, processes have been further optimised and their sensitivity and specificity advanced. The latest techniques and materials used to enhance and record fingermarks on different surfaces are reviewed and discussed according to surface properties from a practical perspective. Important problems currently facing research in this field are summarised, and prospective trends in the development of latent fingermark detection methods are discussed.

The present work focuses on a novel approach for the study and quantification of some of the physical changes that a fingermark deposited on a non-porous substrate undergoes as it ages. In particular, the electrochemical impedance spectroscopy (EIS) technique has been applied for the first time to monitor the electrochemical behaviour of the system constituted by the fingermark residue and the underlying substrate. The impedance spectra proved to be significantly affected by the presence of the mark residue as well as by its ageing process. Fitting of the experimental data yielded quantitative electrochemical parameters that provide useful information on the fingermark ageing mechanism and allow fingermark ageing curves to be calculated, from which fundamental information could potentially be extrapolated.
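The fitting step described in this abstract can be illustrated with a minimal sketch. The paper does not disclose its equivalent-circuit model, so a simple series resistance with one parallel RC element is assumed here purely for illustration; the synthetic "measured" spectrum and all parameter values are hypothetical stand-ins for real fingermark/substrate data.

```python
# Hypothetical sketch: extracting equivalent-circuit parameters from an
# EIS spectrum by least-squares fitting. The circuit (R_s in series with
# R_ct || C) is an assumption, not the model used in the paper.
import numpy as np
from scipy.optimize import least_squares

def z_model(params, omega):
    """Impedance of a series resistance plus one parallel RC element."""
    r_s, r_ct, c = params
    return r_s + r_ct / (1 + 1j * omega * r_ct * c)

# Synthetic "measured" spectrum standing in for experimental data.
omega = np.logspace(-1, 5, 60)          # angular frequency, rad/s
true = (50.0, 1e4, 1e-6)                # R_s [ohm], R_ct [ohm], C [F]
z_meas = z_model(true, omega)

def residuals(params):
    # Fit real and imaginary parts of the spectrum simultaneously.
    diff = z_model(params, omega) - z_meas
    return np.concatenate([diff.real, diff.imag])

fit = least_squares(residuals, x0=[10.0, 1e3, 1e-7],
                    bounds=([0, 0, 0], [np.inf] * 3), x_scale='jac')
r_s, r_ct, c = fit.x
print(f"R_s={r_s:.1f} ohm, R_ct={r_ct:.0f} ohm, C={c:.2e} F")
```

In practice the fitted parameters (here the charge-transfer resistance and capacitance) would be tracked as a function of mark age to build the ageing curves the abstract refers to.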

In a further study of the thermal development of fingermarks on paper and similar surfaces, it is demonstrated that direct contact heating of the substrate using coated or ceramic surfaces at temperatures in excess of 230°C produces results superior to those obtained using hot air. Fingermarks can also be developed in this way on other cellulose-based substrates such as wood and cotton fabric, though ridge detail is difficult to obtain in the latter case. Fluorescence spectroscopy indicates that the phenomena observed during the thermal development of fingermarks can be reproduced simply by heating untreated white copy paper or filter paper, or these papers treated with solutions of sodium chloride or alanine. There is no evidence to suggest that the observed fluorescence of fingermarks heated on paper is due to a reaction of fingermark constituents on or with the paper. Instead, we maintain that the ridge contrast observed first as fluorescence, and later as brown charring, is simply an acceleration of the thermal degradation of the paper. Thermal degradation of cellulose, a major constituent of paper and wood, is known to give rise to a fluorescent product if sufficient oxygen is available [1-5]. However, the absence of atmospheric oxygen has only a slight effect on the thermal development of fingermarks, indicating that there is sufficient oxygen already present in paper to allow the formation of the fluorescent and charred products. In a depletion study comparing thermal development of fingermarks on paper with development using ninhydrin, the thermal technique was found to be as sensitive as ninhydrin for six out of seven donors. When thermal development was used in sequence with ninhydrin and DFO, it was found that only fingermarks that had been developed to the fluorescent stage (a few seconds of heating) could subsequently be developed with the other reagents. In the reverse sequence, no useful further development was noted for fingermarks that were

Nanoparticles have been incorporated into forensic fingermark detection as nanotechnology has advanced. In recent years, quantum dots have been highly regarded by forensic scientists for their excellent fluorescence characteristics; however, most quantum dots are toxic and may cause pollution, problems which have constrained their application in forensic science. Compared with organic dyes and metal-containing quantum dots, carbon dots have low toxicity, low pollution, and excellent biocompatibility, and have been used in medicine, biology, chemistry, and many other fields. This paper reviews the application of quantum dots in fingermark detection, introduces the research progress on carbon dots, and points out that carbon dots will be an important research direction in forensic fingermark detection.

The aim of this entry is to describe and explain the main forensic uses of fingermarks and fingerprints. It defines the concepts and provides the nomenclature related to forensic dactyloscopy. It describes the structure of the papillary ridges, the organization of the information in three levels, an

Improvised explosive devices (IEDs) are responsible for a significant proportion of combat and civilian deaths around the world. Given the ease with which IEDs can be made, the large quantity of explosive that can be contained within or on a vehicle, and the past use of vehicle-borne IEDs (VBIEDs) in terrorist activities (for example, the 2002 Bali bombing), VBIEDs are an ongoing concern for defence and law enforcement agencies. Fingermark and DNA analyses are routinely used by police and forensic analysts to identify suspects involved in illegal activities. There is limited information available on the feasibility of obtaining fingermarks, fibres, hair, and DNA samples following an explosive incident, or a situation in which an IED has been rendered safe using an appropriate defeat or render-safe tool. The main objective of this study was to determine whether fingermarks and/or DNA (from saliva and hair samples) placed on the interior and exterior of road vehicles, and on inanimate objects (such as plastic or glass bottles), can be recovered and analysed following the use of a VBIED render-safe tool on a vehicle containing simulated explosives. The identification of fingermarks on the exterior (67.2±8.5%) and interior (43.8±17.8%) of the vehicles was possible following the use of the render-safe tool, though this was more challenging in the latter case than the former. Fingermarks were also identified on both plastic and glass bottles placed inside the vehicles. Polymerase chain reaction (PCR) techniques yielded identifiable DNA profiles from saliva and hair samples. These preliminary results suggest that both fingermarks and DNA profiles, obtained from vehicles that have been subjected to a VBIED render-safe tool, may be used to identify persons of interest.

Significant numbers of institutions are carrying out fingermark-based research, yet there appears to be little inter-institution consistency in the approaches used to assess the quality of the samples produced. Inter-institution consistency in quality assessment would bring clear benefits to collaborative research projects, given that data from multiple projects may be combined or compared. In order for such quality assessment schemes to be used effectively across multiple institutions, proficiency in using such approaches should be demonstrated to ensure parity. Intra-institution controls on fingermark quality assessment are likely to help manage variations between researchers from the same institution and/or project(s). Proficiency testing (PT) is a popular means of comparing and monitoring the competency of individuals, whilst also assessing the validity of data and conclusions. This project aimed to develop a proficiency testing scheme for the assessment of fingermark quality by researchers. A grading system was developed to assess the quality of fingermarks generated within research projects. A large collection of test fingermark samples was created, controlling variables such as force, fingermark composition, and surface type. An 'inter-laboratory testing scheme' design was used for the proficiency test, and established fingermark researchers participated in the project to produce known values for the 6 chosen test samples for round one of the testing scheme, described in this paper. Second-year BSc (Hons) Forensic Science and Forensic Investigation students from the host institution completed the proficiency test as part of a fingermark practical. Results indicated that the student participants were not able to demonstrate a satisfactory level of proficiency in fingermark quality assessment using this grading system, which was attributed to their relative inexperience in assessing the quality of fingermarks compared to 'experts'.
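Scoring a participant against known values reduces to comparing two grade lists. The sketch below is illustrative only: the 0-4 scale is in the spirit of common fingermark grading schemes (such as the CAST scale), and the grades themselves are invented, not data from the study.

```python
# Hypothetical reference grades (0-4 scale) for six test marks and one
# participant's grades; proficiency summarised as exact and within-one-grade
# agreement fractions.
reference   = [4, 2, 3, 0, 1, 3]
participant = [4, 3, 3, 1, 1, 2]

exact = sum(r == p for r, p in zip(reference, participant)) / len(reference)
close = sum(abs(r - p) <= 1 for r, p in zip(reference, participant)) / len(reference)
print(exact, close)  # 0.5 1.0
```

A real scheme would aggregate such scores over many samples and participants and set pass thresholds for each agreement measure.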

We are still in the early days of exoplanet discovery. Astronomers are beginning to model the atmospheres and interiors of exoplanets and have developed a deeper understanding of processes of planet formation and evolution. However, we have yet to map out the full complexity of multi-planet architectures or to detect Earth analogues around nearby stars. Reaching these ambitious goals will require further improvements in instrumentation and new analysis tools. In this chapter, we provide an overview of five observational techniques that are currently employed in the detection of exoplanets: optical and IR Doppler measurements, transit photometry, direct imaging, microlensing, and astrometry. We provide a basic description of how each of these techniques works and discuss forefront developments that will result in new discoveries. We also highlight the observational limitations and synergies of each method and their connections to future space missions.
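As a toy illustration of one of these techniques, transit photometry reduces to finding a periodic dip in a light curve. The sketch below is a simplification with invented numbers (period, depth, noise level): it phase-folds noisy flux at a known period and locates the transit as the faintest phase bin, whereas real searches must also scan over unknown periods.

```python
import random, statistics
random.seed(0)

# Toy light curve: flux dips by 1% during a short phase window each orbit,
# with Gaussian photometric noise on top.
period, depth = 3.0, 0.01
times = [i * 0.05 for i in range(2000)]
flux = [1.0 - (depth if (t % period) < 0.15 else 0.0) + random.gauss(0, 0.001)
        for t in times]

# Phase-fold and bin: the bin containing the transit has the lowest mean flux.
nbins = 20
bins = [[] for _ in range(nbins)]
for t, f in zip(times, flux):
    bins[int((t % period) / period * nbins)].append(f)
means = [statistics.mean(b) for b in bins]
print(means.index(min(means)))  # 0: the transit phase bin
```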

Drug metabolites usually have structures resembling split-ring resonators (SRRs), which can lead to negative permittivity and permeability in an electromagnetic field. As a result, in the UV-vis region, the latent fingermark images of drug addicts and non-drug users are inverted with respect to each other. The optical properties of latent fingermarks thus differ markedly between drug addicts and non-drug users, which is a technical advantage for crime scene investigators seeking to distinguish them. In this paper, we calculate the permittivity and permeability of drug metabolites using a tight-binding model. The latent fingermarks of smokers and non-smokers are given as an example.

Statistical research on fingerprint identification and the testing of automated fingerprint identification system (AFIS) performances require large numbers of forensic fingermarks. These fingermarks are rarely available. This study presents a semi-automatic method to create simulated fingermarks in large quantities that model minutiae features or images of forensic fingermarks. This method takes into account several aspects contributing to the variability of forensic fingermarks such as the number of minutiae, the finger region, and the elastic deformation of the skin. To investigate the applicability of the simulated fingermarks, fingermarks have been simulated with 5-12 minutiae originating from different finger regions for six fingers. An AFIS matching algorithm was used to obtain similarity scores for comparisons between the minutiae configurations of fingerprints and the minutiae configurations of simulated and forensic fingermarks. The results showed similar scores for both types of fingermarks suggesting that the simulated fingermarks are good substitutes for forensic fingermarks.
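The simplest possible minutiae simulator just randomises position, orientation, and type. The sketch below is a hedged illustration: the function name, region size, and uniform sampling are invented for demonstration, whereas the published method additionally models minutia count distributions, the finger region, and elastic skin deformation.

```python
import random
random.seed(42)

def simulate_minutiae(n, region=(0, 0, 400, 400)):
    # Toy simulated fingermark: n minutiae, each an (x, y, angle, type)
    # tuple drawn uniformly inside a rectangular region.
    x0, y0, x1, y1 = region
    return [(random.uniform(x0, x1), random.uniform(y0, y1),
             random.uniform(0, 360), random.choice(["ending", "bifurcation"]))
            for _ in range(n)]

# Like the study, draw a mark with between 5 and 12 minutiae.
mark = simulate_minutiae(random.randint(5, 12))
print(len(mark))
```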

A team composed of Rick Pratt, Dave Puczyki, Kyle Bunch, Ryan Slaugh, Morris Good, and Doug McMakin attempted to exploit cellular telephone features and detect whether a person was carrying a cellular telephone into a Limited Area. The cell phone's electromagnetic properties were measured, analyzed, and tested in over 10 different ways to determine whether an exploitable signature exists. The method that appears to have the most potential for success without adding an external tag is to measure the RF spectrum, not in the cell phone band, but between 240 and 400 MHz. Figures 1-7 show the detected signal levels from cell phones from three different manufacturers.

Illicit nuclear materials represent a threat to the safety of American citizens, and the detection and interdiction of a nuclear weapon is a national problem that has not yet been solved. Alleviating this threat represents an enormous challenge to current detection methods, which must be substantially improved to identify and discriminate threatening from benign incidents. Rugged, low-power, less expensive radiation detectors and imagers are needed for large-scale wireless deployment. Detecting the gamma rays emitted by nuclear and fissionable materials, particularly special nuclear materials (SNM), is the most convenient way to identify and locate them. While there are detectors with the necessary sensitivity, none are suitable to meet the present need, primarily because of the high rate of false alarms. The exploitation of neutron signatures represents a promising solution for detecting illicit nuclear materials. This work presents the development of several detector configurations, such as a mobile active interrogation system based on a compact RF-plasma neutron generator developed at LBNL and a fast neutron telescope using plastic scintillating fibers developed at the University of New Hampshire. A human-portable improved Solid-State Neutron Detector (SSND) intended to replace pressurized 3He tubes will also be presented. The SSND uses an ultra-compact CMOS-SSPM (Solid-State Photomultiplier) detector, developed at Radiation Monitoring Devices Inc., coupled to a neutron-sensitive scintillator. The detector is very fast and can provide timing and spectroscopy information over a wide energy range, including fast neutrons.

Law enforcement agencies around the world use biometrics and fingerprints to solve and fight crime. Forensic experts are needed to record fingermarks at crime scenes and to ensure those captured are of evidential value. This process needs to be automated and streamlined as much as possible to

We demonstrate a facile method, termed candle soot coating (CSC), for rapidly developing latent fingermarks (LFMs) on various kinds of surfaces (glass, ceramic, metal, paper, and adhesive tape). The CSC method is simple, fast, and low-cost, and provides high contrast for LFM visualization in potential forensic applications.

A botnet is a network of compromised computers that follows the master-slave concept. Bots are compromised computers that carry out whatever tasks their master orders. Internet Relay Chat (IRC) is used for communication between the master and the bots, and the traffic is encrypted to evade third parties. In this paper we discuss botnet detection techniques and present a comparative analysis of these techniques on the basis of DNS queries, historical data, and group activity.
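The group-activity idea can be sketched in a few lines: bots in the same botnet tend to resolve the same command-and-control domain, so a non-popular domain queried by many distinct hosts is suspicious. All hosts, domains, and the three-host threshold below are invented for illustration.

```python
from collections import defaultdict

# Toy DNS log as (client host, queried domain) pairs.
queries = [("10.0.0.1", "update.example.com"), ("10.0.0.2", "update.example.com"),
           ("10.0.0.3", "update.example.com"), ("10.0.0.4", "cdn.popular.com"),
           ("10.0.0.1", "cdn.popular.com")]
popular = {"cdn.popular.com"}  # whitelist of legitimately popular domains

# Count distinct hosts per domain; flag unpopular domains queried by a group.
hosts_per_domain = defaultdict(set)
for host, domain in queries:
    hosts_per_domain[domain].add(host)
suspicious = [d for d, h in hosts_per_domain.items()
              if d not in popular and len(h) >= 3]
print(suspicious)  # ['update.example.com']
```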

Since time immemorial, lying has been a part of everyday life, and for this reason it has become a subject of interest in several disciplines, including psychology. The purpose of this article is to provide a general overview of the literature and thinking to date about the evolution of lie detection techniques. The first part explores ancient methods recorded circa 1000 B.C. (e.g., God's judgment) in Europe. The second part describes technical methods based on approaches such as phrenology, the polygraph, and graphology. This is followed by an outline of more modern approaches such as FACS (Facial Action Coding System), functional MRI, and Brain Fingerprinting. Finally, after this familiarization with the historical development of lie detection techniques, we discuss the scope for new initiatives, not only in designing new methods but also in research into lie detection itself, such as its motives and the regulatory issues related to deception.

Chemical enhancement methods for fingermarks in blood deposited on the surface of a thermal paper substrate were examined. The blood-sensitive reagents compared were LCV (leuco crystal violet), Amido black, and Hungarian red. Fingermarks in blood on the surface of thermal paper can be fixed with a 2% 5-sulfosalicylic acid solution. LCV was found to be an inadequate blood-staining reagent because of bubbling, diffusion, and blurring on the surface of the thermal paper. Hungarian red was also inadequate because excess Hungarian red on the surface of the thermal paper was not washed away in the de-staining procedure. Amido black was the best of the three staining reagents compared. The maximum dilution ratio visible to the naked eye after Amido black staining was 1 in 80 for the thermally sensitive surface and 1 in 20 for the thermally non-sensitive surface.

When visible on a fingermark, the general pattern maintains its importance in the fingerprint examination procedure, since the difference between the general pattern of a fingermark and a fingerprint is sufficient for exclusion. In the current work, the importance of the general pattern is extended

A number of methods have been reported in the literature for the development of latent fingermarks on different surfaces. This paper reports a new, simple, non-toxic powdering method that has been successfully employed on different substrates for the development and visualization of latent fingermarks up to 6 days old under varying temperature conditions. In this investigation an inexpensive, simple, and easily available fuller's earth (Multani Mitti) powder was used to develop latent fingermarks on different substrates, namely a black cardboard box, clear glass, a coverslip box, a steel surface, a laminated wooden sheet, clear plastic, a coloured plastic bag, and the surface of a highlighter pen. It gave very clear results on the majority of substrates and can be successfully used for the development and visualization of latent fingermarks.

The measurement techniques of femtosecond spectroscopy are effective methods for investigating ultrafast dynamics and are widely used in physics, chemistry, and biology. In this paper, the principles, experimental setups, and approaches to processing the experimental data are presented. Different measurement techniques, such as transient absorption spectroscopy, photon echoes, the optical Kerr effect, and degenerate four-wave mixing, are then explained with specific examples. Finally, the application prospects of femtosecond spectroscopy measurement techniques are forecast.

This paper implements and compares different techniques for face detection and recognition: first, locating the face in an image (face detection), and second, identifying the person (face recognition). We study three techniques: face detection using a self-organizing map (SOM), face recognition by projection and nearest neighbour, and face recognition using an SVM.
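The nearest-neighbour classifier at the heart of the second technique can be sketched as follows. The feature vectors and names are invented; in practice each vector would be a projection of a face image rather than three hand-picked numbers.

```python
# Toy nearest-neighbour recognition: each "face" is a flattened feature
# vector; an unknown probe is assigned the label of the closest gallery entry.
gallery = {"alice": [0.9, 0.1, 0.3], "bob": [0.2, 0.8, 0.5]}

def recognise(probe):
    def dist(a, b):
        # Squared Euclidean distance between two feature vectors.
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(gallery, key=lambda name: dist(gallery[name], probe))

print(recognise([0.85, 0.15, 0.25]))  # alice
```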

There are now many molecular biological techniques available to define HLA class I and class II alleles. Some of these are also applicable to other human polymorphic genes, in particular to those non-HLA genes encoded within the MHC. The range of techniques available allows laboratories to choose those most suited to their purpose. The routine laboratory supporting solid organ transplants will need to type large numbers of potential recipients over a period of time, probably using PCR-SSOP, while donors will be typed singly and rapidly using PCR-SSP, with HLA allele compatibility determined by heteroduplex analysis. Laboratories supporting bone marrow transplantation, where time is less pressing, can choose from the whole range of techniques to determine donor-recipient MHC compatibility accurately. For disease studies, techniques defining precise HLA allele sequence polymorphisms are needed, and high sample numbers have to be accommodated. When an association is established, allele sequencing has to be used. In the near future, the precise role of HLA alleles in transplantation and disease susceptibility is likely to be established unambiguously.

As a LANL activity for DOE/NNSA in support of SHINE Medical Technologies™ ‘Accelerator Technology’ we have been investigating the application of UV-vis spectroscopy for uranium analysis in solution. While the technique has been developed specifically for sulfate solutions, the proposed SHINE target solutions, it can be adapted to a range of different solution matrixes. The FY15 work scope incorporated technical development that would improve accuracy, specificity, linearity & range, precision & ruggedness, and comparative analysis. Significant progress was achieved throughout FY 15 addressing these technical challenges, as is summarized in this report. In addition, comparative analysis of unknown samples using the Davies-Gray titration technique highlighted the importance of controlling temperature during analysis (impacting both technique accuracy and linearity/range). To fully understand the impact of temperature, additional experimentation and data analyses were performed during FY16. The results from this FY15/FY16 work were presented in a detailed presentation, LA-UR-16-21310, and an update of this presentation is included with this short report summarizing the key findings. The technique is based on analysis of the most intense U(VI) absorbance band in the visible region of the uranium spectra in 1 M H2SO4, at λmax = 419.5 nm.
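The quantification step rests on the Beer-Lambert law: absorbance at the analytical wavelength is proportional to concentration. The minimal calibration sketch below uses invented standard concentrations and absorbances, not the report's data, and fits a single proportionality constant through the origin.

```python
# Beer-Lambert sketch: A = k * c at 419.5 nm. Estimate k from standards by
# least squares through the origin, then quantify an unknown from its
# measured absorbance. All numbers are illustrative.
standards = [(10.0, 0.052), (20.0, 0.101), (40.0, 0.205), (80.0, 0.410)]  # (g/L, A)
k = sum(c * a for c, a in standards) / sum(c * c for c, a in standards)

unknown_abs = 0.150
print(round(unknown_abs / k, 1))  # estimated concentration in g/L
```

A production method would also propagate measurement uncertainty and, as the report notes, correct for temperature effects on the calibration.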

The aqueous synthesis of a mercaptosuccinic acid (MSA)-capped CdTe quantum dot (QD) solution for quickly and sensitively developing latent fingermarks is described. The rapid growth mechanism of CdTe/MSA QDs, which depends on the molecular structure of MSA, is briefly discussed and compared with that of thioglycolic acid (TGA)- and mercaptopropionic acid (MPA)-capped CdTe QDs. Development of latent fingermarks with the synthesized CdTe/MSA QDs was faster, and the ridge details were clearer, than with CdTe/TGA QDs. In addition, latent fingermarks developed with CdTe/MSA QDs showed less background and better contrast than those developed with gentian violet or rhodamine 6G. Latent fingermarks could be well developed on black tape, scotch tape, tinfoil, aluminium alloy, and stainless steel, as well as on the adhesive side of yellow tape, even when the latter was aged up to seven days. As the immersion time was greatly reduced to 10 s by using CdTe/MSA QDs, a preliminary result of latent fingermark development by spraying is also presented.

Malware is malicious software designed to damage computer systems without the knowledge of the owner. Software from reputable vendors can also contain malicious code that affects the system or leaks information to remote servers. Malware includes computer viruses, worms, spyware, dishonest adware, rootkits, Trojans, dialers, etc., and is one of the most serious security threats on the Internet today. In fact, most Internet problems, such as spam e-mails and denial-of-service attacks, have malware as their underlying cause. Computers compromised with malware are often networked together to form botnets, and many attacks are launched using these malicious, attacker-controlled networks. This paper focuses on various malware detection and removal methods.

1,2-Indanedione has been extensively researched since the discovery of its fluorogenic reaction with amino acids in 1997 by Joullié et al. [1]. The current study compares the development of fingermarks on used train tickets by the three leading amino acid reagents: ninhydrin, DFO, and 1,2-indanedione. Train tickets are ideal for the task owing to their high abundance and frequent use by a diverse population. However, their unique double-layer composition, a cellulose-based regular paper on one side and a thermally sensitive layer on the other, requires an adjustment of the traditional development procedures. Heat, which is normally applied after dipping the specimens in the reagent solutions, had to be avoided because it darkens the sensitive thermal layer. Instead, it was replaced by air-drying in a fume-hood for 24 h prior to the recording of the results. Three groups, each containing 500 used train tickets, were treated with one of the three reagents. The results were expressed as percentages of both comparable and partial fingermarks. In this study we controlled neither the quality of the fingerprint donors nor the conditions under which the latent fingermarks had been deposited or stored. However, the large number of similar, randomly chosen exhibits allows tentative conclusions on the potential of each reagent; hence, a new criterion for the potential of fingermark development (PFD) is proposed. The PFD combines all the partial and identifiable fingermarks (graded 1 and 2), thus highlighting the sensitivity of the reagents. In this work, the superiority of 1,2-indanedione is demonstrated using both the traditional comparison tests and the suggested PFD.

Embolic detection is very important for the early diagnosis of vascular disease. The Doppler ultrasound technique is one of the common methods for detecting emboli non-invasively. When emboli pass through the sample volume of the Doppler instrument, high-intensity transient Doppler signals are produced, so emboli can be detected directly from variations in the Doppler signal amplitude. Since there may be disturbances in the system, this simple detection method has significant limitations. To improve the accuracy of automatic embolus detection, several novel methods are studied to obtain sensitive characteristics of the embolic signals using new signal-processing theories.
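The basic amplitude-based detector described above can be sketched as a threshold on a robust background estimate. The signal values and the 10×MAD threshold are illustrative only; real detectors work on the Doppler audio or spectrogram and must reject artefacts, which is exactly the limitation the abstract mentions.

```python
import statistics

# Toy Doppler intensity trace: two high-intensity transients (emboli) on a
# quiet background.
signal = [1.0, 1.1, 0.9, 1.0, 5.2, 1.1, 0.95, 1.05, 4.8, 1.0]

mu = statistics.median(signal)                       # robust background level
mad = statistics.median(abs(s - mu) for s in signal) # robust spread estimate
hits = [i for i, s in enumerate(signal) if s > mu + 10 * mad]
print(hits)  # [4, 8]
```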

The powder technique for detecting latent fingerprints involves the application of a finely divided formulation to the fingermark impression, generally with a glass-fibre or camel-hair brush. The powder adheres mechanically to the sweat residue defining the ridge pattern, while the furrows, which are devoid of fingerprint residue, do not retain it. The net result is that the powder formulation sticks to the ridges but is easily blown off the furrows. Since the powder is normally coloured, the ridge pattern becomes visible and the latent print is said to have been developed.

This paper discusses the problem of detecting explosives in the context of an object being transported for illicit purposes. The author emphasizes that technologies developed for this particular application have payoffs in many related problem areas. The author discusses nuclear techniques which can be applied to this detection problem. These include: x-ray imaging; neutronic interrogation; inelastic neutron scattering; fieldable neutron generators. He discusses work which has been done on the applications of these technologies, including results for detection of narcotics. He also discusses efforts to integrate these techniques into complementary systems which offer improved performance.

Expert-based methods have been used since the beginning of the 20th century for the forensic evaluation of fingermarks (trace specimens) and fingerprints (reference specimens). Currently, semi-automatic systems using biometric data, biometric technology, and statistical models are being developed to support the expert

A method that incorporates an edge detection technique, Markov random fields (MRF), watershed segmentation, and merging techniques is presented for performing image segmentation and edge detection. It first applies the edge detection technique to obtain a Difference In Strength (DIS) map. An initial segmentation is obtained using K-means clustering and the minimum-distance rule. The region process is then modelled by an MRF to obtain an image containing regions of different intensity, after which gradient values are calculated and the watershed technique is applied. The DIS calculation for each pixel defines all the edges (weak or strong) in the image; the resulting DIS map serves as prior knowledge about likely region boundaries for the MRF step, which yields an image containing all edge and region information. In the MRF model, the grey level l at pixel location i in an image X depends on the grey levels of neighbouring pixels. The segmentation results are improved using the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated, and the final edge map is obtained by a merging process based on averaged intensity mean values. Common edge detectors applied to the MRF-segmented image are compared, and the resulting segmentation and edge detection yield one closed boundary per actual region in the image.
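The initial K-means clustering step can be sketched on grey levels alone. The pixel values below are invented, only two clusters are used, and the MRF and watershed refinements described above are omitted.

```python
# Minimal 1-D K-means on grey levels: iteratively assign each pixel to the
# nearest cluster centre, then move each centre to its cluster mean.
pixels = [12, 10, 15, 200, 210, 205, 14, 198, 11, 207]
centres = [0.0, 255.0]  # initial guesses for dark and bright clusters

for _ in range(10):
    clusters = [[], []]
    for p in pixels:
        clusters[min((0, 1), key=lambda k: abs(p - centres[k]))].append(p)
    centres = [sum(c) / len(c) if c else centres[k]
               for k, c in enumerate(clusters)]

labels = [min((0, 1), key=lambda k: abs(p - centres[k])) for p in pixels]
print(labels)  # [0, 0, 0, 1, 1, 1, 0, 1, 0, 1]
```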

Accurate localization of microwave-emitting sources is an important topic in telecommunications and remote sensing. In particular, to increase the data quality of satellite microwave remote sensing, radio-frequency interference (RFI) sources need to be accurately geolocated. This project offers the opportunity to learn and experience methods for high-performance detection and localization of RFI sources, including the study of radio-frequency interference definitions and detection techniques.

Some optical fiber hydrophones, such as the PGC Mach-Zehnder interferometer, suffer signal fading induced by the birefringence of single-mode optical fibers. In particular, if the two optical beams from the interferometer arms are orthogonally polarized, the interferometric signal cannot be detected at all. Here a new method is introduced: the detected phase difference is translated into a linearly polarized angle, which is then detected, so that polarization-induced signal fading is avoided. In theory, this solves the problem. Furthermore, the effect of optical source fluctuations on the measurement results becomes small when the polarization technique is used.

A new technique for multipath detection in wideband mobile radio systems is presented. The proposed scheme is based on an intelligent search algorithm using Boolean Satisfiability (SAT) techniques to search through the uncertainty region of the multipath delays. The SAT-based scheme utilizes the known structure of the transmitted wideband signal, for example a pseudo-random (PN) code, to search the entire space effectively by eliminating subspaces that do not contain a possible solution. The paper presents a framework for modeling the multipath detection problem as a SAT application. It also provides simulation results that demonstrate the effectiveness of the proposed scheme in detecting multipath components in frequency-selective Rayleigh fading channels.
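For contrast with the SAT-based search, the brute-force baseline it improves on is a sliding correlator over all candidate lags. The code, delays, and gains below are toy values, and this sketch scans every lag exhaustively rather than pruning the space as the SAT formulation does.

```python
# Toy received signal: a known PN code arriving via two delayed, attenuated
# paths. A sliding correlator scores every candidate lag and the strongest
# correlations mark the multipath delays.
pn = [1, -1, 1, 1, -1, -1, 1, -1]
rx = [0.0] * 20
for delay, gain in [(3, 1.0), (7, 0.6)]:   # two multipath components
    for i, c in enumerate(pn):
        rx[delay + i] += gain * c

peaks = sorted(range(13),
               key=lambda d: -sum(rx[d + i] * c for i, c in enumerate(pn)))[:2]
print(sorted(peaks))  # [3, 7]
```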

Recently, image processing techniques have been widely used in several medical areas for image improvement in earlier detection and treatment, where the time factor is very important for discovering abnormalities in target images, especially for various cancer tumours such as lung cancer and breast cancer. Image quality and accuracy are the core factors of this research: image quality assessment and improvement depend on the enhancement stage, where low-level pre-processing based on a Gabor filter within Gaussian rules is used. Following the segmentation principles, an enhanced region of the object of interest is obtained and used as the basic foundation of feature extraction. Relying on general features, a normality comparison is made. In this research, the main features detected for accurate image comparison are pixel percentage and mask labelling.
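A Gabor filter of the kind used in the enhancement stage can be written directly from its definition: a Gaussian envelope multiplied by a sinusoidal carrier. The sketch below is a 1-D version with arbitrary illustrative size, sigma, and frequency; the 2-D version used on images adds an orientation parameter.

```python
import math

def gabor_kernel(size=9, sigma=2.0, freq=0.25):
    # 1-D Gabor kernel: Gaussian envelope times a cosine carrier.
    half = size // 2
    return [math.exp(-(x * x) / (2 * sigma * sigma))
            * math.cos(2 * math.pi * freq * x)
            for x in range(-half, half + 1)]

k = gabor_kernel()
print(len(k), round(max(k), 3))  # 9 1.0
```

Convolving an image row with such a kernel emphasises structure at the carrier frequency while the Gaussian suppresses distant noise, which is the enhancement effect the text describes.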

A variety of change detection (CD) methods have been developed and employed to support imagery analysis for applications including environmental monitoring, mapping, and support to military operations. Evaluation of these methods is necessary to assess technology maturity, identify areas for improvement, and support transition to operations. This paper presents a methodology for conducting this type of evaluation, discusses the challenges, and illustrates the techniques. The evaluation of object-level change detection methods is more complicated than for automated techniques for processing a single image. We explore algorithm performance assessments, emphasizing the definition of the operating conditions (sensor, target, and environmental factors) and the development of measures of performance. Specific challenges include image registration; occlusion due to foliage, cultural clutter and terrain masking; diurnal differences; and differences in viewing geometry. Careful planning, sound experimental design, and access to suitable imagery with image truth and metadata are critical.

Networking has become an integral part of our cyber society, and everyone wants to connect with each other. With the advancement of network technology, networks have become highly vulnerable to breaches in which information is stolen, and once information reaches the wrong hands it can do terrible things. In recent years, the number of attacks on networks has increased, drawing the attention of many researchers to this field. There has been much research on intrusion detection lately. Many methods have been devised that are very useful, but they can only detect attacks that have already taken place. These methods will always fail against a new attack that is unknown to the networking world. In order to detect new intrusions in the network, researchers have devised artificial intelligence techniques for intrusion detection and prevention systems. In this paper we cover the types of evolutionary techniques that have been devised, their significance, and their modifications.

Large quantities of explosives are manufactured worldwide for use in various types of ammunition, arms, and mines, and are used in armed conflicts. During manufacturing and usage of explosive equipment, some of the explosive residues are released into the environment in the form of contaminated effluents, unburnt explosives, fumes and vapours. Limited but uncontrolled continuous release of trace vapours also takes place when explosive-laden landmines are deployed in the field. One of the major technological challenges in the post-war scenario worldwide is the detection of landmines using these trace vapour signatures and neutralising them safely. Different types of explosives are utilised as the main charge in antipersonnel and antitank landmines. In this paper, an effort has been made to review the techniques available so far based on explosive vapour detection, especially for detecting landmines. A comprehensive compilation of relevant information on the techniques is presented, and their maturity levels, shortcomings, and difficulties faced are highlighted.

Geophysical technologies are very effective in environmental, engineering and groundwater applications. Parameters delineating the nature of near-surface materials, such as compressional-wave velocity and shear-wave velocity, can be obtained using shallow seismic methods. Electric methods are a primary approach for investigating groundwater and detecting leakage. Both methods were applied to the embankment in the hope of obtaining evidence of the strength and moisture inside its body. A technological experiment was carried out in 2003 to detect hidden defects in the embankment of the Yangtze River at Songzi, Hubei, China. Surface-wave and DC multi-channel array resistivity sounding techniques were used to detect hidden defects, such as pipe-seeps, inside and under the dike. This paper discusses the exploration strategy and the effect of the geological characteristics. A practical approach combining seismic and electric resistivity measurements was applied to locate potential pipe-seeps in the embankment during the experiment. The method derives a potential leak factor from the shear-wave velocity and the resistivity of the medium to evaluate anomalies. An anomaly found in one detected segment of the embankment was verified: a pipe-seep had occurred there during the 1998 flooding.

The aim of this study is to use Resonance Raman (RR) and fluorescence spectroscopic techniques for tumor margin detection with high accuracy, based on native molecular fingerprints of breast and gastrointestinal (GI) tissues. This margin detection method exploits the advantages of RR spectroscopy in situ and in real time to diagnose tumor changes, providing powerful tools for clinically guiding intraoperative margin assessments and postoperative treatments. Tumor margins were detected by RR spectroscopy by scanning lesions from the center of, or around, the tumor region ex vivo to find the changes between cancerous tissue and the rim of normal tissue using native molecular fingerprints. The specimens used to analyze tumor margins included breast and GI carcinoma and normal tissues. A sharp tumor margin was found from the changes in RR spectral peaks within a 2 mm distance. The result was verified using fluorescence spectra at 300 nm, 320 nm and 340 nm excitation in a typical specimen of gastric cancerous tissue with a positive margin, in comparison with normal gastric tissue. This study demonstrates the potential of RR and fluorescence spectroscopy as new, label-free approaches to intraoperative margin assessment.

In the present age of heightened emphasis on counter-terrorism, law enforcement and forensic science are constantly evolving and adapting to the motivations and capabilities of terrorist groups and individuals. The use of biological agents on a population, such as anthrax spores, presents unique challenges to the forensic investigator and to the processing of contaminated evidence. In this research, a number of porous and non-porous items were contaminated with viable spores and marked with latent fingermarks. The test samples were then subjected to a standard formulation of formaldehyde gas. Latent fingermarks were then recovered post-decontamination using a range of methods. Standard fumigation, while effective at destroying viable spores, contributed to the degradation of amino acids, leading to loss of ridge detail. A new protocol for formaldehyde gas decontamination was developed which allows for the destruction of viable spores and the successful recovery of latent marks, all within a rapid response time of less than 1 h.

The quadtree, a hierarchical data structure for representing spatial information based on the principle of recursive decomposition, is widely used in digital image processing and computer graphics. This paper demonstrates the detection of invisible watermarked images generated by popular watermarking techniques, including CDMA, DCT, DWT, and least significant bit (LSB), using the quadtree. Results for typical (512 × 512 pixel) images show differences among these methods. In each case the same image, both the original and its invisible watermarked counterpart, was used to test the four methods in conjunction with quadtree decomposition. In addition to the subjective method represented by the quadtree, objective evaluations such as Pearson correlation, mean square error (MSE), the structural similarity index (SSIM), and false positive and false negative rates were used as comparison criteria between original and watermarked images. In the results, quadtree decomposition proved a promising subjective method for distinguishing among these watermarking techniques.
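
Of the four techniques the abstract names, LSB embedding is the simplest to sketch, and MSE is one of the objective criteria it lists. The following is a minimal illustration on a 1-D pixel list (not the paper's actual pipeline, which operates on 512 × 512 images with quadtree decomposition): the watermark bit replaces each pixel's least significant bit, so the distortion per pixel is at most 1 grey level.

```python
def embed_lsb(pixels, bits):
    """Embed one watermark bit into the least significant bit of each pixel."""
    return [(p & ~1) | b for p, b in zip(pixels, bits)]

def extract_lsb(pixels):
    """Recover the watermark by reading each pixel's least significant bit."""
    return [p & 1 for p in pixels]

def mse(a, b):
    """Mean square error between two equal-length pixel sequences."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

original = [120, 121, 122, 123, 124, 125, 126, 127]
mark = [1, 0, 1, 1, 0, 0, 1, 0]
marked = embed_lsb(original, mark)
```

Because LSB embedding changes each pixel by at most one grey level, the MSE between `original` and `marked` can never exceed 1.0, which is why LSB marks are visually invisible yet detectable by statistical tests.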

Internet worms pose a serious threat to computer security. Traditional approaches that use signatures to detect worms offer little protection against zero-day attacks. The focus of malware research is shifting from signature patterns to identifying the malicious behavior displayed by malware. This paper presents a novel idea of extracting variable-length instruction sequences that can distinguish worms from clean programs using data mining techniques. The analysis is facilitated by the program control flow information contained in the instruction sequences. Based upon general statistics gathered from these instruction sequences, we formulated the problem as a binary classification problem and built tree-based classifiers including a decision tree, bagging, and a random forest. Our approach showed a 95.6% detection rate on novel worms whose data was not used in the model-building process.
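
The feature-extraction step described above can be sketched in a few lines. This is a simplified illustration of counting variable-length instruction sequences (n-grams) as classifier features; the mnemonics, window lengths, and function name are illustrative, and the paper's own extraction additionally uses control-flow information not modelled here.

```python
from collections import Counter

def ngram_features(instructions, n_min=2, n_max=4):
    """Count every contiguous instruction sequence of length n_min..n_max.
    The resulting counts serve as features for a binary classifier
    (e.g. a decision tree or random forest) separating worms from clean code."""
    feats = Counter()
    for n in range(n_min, n_max + 1):
        for i in range(len(instructions) - n + 1):
            feats[tuple(instructions[i:i + n])] += 1
    return feats

# A toy disassembly trace of five mnemonics.
trace = ["push", "mov", "call", "pop", "ret"]
f = ngram_features(trace)
```

Each distinct tuple becomes one column in the training matrix; tree-based classifiers then select the sequences whose presence best separates the two classes.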

Malware is a type of malicious program that replicates from a host machine and propagates through a network. It has been considered one type of computer attack and intrusion that can carry out a variety of malicious activities on a computer. This paper addresses the current trend of malware detection techniques and identifies the significant criteria in each technique to improve malware detection in Intrusion Detection Systems (IDS). Several existing techniques from 48 different studies are analysed, and the capability criteria of each malware detection technique are reviewed. From the analysis, a new generic taxonomy of malware detection techniques is proposed, named the Hybrid Malware Detection Technique (Hybrid MDT), which consists of a hybrid signature and anomaly detection technique and a hybrid specification-based and anomaly detection technique, to complement the weaknesses of existing malware detection techniques in detecting known and unknown attacks as well as reducing false alerts before and during an intrusion ...

Luciferin-luciferase (L-L) luminescence techniques were used to successfully measure adenosine triphosphate (ATP) (pg/ml) in concentrated aerosol samples containing either vegetative bacterial cells or spores. Aerosols were collected with wet and dry sampling devices. Evaluation for the presence of total biomass from bacterial and non-bacterial sources of ATP was achieved by suspending the collected aerosol samples in phosphate buffered saline (PBS), pipetting a 50-μl aliquot of the PBS suspension into a Filtravette™, and then adding bacterial releasing agent (BRA). The sample was then reacted with L-L, and the resulting relative luminescence units (RLUs), indicative of ATP from all sources, were measured. Bacterial cells were enumerated with the additional application of a wash with somatic cell releasing agent (SRA) prior to BRA application; this step removes interfering substances and non-bacterial sources of ATP. For spore analysis, an equi-volume sample of the PBS suspension was added to an equi-volume of trypticase soy broth (TSB), incubated at 37 °C for 15 minutes, and processed using methods identical to the bacterial cell analysis. Using these techniques we were able to detect Bacillus subtilis var. niger, formerly known as Bacillus globigii (BG), in aerosol samples at concentrations greater than or equal to 10^5 colony forming units (CFU) per ml. Results of field and chamber trials show that one can detect the presence of bacterial and non-bacterial sources of ATP, and differentiate spores from vegetative bacterial cells. These techniques may be appropriate to situations where the measurement of bacterial aerosols is needed.

Recent studies have reported the use of alginate in the lifting and subsequent enhancement of footwear marks in blood. A study was set up to assess the use of such a method in the treatment of fingermarks in blood on a variety of porous, non-porous and semi-porous surfaces. Other variables included the ageing of the fingermarks in blood and the application of chemicals prior to or after alginate lifting. All variations were compared to direct chemical treatment of the substrate. The results demonstrated that alginate is not compatible with certain substrates (e.g. glass and tile). On substrates that were compatible with alginate (e.g. fabric and paper), the enhanced fingermarks on the alginate cast and on the post-alginate substrates appeared, overall, inferior to direct chemical enhancement without the use of alginate. A further variation, using water-based protein stains mixed directly with the alginate, appeared to provide enhancement directly on the substrate while simultaneously lifting and enhancing the fingermarks in blood on the alginate cast.

This book examines the design of chipless RFID systems. The authors begin with the historical development of wireless identification systems and finally arrive at a representation of the chipless RFID system as a block diagram illustration. Chapter 2 is devoted to the theoretical bases for the design of chipless RFID tags and detection techniques in the reader. A rigorous mathematical formulation is presented based on the singularity expansion method (SEM) and characteristic mode theory (CMT) in order to study the scattered fields from an object in a general form. The authors attempt to explain some physical concepts behind the mathematical descriptions of the theories in this chapter. In Chapter 3, two design procedures based on complex natural resonance and CMT are presented for the design of the chipless RFID tag. By studying the effects of structural parameters on radiation and resonant behaviors of the tag, some design conclusions are presented in this chapter. Chapter 4 is dedicated to the time-frequen...

Capillary electrophoresis (CE) has emerged as one of the most versatile separation methods. However, efficient separation is not sufficient unless coupled to adequate detection. The narrow inner diameter (I.D.) of the capillary column raises a big challenge to detection methods. For UV-vis absorption detection, the concentration sensitivity is only at the μM level. Most commercial CE instruments are equipped with incoherent UV-vis lamps. Low-brightness, instability and inefficient coupling of the light source with the capillary limit the further improvement of UV-vis absorption detection in CE. The goals of this research have been to show the utility of laser-based absorption detection. The approaches involve: on-column double-beam laser absorption detection and its application to the detection of small ions and proteins, and absorption detection with the bubble-shaped flow cell.
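
The sensitivity argument above rests on the Beer-Lambert law, which governs how double-beam absorption detection converts intensities to concentration. The sketch below is illustrative only; the molar absorptivity and path-length values are hypothetical, chosen merely to mirror the very short optical path of a narrow-bore capillary.

```python
import math

def absorbance(i_reference, i_sample):
    """Double-beam measurement: absorbance A = log10(I_ref / I_sample)."""
    return math.log10(i_reference / i_sample)

def concentration(a, epsilon, path_cm):
    """Beer-Lambert law: c = A / (epsilon * l),
    with epsilon in L mol^-1 cm^-1 and the path length in cm."""
    return a / (epsilon * path_cm)

# One decade of attenuation gives A = 1.
a = absorbance(1000.0, 100.0)
# Hypothetical dye (epsilon = 5e4) across a 75-micrometre capillary bore.
c = concentration(a, epsilon=5.0e4, path_cm=75e-4)
```

The tiny `path_cm` term is exactly why capillary columns limit UV-vis sensitivity to roughly the micromolar level: for a fixed minimum detectable absorbance, the concentration floor scales inversely with the optical path.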

Autonomous obstacle detection and avoidance (AODA) techniques are a prerequisite for future pinpoint lunar landing missions. First, information-weighted fusion hazard detection algorithms are proposed to improve the success probability of obstacle detection. Second, a guidance law for a constant-thrust engine is designed to avoid the detected obstacles and steer the lander to a safe landing site. Finally, the validity of the proposed obstacle detection and avoidance techniques is confirmed by computer simulation.

Edge detection is a term in image processing and computer vision, particularly in the areas of feature detection and extraction, referring to algorithms that aim to identify points in a digital image at which the image brightness changes sharply or, more formally, has discontinuities. Edge detection is needed to find discontinuities in depth, discontinuities in surface orientation, changes in material properties, and variations in scene illumination. Remote sensing image...
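
A standard way to find such brightness discontinuities is a gradient operator; the Sobel operator is used below purely as a common example (the abstract does not name a specific algorithm). The kernels estimate the horizontal and vertical brightness gradients, and large gradient magnitude marks an edge.

```python
def sobel_magnitude(img):
    """Approximate gradient magnitude with 3x3 Sobel kernels.
    Border pixels are left at zero; img is a list of equal-length rows."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Horizontal gradient: right column minus left column, centre-weighted.
            gx = (img[y-1][x+1] + 2*img[y][x+1] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y][x-1] - img[y+1][x-1])
            # Vertical gradient: bottom row minus top row, centre-weighted.
            gy = (img[y+1][x-1] + 2*img[y+1][x] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y-1][x] - img[y-1][x+1])
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

# A vertical step edge: left half dark (0), right half bright (10).
img = [[0, 0, 10, 10] for _ in range(4)]
mag = sobel_magnitude(img)
```

On this toy image the gradient magnitude is large exactly along the dark/bright boundary and zero elsewhere, which is the discontinuity the definition above describes.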

Lidar (light detection and ranging) presents better sensitivity than fire surveillance based on imaging. However, the price of conventional lidar equipment is often too high as compared to passive fire detection instruments. We describe possibilities to downscale the technology. First, a conventional lidar, capable of smoke-plume detection up to ~10 km, may be replaced by an industrially manufactured solid-state laser rangefinder. This reduces the detection range to about 5 km, but decreases the purchase price by one order of magnitude. Further downscaling is possible by constructing the lidar smoke sensor on the basis of a low-cost laser diode.

A method of customizing the stimulation signal based on direct digital frequency synthesis (DDS) for a nuclear quadrupole resonance explosives detection system is presented. DDS has many advantages, such as high frequency resolution and high conversion speed,
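
The frequency resolution the abstract credits to DDS comes from its phase-accumulator architecture, sketched below. This is a generic illustration, not the paper's implementation; the accumulator width and tuning word are illustrative values. The output frequency is tuning_word × f_clock / 2^N, so an N-bit accumulator resolves frequency steps of f_clock / 2^N.

```python
import math

def dds_samples(tuning_word, acc_bits, n):
    """Direct digital synthesis: an acc_bits-wide phase accumulator is
    stepped by tuning_word each clock and mapped through a sine lookup.
    Output frequency = tuning_word * f_clock / 2**acc_bits."""
    modulus = 1 << acc_bits
    acc = 0
    out = []
    for _ in range(n):
        out.append(math.sin(2 * math.pi * acc / modulus))
        acc = (acc + tuning_word) % modulus  # phase wraps, like hardware
    return out

# 32-bit accumulator, tuning word 2**29: output at f_clock/8,
# i.e. exactly 8 samples per sine cycle.
samples = dds_samples(tuning_word=1 << 29, acc_bits=32, n=8)
```

With a 32-bit accumulator and, say, a 10 MHz clock, the step size is about 0.0023 Hz, which is the "high frequency resolution" property; changing the tuning word retunes the output in a single clock cycle, giving the fast switching.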

A detection method and system utilizing silicon models of the traveling wave structure of the human cochlea to spatially and temporally locate a specific sound source in the presence of high noise pandemonium. The detection system combines two-dimensional stereausis representations, which are output by at least three VLSI binaural hearing chips, to generate a three-dimensional stereausis representation including both binaural and spectral information which is then used to locate the sound source.

The task of detecting human faces within either a still image or a video frame is one of the most popular object detection problems. For the last twenty years researchers have shown great interest in this problem because it is an essential pre-processing stage for computing systems that process human faces as input data. Example applications include face recognition systems, vision systems for autonomous robots, human computer interaction systems (HCI), surveillance systems, biometric based a...

Proper heat detection, to achieve appropriate timing of insemination, is the biggest constraint on attaining a high conception rate in a dairy herd. Estrus detection is the key issue to be considered on a priority basis. Inefficient heat detection reduces the fertility status of the herd. Interventions in existing farm management practices can make estrus manifest with clarity. Manifestation of estrus is due to the effect of estrogen on the central nervous system (CNS). Standing to be ridden is the most reliable sign of estrus. Ovulation time is well estimated from standing heat. Different factors affect estrous behavior, among which feeding and management interventions are some of the most important. For improving the efficiency of heat detection, visual observation is the best method, if it is done three times a day for 30 minutes each time. However, heat detection aids, if used in combination, give better results compared to visual observation alone. Progesterone (P4) estimation in milk and ultrasound monitoring of the ovary and reproductive tract for estimation of ovulation time are other important methods. Ovulation is a very important point in dairy reproduction management. The optimum time for insemination is before the actual ovulation. It is already established that estrus detection alone contributes considerably to the reproductive status of the herd; therefore, the need of the hour is critical observation of the dairy herd to reduce the incidence of unnoticed estrus. [Vet World 2013; 6(6): 363-369]

Getting to wounded soldiers on the battlefield is a precarious task, and medics have a very high casualty rate. It is therefore of vital importance to prioritize which soldiers to attend to first. The first step is to detect life signs, i.e. whether a soldier is dead or alive, and prioritize recovery of live soldiers. The second step is to obtain vital signs from live soldiers, and use these to prioritize which are in most urgent need of attention. Our team at Kai Sensors, the University of Hawaii and the University of Florida is developing Doppler radar heart sensing technology that provides the means to detect life signs, respiration and/or heartbeat, at a distance, even for subjects lying motionless, e.g., unconscious subjects, wearing body armor, and hidden from direct view. Since this technology can deliver heart rate information with high accuracy, it may also enable the assessment of a subject's physiological and psychological state based on heart rate variability (HRV) analysis. Thus, the degree of a subject's injury may also be determined. The software and hardware developments and challenges for life signs detection and monitoring for battlefield triage will be discussed, including heart signal detection from all four sides of the human body, detection in the presence of body armor, and the feasibility of HRV parameter extraction.

Numerous gel-based and nongel-based technologies are used to detect protein changes potentially associated with disease. The raw data, however, are abundant with technical and structural complexities, making statistical analysis a difficult task. Low-level analysis issues (including normalization, background correction, gel and/or spectral alignment, feature detection, and image registration) are substantial problems that need to be addressed, because any higher-level data analyses are contingent on appropriate and statistically sound low-level procedures. Feature detection approaches are particularly interesting due to the increased computational speed associated with subsequent calculations. Such summary data corresponding to image features provide a significant reduction in overall data size and structure while retaining key information. In this paper, we focus on recent advances in feature detection as a tool for preprocessing proteomic data. This work highlights existing and newly developed feature detection algorithms for proteomic datasets, particularly relating to time-of-flight mass spectrometry and two-dimensional gel electrophoresis. Note, however, that the associated data structures (i.e., spectral data, and images containing spots) used as input for these methods are obtained via all gel-based and nongel-based methods discussed in this manuscript, and thus the discussed methods are likewise applicable.
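
The feature-detection idea above, reducing a full spectrum to a short list of peak locations, can be sketched minimally. This is a generic local-maximum peak finder for illustration, far simpler than the published algorithms the review surveys; the threshold value and data are invented.

```python
def detect_peaks(spectrum, threshold):
    """Flag interior local maxima above a noise threshold as candidate features.
    Returns the indices of the detected peaks."""
    peaks = []
    for i in range(1, len(spectrum) - 1):
        if (spectrum[i] > threshold
                and spectrum[i] >= spectrum[i - 1]
                and spectrum[i] > spectrum[i + 1]):
            peaks.append(i)
    return peaks

# A toy intensity trace with two peaks above the noise floor.
spectrum = [1, 2, 9, 3, 1, 1, 7, 2, 1]
peaks = detect_peaks(spectrum, threshold=5)
```

A nine-point trace collapses to two indices, which is the data-size reduction the paragraph describes: downstream statistics then operate on the peak list rather than the raw spectrum.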

Purpose of review: As patients with impaired kidney function are at increased risk not only for progressive renal function loss, but also for cardiovascular disease, it is important to have accurate techniques to screen patients for the presence of impaired kidney function. Recent findings: Glo

The studies reported in this thesis were performed to answer the central question: can intracoronary thermography be used for vulnerable plaque detection? To answer this question, we have identified parameters that influence intracoronary thermography measurements, and have studied to w

A laser photoacoustic instrument detects small amounts of oxidation in polymers. The instrument is used to evaluate the resistance to oxidation in sunlight of polymer encapsulants for solar-cell arrays. With the instrument, researchers monitor samples for early stages of photooxidation and study the primary mechanisms of oxidation and degradation, effects that are masked during later stages.

Sensing technology has been developed for the detection of gases in environmental, industrial, medical, and scientific applications. The main task of this work is to enhance the performance of gas sensors, taking into account their different applications and operating scenarios. This paper presents descriptions, comparisons, and recent progress in some existing gas sensing technologies. A detailed introduction to optical sensing methods is presented. In a general way, various other kinds of sensors, such as catalytic, thermal conductivity, electrochemical, semiconductor, and surface acoustic wave sensors, are also presented. Furthermore, this paper focuses on the performance of optical methods in detecting biomarkers in exhaled air. Some examination results of the constructed devices are discussed. The devices operate on the basis of enhanced-cavity and wavelength modulation spectroscopies. The experimental data were used to analyse the applicability of these different sensing technologies in medical screening. Several suggestions related to future development are also discussed.

Faced with imminent spectrum scarcity largely due to inflexible licensed band arrangements, cognitive radio (CR) has been proposed to facilitate higher spectrum utilization by allowing cognitive users (CUs) to access the licensed bands without causing harmful interference to primary users (PUs). To achieve this without the aid of PUs, the CUs have to perform spectrum sensing to reliably detect the presence or absence of PU signals. Without reliable spectrum sensing, the discovery of spectrum ...

This paper reviews the works found in the literature in the field of Transportation Mode Detection (TMD), a subfield of activity recognition that aims at identifying (i.e. classifying) the means of transportation a person is using. The solutions found in the literature have different characteristics according to the device for which the solution was tailored (smartphones or other systems such as, e.g., GPS loggers) and to the algorithm used for the classification task. This may vary a lot according to the number and type of inputs used (e.g. accelerations, GPS, map information or GIS, Geographical Information System, information) and to the identified classes of transportation mode. These two aspects are the most relevant to consider when evaluating and comparing the accuracies claimed by each work. A comparison of the works is proposed taking into account the characteristics discussed above. In general, the accelerometer is the most widely used sensor for TMD applications, as it limits battery consumption and captures features relevant for detecting motion. Indeed, a key challenge in TMD is to distinguish different motorized classes such as bus, car, train and metro, because they share common characteristics (such as, e.g., average speed and accelerations) which make it hard to identify suitable features for the classification algorithm. Identifying the "walk" and "stationary" transportation modes is a simpler task because they are characterized by distinct features.
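
The claim that "walk" and "stationary" separate easily on accelerometer features can be illustrated with a toy sketch. The features (windowed mean and standard deviation of acceleration magnitude) are typical of the surveyed works, but the threshold value and rule below are invented for illustration; real TMD systems train a classifier instead of hand-picking a cut-off.

```python
import math

def motion_features(ax, ay, az):
    """Mean and standard deviation of acceleration magnitude over a window
    of 3-axis samples (units: m/s^2)."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in zip(ax, ay, az)]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    return mean, math.sqrt(var)

def classify(std, walk_threshold=0.5):
    """Crude illustrative rule: large magnitude variance suggests walking,
    near-zero variance suggests a stationary device."""
    return "walk" if std > walk_threshold else "stationary"

# Stationary window: only gravity (~9.8 m/s^2) on one axis, no variation.
_, std_still = motion_features([0.0] * 10, [0.0] * 10, [9.8] * 10)
label = classify(std_still)
```

The hard motorized cases (bus vs car vs train) fail exactly because their magnitude statistics overlap, which is why the review emphasizes finding more discriminative features for those classes.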

This paper focuses on machine learning techniques for real-time detection. Although many supervised learning techniques have been described in the literature, no technique always performs best. Several comparative studies are available, but have not always been performed carefully, leading to invalid

Measurements of the top surface vibration of a buried (inert) VS 2.2 anti-tank plastic landmine reveal significant resonances in the frequency range between 80 and 650 Hz. Resonances from measurements of the normal component of the acoustically induced soil surface particle velocity (due to sufficient acoustic-to-seismic coupling) have been used in detection schemes. Since the interface between the top plate and the soil responds nonlinearly to pressure fluctuations, characteristics of landmines, the soil, and the interface are rich in nonlinear physics and allow for a method of buried landmine detection not previously exploited. Tuning curve experiments (revealing "softening" and a back-bone curve linear in particle velocity amplitude versus frequency) help characterize the nonlinear resonant behavior of the soil-landmine oscillator. The results appear to exhibit the characteristics of nonlinear mesoscopic elastic behavior, which is explored. When two primary waves f1 and f2 drive the soil over the mine near resonance, a rich spectrum of nonlinearly generated tones is measured with a geophone on the surface over the buried landmine in agreement with Donskoy [SPIE Proc. 3392, 221-217 (1998); 3710, 239-246 (1999)]. In profiling, particular nonlinear tonals can improve the contrast ratio compared to using either primary tone in the spectrum.

Myxomycetes are organisms characterized by a life cycle that includes a fruiting body stage. Myxomycete fruiting bodies contain spores, and wind dispersal of the spores is considered important for this organism to colonize new areas. In this study, the presence of airborne myxomycetes and the temporal changes in the myxomycete composition of atmospheric particles (aerosols) were investigated with a polymerase chain reaction (PCR)-based method for Didymiaceae and Physaraceae. Twenty-one aerosol samples were collected on the roof of a three-story building located in Sapporo, Hokkaido Island, northern Japan. PCR analysis of DNA extracts from the aerosol samples indicated the presence of airborne myxomycetes in all the samples, except for the one collected during the snowfall season. Denaturing gradient gel electrophoresis (DGGE) analysis of the PCR products showed seasonally varying banding patterns. The detected DGGE bands were subjected to sequence analyses, and four out of nine obtained sequences were identical to those of fruiting body samples collected in Hokkaido Island. It appears that the difference in the fruiting period of each species was correlated with the seasonal changes in the myxomycete composition of the aerosols. Molecular evidence shows that newly formed spores are released and dispersed in the air, suggesting that wind-driven dispersal of spores is an important process in the life history of myxomycetes. This study is the first to detect airborne myxomycetes with the use of molecular ecological analyses and to characterize their seasonal distribution.

Several km-scale gravitational-wave detectors have been constructed worldwide. These instruments combine a number of advanced technologies to push the limits of precision length measurement. The core devices are laser interferometers of a new kind; developed from the classical Michelson topology, these interferometers integrate additional optical elements, which significantly change the properties of the optical system. Much of the design and analysis of these laser interferometers can be performed using well-known classical optical techniques; however, the complex optical layouts provide a new challenge. In this review, we give a textbook-style introduction to the optical science required for the understanding of modern gravitational wave detectors, as well as other high-precision laser interferometers. In addition, we provide a number of examples for a freely available interferometer simulation software and encourage the reader to use these examples to gain hands-on experience with the discussed optical methods.

In the current business scenario, critical data must be shared and transferred by organizations to many stakeholders in order to complete particular tasks. Critical data include intellectual property, patient information, etc. Activities such as the sharing and transferring of such critical data carry threats including leakage of information, misuse of data, illegal access to data, and/or alteration of data. It is necessary to deal with this problem efficiently and effectively; popular solutions include the use of firewalls, data loss prevention tools, and watermarking. But sometimes the culprit succeeds in overcoming such security measures; hence, if an organization is able to identify the guilty client responsible for the leakage of particular data, the risk of data leakage is reduced. Many systems have been proposed for this purpose; this paper includes information about the techniques discussed in some of these methodologies.

In satellite and airborne remote sensing, hyperspectral techniques have become a very powerful tool, due to the possibility of rapidly producing chemical/mineralogical maps of the studied areas. Many studies are trying to customize the algorithms to identify several geophysical soil properties. The specific objective of this study is to investigate those soil characteristics, such as clay mineral content, influencing degradation processes (soil erosion and shallow landslides), by means of correlation analysis, in order to examine the possibility of predicting the selected property using high-resolution reflectance spectra and images. The study area is located in the Mugello basin, about 30 km north of Firenze (Tuscany, Italy). Agriculturally suitable terrains are assigned mainly to annual crops, marginally to olive groves, vineyards and orchards. Soils mostly belong to the Regosols and Cambisols orders. An ASD FieldSpec spectroradiometer was used to obtain reflectance spectra from about 80 dried, crushed and sieved samples under controlled laboratory conditions. Samples were collected simultaneously with the flight of the SIM.GA hyperspectral camera from Selex Galileo, over an area of about 5 km2, and their positions were recorded with a differential GPS. The quantitative determination of clay mineral content was performed by means of XRD and Rietveld refinement. Different chemometric techniques were preliminarily tested to correlate mineralogical records with reflectance data. A one-component partial least squares regression model yielded a preliminary R2 value of 0.65. A slightly better result was achieved by plotting the absorption peak depth at 2210 versus total clay content (band-depth analysis). The complete SIM.GA hyperspectral geocoded raw dataset, with an approximate pixel resolution of 0.6 m (VNIR) and 1.2 m (SWIR), was first transformed into at-sensor radiance values, by applying calibration coefficients and parameters from laboratory measurements to non
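
The band-depth analysis mentioned above is a simple continuum-removal calculation; the sketch below illustrates it with invented wavelengths and reflectance values (only the 2210 band centre comes from the abstract, and the shoulder positions and reflectances are hypothetical).

```python
def band_depth(wavelengths, reflectance, left, center, right):
    """Continuum-removed absorption depth at `center`:
    depth = 1 - R_center / R_continuum, where the continuum is a
    straight line between the two shoulder wavelengths left and right."""
    def r_at(w):
        return reflectance[wavelengths.index(w)]
    # Linearly interpolate the continuum at the band centre.
    frac = (center - left) / (right - left)
    continuum = r_at(left) + frac * (r_at(right) - r_at(left))
    return 1.0 - r_at(center) / continuum

# Hypothetical spectrum: shoulders at 2130 and 2290, absorption at 2210.
wl = [2130, 2210, 2290]
refl = [0.40, 0.30, 0.40]
depth = band_depth(wl, refl, left=2130, center=2210, right=2290)
```

Plotting this single depth value against the XRD-derived clay content for each sample is the "band-depth analysis" regression the abstract compares against the partial least squares model.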

Networking systems and individual applications have traditionally been defended using signature-based tools that protect the perimeter, often to the detriment of service, performance, and information flow. These tools require knowledge of both the system on which they run and the attack they are preventing. As such, by their very definition, they only account for what is known to be malicious and ignore the unknown. The unknown, or zero-day threat, can occur when defenses have yet to be immunized via a signature or other identifier of the threat. In environments where execution of the mission is paramount, the networks and applications must perform their function of information delivery without endangering the enterprise or losing the salient information, even when facing zero-day threats. In this paper, we describe a new defensive strategy that provides a means to more deliberately balance the oft mutually exclusive aspects of protection and availability. We call this new strategy Protection without Detection, since it focuses on network protection without sacrificing information availability. The current instantiation analyzes the data stream in real time as it passes through an in-line device. Critical files are recognized, and mission-specific trusted templates are applied as they are forwarded to their destination. The end result is a system which eliminates the opportunity for propagation of malicious or unnecessary payloads via the various containers that are inherent in the definition of standard file types. In some cases, this method sacrifices features or functionality that are typically inherent in these files. However, with the flexibility of the template approach, inclusion or exclusion of these features becomes a deliberate choice of the mission owners, based on their needs and amount of acceptable risk. The paper concludes with a discussion of future extensions and applications.

An oil-in-water microemulsion containing a luminescent dye, nile red, has been synthesised using a solvent-diffusion method. It has been demonstrated to be effective in developing fresh latent fingermarks on porous surfaces. The working solution is made using a binary surfactant solution to create a lactescent dual organic-aqueous phase intermediate, which subsequently results in a transparent microemulsion after the organic solvent has evaporated. The solution is non-toxic and performs comparably to a previously published methanolic formulation, but at a much lower cost and with an extended shelf life. The microemulsion outperforms a previously reported aqueous nile blue formulation for the development of both charged and natural fresh fingermarks, and requires shorter exposure times for image recording.

Several dental problems can be detected using radiographs, but the main issue with radiographs is that the defects are not very prominent. In this paper, two well-known edge detection techniques have been implemented for a set of 20 radiographs, and the number of pixels in each image has been calculated. Further, a Gaussian filter has been applied to smooth the images so as to highlight the defect in the tooth. If image data are available in the form of pixels for both healthy and decayed teeth, the images can easily be compared using edge detection techniques and the diagnosis is much easier. Further, the Laplacian edge detection technique is applied to sharpen the edges of the given image. The aim is to detect discontinuities in dental radiographs compared to the original healthy tooth. Future work includes feature extraction on the images for the classification of dental problems.
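
As a rough illustration of the Laplacian step described above, the sketch below convolves a tiny synthetic "radiograph" with the 3×3 Laplacian kernel; a discontinuity produces a strong response while uniform regions respond weakly. The image values are invented for illustration.

```python
# Minimal sketch of Laplacian edge detection on a grayscale image,
# represented here as a list of lists; pixel values are illustrative.

LAPLACIAN = [[0, 1, 0],
             [1, -4, 1],
             [0, 1, 0]]

def convolve(image, kernel):
    """Valid-mode 2-D convolution (no padding)."""
    h, w = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(h - kh + 1):
        row = []
        for j in range(w - kw + 1):
            acc = sum(kernel[a][b] * image[i + a][j + b]
                      for a in range(kh) for b in range(kw))
            row.append(acc)
        out.append(row)
    return out

# A flat region with one bright vertical stripe: responses flag the stripe
image = [[0, 0, 9, 0, 0]] * 5
edges = convolve(image, LAPLACIAN)
```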

A significant body of research on advanced techniques for automated freeway incident detection has been conducted at the University of California, Irvine (UCI). Such advanced pattern recognition techniques as artificial neural networks (ANNs) have been thoroughly investigated and their potential superiority to other techniques has been demonstrated. Of the investigated ANN architectures, two have shown the best potential for real-time implementation: namely, the Probabilistic Neural Network (...

Pickling cucumbers are susceptible to damage during harvest and postharvest handling and processing. While it is easier to detect external defects, it is difficult to detect internal defects such as bruises and hollow or split cucumbers. Hyperspectral imaging technique under transmittance mode was i...

Many driver drowsiness detection systems have been developed using eye and face detection methods, but in this paper we advance previous systems by adding the concept of a head position technique. Until now, none of the systems developed has used the head position of the driver to detect drowsiness; every paper has focused only on face and eye detection. The head position technique is a newly introduced feature that enhances the performance of the system to a great extent. The system captures frames and detects the face and eyes using HAAR-like classifiers; if the face is detected and the eyes are closed, the head position is monitored for the next few frames, and if the face is moving down gradually and continuously, the alarm is activated.
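
The frame-level decision logic described above might be sketched as a small state machine, assuming that face/eye detection with HAAR-like classifiers has already produced per-frame observations; the function name, observation format, and thresholds are all illustrative assumptions, not the paper's implementation.

```python
# Sketch of the frame-level decision logic: trigger the alarm when the
# eyes stay closed AND the face position moves down gradually and
# continuously over several consecutive frames. Thresholds are invented.

def drowsiness_alarm(frames, closed_frames=3, drop_per_frame=2):
    """frames: list of (eyes_closed: bool, face_y: int) per-frame
    observations, with face_y increasing as the head drops."""
    streak = 0
    prev_y = None
    for eyes_closed, face_y in frames:
        descending = prev_y is not None and face_y >= prev_y + drop_per_frame
        if eyes_closed and (streak == 0 or descending):
            streak += 1
        else:
            streak = 0
        prev_y = face_y
        if streak >= closed_frames:
            return True
    return False
```

Eyes closed with the head steadily dropping triggers the alarm; eyes closed with a steady head position does not, which is the distinction the head-position feature adds.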

Studies were performed to demonstrate the capability to detect planetary gear and bearing faults in helicopter main-rotor transmissions. The work supported the Operations Support and Sustainment (OSST) program with the U.S. Army Aviation Applied Technology Directorate (AATD) and Bell Helicopter Textron. Vibration data from the OH-58C planetary system were collected on a healthy transmission as well as with various seeded-fault components. Planetary fault detection algorithms were used with the collected data to evaluate fault detection effectiveness. Planet gear tooth cracks and spalls were detectable using the vibration separation techniques. Sun gear tooth cracks were not discernibly detectable from the vibration separation process. Sun gear tooth spall defects were detectable. Ring gear tooth cracks were only clearly detectable by accelerometers located near the crack location or directly across from the crack. Enveloping provided an effective method for planet bearing inner- and outer-race spalling fault detection.

Graphical abstract: -- Highlights: •This is the first review paper focused on the analytical techniques for droplet-based microfluidics. •We summarize the analytical methods used in droplet-based microfluidic systems. •We discuss the advantages and disadvantages of each method through its applications. •We also discuss the future development direction of analytical methods for droplet-based microfluidic systems. -- Abstract: In the last decade, droplet-based microfluidics has undergone rapid progress in the fields of single-cell analysis, digital PCR, protein crystallization and high-throughput screening. It has proved to be a promising platform for performing chemical and biological experiments with ultra-small volumes (picoliter to nanoliter) and ultra-high throughput. The ability to analyze the content of droplets qualitatively and quantitatively is playing an increasing role in the development and application of droplet-based microfluidic systems. In this review, we summarize the analytical detection techniques used in droplet systems and discuss the advantages and disadvantages of each technique through its applications. The analytical techniques covered in this paper include bright-field microscopy, fluorescence microscopy, laser-induced fluorescence, Raman spectroscopy, electrochemistry, capillary electrophoresis, mass spectrometry, nuclear magnetic resonance spectroscopy, absorption detection, chemiluminescence, and sample pretreatment techniques. The importance of analytical detection techniques in enabling new applications is highlighted. We also discuss the future development direction of analytical detection techniques for droplet-based microfluidic systems.

In 1986 we presented a rationale for the detection of bulk explosives based on nuclear techniques that addressed the requirements of civil aviation security in the airport environment. Since then, efforts have intensified to implement a system based on thermal neutron activation (TNA), with new work developing in fast neutron and energetic photon reactions. In this paper we will describe these techniques and present new results from laboratory and airport testing. Based on preliminary results, we contended in our earlier paper that nuclear-based techniques did provide sufficiently penetrating probes and distinguishable detectable reaction products to achieve the FAA operational goals; new data have supported this contention. The status of nuclear-based techniques for the detection of bulk explosives presently under investigation by the US Federal Aviation Administration (FAA) is reviewed. These include thermal neutron activation (TNA), fast neutron activation (FNA), the associated particle technique, nuclear resonance absorption, and photoneutron activation. The results of comprehensive airport testing of the TNA system performed during 1987-88 are summarized. From a technical point of view, nuclear-based techniques now represent the most comprehensive and feasible approach for meeting the operational criteria of detection, false alarms, and throughput. 9 refs., 5 figs., 2 tabs.

Diverse patterns in web data, commonly referred to as web outliers, exceptional cases, or noise, exist in many real-world databases. Outliers are data objects with characteristics that differ from those of other data objects. A formal definition of outliers is given by D. Hawkins: "An outlier is an observation that deviates so much from other observations as to arouse suspicion that it was generated by a different mechanism." Detection of such outliers (outlier mining) is important for numerous applications, such as detecting criminal activities in e-commerce, video surveillance, weather prediction, intrusion detection, and pharmaceutical research. This paper focuses on a comparative study of various outlier detection techniques.
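
Hawkins' definition can be illustrated with one of the simplest detection rules covered in such comparisons, a z-score test; this is a generic stand-in for the surveyed techniques, and the data are invented.

```python
# Minimal Hawkins-style outlier rule: flag observations whose deviation
# from the sample mean exceeds a z-score threshold.
import math

def zscore_outliers(data, threshold=3.0):
    n = len(data)
    mean = sum(data) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in data) / n)
    return [x for x in data if std > 0 and abs(x - mean) / std > threshold]

readings = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 42.0]  # one injected anomaly
outliers = zscore_outliers(readings, threshold=2.0)
```

Note that on very small samples the anomaly inflates the standard deviation, which is why a threshold below the textbook 3.0 is used here; more robust rules (median/MAD, IQR fences) avoid this masking effect.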

Huanglongbing (HLB) is the most destructive disease of citrus worldwide. Monitoring of health and detection of diseases in trees is critical for sustainable agriculture. HLB symptoms are virtually the same wherever the disease occurs. The disease is caused by Candidatus Liberibacter spp., vectored by the psyllids Diaphorina citri Kuwayama and Trioza erytreae. Electron microscopy was the first technique used for HLB detection. Nowadays, scientists are working on the development of new techniques for a rapid HLB detection, as there is no sensor commercially accessible for real-time assessment of health conditions in trees. Currently, the most widely used mechanism for monitoring HLB is exploration, which is an expensive, labor-intensive, and time-consuming process. Molecular techniques such as polymerase chain reaction are used for the identification of HLB disease, which requires detailed sampling and processing procedures. Furthermore, investigations are ongoing in spectroscopic and imaging techniques, profiling of plant volatile organic compounds, and isothermal amplification. This study recognizes the need for developing a rapid, cost-effective, and reliable health-monitoring sensor that would facilitate advancements in HLB disease detection. This paper compares the benefits and limitations of these potential methods for HLB detection.

This thesis aims to investigate the possibility of developing spectroscopic techniques for trace gas detection, with particular emphasis on their applicability to breath analysis and medical diagnostics. Whilst key breath molecules such as methane and carbon dioxide will feature throughout this work, the focus of the research is on the detection of breath acetone, a molecule strongly linked with the diabetic condition. Preliminary studies into the suitability of cavity enhanced absorption...

The problem of detecting the breathing activity of a human subject is addressed. A CW signal is used to probe the scene, and the MUSIC algorithm is exploited to detect the Doppler frequency modulation introduced by chest movements. For this particular measurement configuration, the correlation matrix is rank deficient. In order to restore the rank, two decorrelation techniques are compared using numerical data.
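
A full MUSIC implementation with decorrelation is beyond a short sketch, so the example below substitutes a much simpler single-bin DFT scan to locate the modulation frequency of a simulated chest-movement signal; the sampling rate and breathing rate are assumed values, and this is not the paper's algorithm.

```python
# Simplified stand-in for spectral line estimation: scan the DFT
# magnitude of a simulated baseband breathing signal over candidate
# frequencies and pick the peak. All parameters are illustrative.
import math

FS = 20.0          # sampling rate in Hz (assumed)
F_BREATH = 0.3     # simulated respiration rate, ~18 breaths/min

def dft_magnitude(signal, freq, fs):
    """Magnitude of the DFT of `signal` evaluated at a single frequency."""
    re = sum(s * math.cos(2 * math.pi * freq * k / fs)
             for k, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * freq * k / fs)
             for k, s in enumerate(signal))
    return math.hypot(re, im)

# 20 s of a pure chest-motion tone, then scan a grid of candidate rates
samples = [math.cos(2 * math.pi * F_BREATH * k / FS) for k in range(400)]
candidates = [0.1, 0.2, 0.3, 0.4, 0.5]
best = max(candidates, key=lambda f: dft_magnitude(samples, f, FS))
```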

Epidermal growth factor receptor (EGFR) gene mutations occur in multiple human cancers; therefore, the detection of EGFR mutations could lead to early cancer diagnosis. This study describes a novel EGFR mutation detection technique. In contrast to direct DNA sequencing detection methods, this method is based on allele-specific amplification (ASA), recombinase polymerase amplification (RPA), peptide nucleic acid (PNA), and SYBR Green I (SYBR), and is referred to as the AS-RPA-PNA-SYBR (ARPS) system. The principle of this technique rests on three continuous steps: ASA, or ASA combined with PNA, to prevent amplification of non-target sequences (even single nucleotide polymorphisms, SNPs); the rapid amplification advantage of RPA; and appropriate SYBR Green I detection (samples harboring EGFR mutations show a green signal). Using this method, the EGFR 19Del(2) mutation was detected in 5 min, while the EGFR L858R mutation was detected in 10 min. In this study, the detection of EGFR mutations in clinical samples using the ARPS system was consistent with that determined by polymerase chain reaction (PCR) and DNA sequencing methods. Thus, this newly developed methodology, which uses the ARPS system with appropriate primer sets, is a rapid, reliable, and practical way to assess EGFR mutations in clinical samples.

A 24 GHz medium-range human detecting sensor, using the Doppler Radar Physiological Sensing (DRPS) technique, which can also detect unmanned aerial vehicles (UAVs or drones), is currently under development for potential rescue and anti-drone applications. DRPS systems are specifically designed to remotely monitor small movements of non-metallic human tissues such as cardiopulmonary activity and respiration. Once optimized, the unique capabilities of DRPS could be used to detect UAVs. Initial measurements have shown that DRPS technology is able to detect moving and stationary humans, as well as largely non-metallic multi-rotor drone helicopters. Further data processing will incorporate pattern recognition to detect multiple signatures (motor vibration and hovering patterns) of UAVs.

In the field of wireless sensor networks, measurements that significantly deviate from the normal pattern of sensed data are considered outliers. The potential sources of outliers include noise and errors, events, and malicious attacks on the network. Traditional outlier detection techniques are

Cracks in concrete or cement-based materials present a great threat to civil structures; they are very dangerous and have caused a great deal of destruction and damage. Even small cracks that look insignificant can grow and may eventually lead to severe structural failure. Besides manual inspection, which is ineffective and time-consuming, several nondestructive evaluation techniques have been used for crack detection, such as ultrasonic, vibration, and strain-based techniques; however, some of the sensors used are either too large in size or limited in resolution. A high-resolution microwave imaging technique with an ultrawideband signal is proposed for crack detection in concrete structures. A combination of the delay-and-sum beamformer with full-view mounted antennas constitutes the image reconstruction algorithm. Various anomaly scenarios in cement bricks were simulated using FDTD, then constructed and measured in the lab. The reconstructed images showed a high similarity between the simulation and the experiment, with a resolution of λ/14, which enables the detection of cracks as small as 5 mm in size.
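
The delay-and-sum idea behind the image reconstruction can be sketched in one dimension: each simulated antenna trace is sampled at the round-trip delay for a candidate point, and coherent summation peaks at the true scatterer location. The geometry, pulse shape, sampling rate, and use of the free-space propagation speed are all simplifying assumptions (real concrete slows and attenuates the wave).

```python
# 1-D delay-and-sum sketch: sum each trace at the round-trip delay of a
# candidate point; the image peaks where the delays line up coherently.
import math

C = 3e8            # propagation speed (free space; concrete is slower)
FS = 1e11          # sampling rate of each trace, Hz
ANTENNAS = [0.00, 0.05, 0.10, 0.15]   # antenna x positions, metres
TRUE_X, TRUE_D = 0.07, 0.20           # scatterer position (x, depth)

def pulse(t, width=50e-12):
    """Gaussian pulse envelope centred at t = 0."""
    return math.exp(-(t / width) ** 2)

def trace(ax):
    """Simulated received trace for the antenna at x = ax."""
    delay = 2 * math.hypot(ax - TRUE_X, TRUE_D) / C
    return [pulse(k / FS - delay) for k in range(400)]

traces = [trace(ax) for ax in ANTENNAS]

def das_image(xs, depth):
    image = []
    for x in xs:
        acc = 0.0
        for ax, tr in zip(ANTENNAS, traces):
            delay = 2 * math.hypot(ax - x, depth) / C
            idx = int(round(delay * FS))
            if idx < len(tr):
                acc += tr[idx]       # sample trace at candidate delay
        image.append(acc)
    return image

xs = [0.03, 0.05, 0.07, 0.09, 0.11]
image = das_image(xs, TRUE_D)
peak_x = xs[image.index(max(image))]
```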

Improvements in defect detection and sizing capabilities for non-destructive inspection techniques have been required in order to ensure the reliable operation and life extension of nuclear power plants. For volumetric inspection, the phased array UT technique has superior capabilities for beam steering and focusing on target regions, and for real-time B-scan imaging without mechanical scanning. In contrast to the conventional UT method, high-speed inspection is realized by this unique feature of the phased array technique. A 256-channel array system has been developed for the inspection of weldments of BWR internal components such as core shrouds. The TOFD crack sizing technique can also be applied using this system. For surface inspection, potential drop techniques and eddy current techniques have been improved in combination with theoretical analysis. These techniques have crack sizing capability for surface-breaking cracks to which the UT method is difficult to apply. This paper reports the recent progress of these phased array and electromagnetic inspection techniques.
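
Beam steering with a phased array, as used above, rests on a simple delay law: each element of a linear array is fired with a time offset so that the wavefronts align along the steering angle. A minimal sketch is below; the element count, pitch, steering angle, and sound velocity are illustrative, not parameters of the described system.

```python
# Steering delay law for a linear phased array:
#   delay_i = i * pitch * sin(theta) / velocity
import math

def steering_delays(n_elements, pitch, angle_deg, velocity):
    """Firing delay (s) for each element, relative to element 0."""
    theta = math.radians(angle_deg)
    return [i * pitch * math.sin(theta) / velocity for i in range(n_elements)]

# 16 elements, 0.6 mm pitch, 30 degree steer, shear velocity in steel
delays = steering_delays(16, 0.6e-3, 30.0, 3230.0)
```

Focusing adds a quadratic correction on top of this linear ramp, which is what lets the same hardware also sweep a focal point across the weld without mechanical scanning.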

Non-volcanic tremor (NVT) is a subduction zone process often associated with the transition from stick-slip to stable sliding on the plate interface. In Northern Cascadia, NVT episodes lasting multiple weeks have been correlated spatially and temporally with slow slip episodes at a regular recurrence interval of 15±2 months. However, NVT across the entire Cascadia margin varies widely in recurrence and duration, while still other subduction zones (Japan, Mexico) observe separate cases of tremor and GPS-detected slip. Multiple identification and location techniques exist, but we will focus on techniques that can identify NVT at single stations to accommodate searching outside a dense network of instruments. This study evaluates these techniques in several regions along the entire Cascadia margin and in the Oaxaca segment of the Middle America Subduction Zone. Tremor signals are collected from a mixture of seismometers including those of temporary deployments targeting NVT, the EarthScope Transportable Array, the Canadian National Seismograph Network, and a few other regional networks that span the subduction zones. We compare existing techniques that scan moving averages, scintillation index, and hourly mean amplitudes with a new tremor frequency scanning technique that bandpass filters seismic data into three categories, 10-15 Hz, 2-5 Hz, and 0.2-0.5 Hz, where we expect prominent signals from microseismicity, NVT, and surface waves, respectively. Applying these techniques to episodes over the last few years finds that each technique can identify large, multi-week tremor events associated with GPS recorded slow slip. However, different techniques result in different totals of tremor hours detected per episode, as well as variable numbers of additional smaller episodes identified. Based on previous research that finds the amount of tremor correlated to the amount of slip from geodetic inversions, we are working towards a consensus single station approach that is

In Penning trap mass spectrometry the mass of stored ions is obtained via a determination of the cyclotron frequency (ν_c = qB/(2πm)), for which two different techniques are available. The destructive time-of-flight ion cyclotron resonance (TOF-ICR) technique, based on the measurement of the flight time of excited ions, is the established method for measurements on short-lived radionuclides. It is not ideally suited for rarely produced ion species, since typically some hundred ions are required for a single resonance spectrum. At the Penning trap mass spectrometer TRIGA-TRAP therefore a non-destructive narrow-band Fourier transform ion cyclotron resonance (FT-ICR) detection system is being developed. It is based on the detection of the image currents induced by the stored ions in the trap electrodes and will ultimately reach single ion sensitivity. TRIGA-TRAP also features broad-band FT-ICR detection for the coarse identification of the trap content. Additionally, the TOF-ICR detection system has been recently improved to utilize the Ramsey excitation technique to gain in precision, and the position information of the ion impact to further suppress background events in the final time-of-flight spectrum.
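
The quoted relation ν_c = qB/(2πm) can be evaluated directly; the field strength and mass number below are illustrative, not TRIGA-TRAP parameters.

```python
# Cyclotron frequency of a stored ion: nu_c = q * B / (2 * pi * m)
import math

E_CHARGE = 1.602176634e-19   # elementary charge, C
AMU = 1.66053906660e-27      # atomic mass unit, kg

def cyclotron_frequency(charge_state, b_tesla, mass_amu):
    """nu_c in Hz for an ion of given charge state and mass (in u)."""
    return charge_state * E_CHARGE * b_tesla / (2 * math.pi * mass_amu * AMU)

# A singly charged A = 100 ion in a 7 T field: roughly 1.08 MHz
nu_c = cyclotron_frequency(1, 7.0, 100.0)
```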

In this paper we propose a new model-based skin artifact cleaning technique with the aim of removing skin reflections effectively, without introducing significant signal distortions and without assuming a priori information on the real structure of the breast. The reference cleaning model, constituted by a two-layer skin-adipose tissue geometry, is applicable to all ultrawideband radar methods able to detect the tumor starting from knowledge of each trace recorded around the breast. All the radar signal measurements were simulated using realistic breast models derived from the University of Wisconsin computational electromagnetic laboratory database and the finite difference time domain (FDTD)-based open source software GprMax. First, we searched for the best configuration of the reference cleaning model with the aim of minimizing the distortions introduced on the radar signal. Second, the performance of the proposed cleaning technique was assessed by using a breast cancer radar detection technique based on an artificial neural network (ANN). In order to minimize the signal distortions, we found that it was necessary to use the real skin thickness and the static Debye parameters of both skin and adipose tissue. In that case the ANN-based radar approach was able to detect the tumor with an accuracy of 87%. Extending the performance assessment to the case where only average standard values are used to characterize the reference cleaning model, the detection accuracy was 84%.

An edge may be defined as a set of connected pixels that forms a boundary between two disjoint regions. Image edge detection reduces the amount of data and filters out useless information while preserving the important structural properties of an image. Edge detection plays an important role in digital image processing and in practical aspects of our daily life. In this paper we study various edge detection techniques: the Prewitt, Roberts, Sobel, LoG, and Canny operators. On comparing them, we conclude that the Canny edge detector performs better than all the other edge detectors in various respects: it is adaptive in nature, performs better on noisy images, gives sharp edges, and has a low probability of detecting false edges.
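
As a minimal sketch of the Sobel operator compared above, the gradient magnitude at a pixel combines the horizontal and vertical 3×3 kernel responses; the synthetic image is invented for illustration.

```python
# Sobel edge response: combine horizontal (gx) and vertical (gy)
# gradients into a magnitude that is large on edges, zero on flat areas.
import math

SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def gradient_at(image, i, j):
    """Gradient magnitude at pixel (i, j) using 3x3 Sobel kernels."""
    gx = sum(SOBEL_X[a][b] * image[i + a - 1][j + b - 1]
             for a in range(3) for b in range(3))
    gy = sum(SOBEL_Y[a][b] * image[i + a - 1][j + b - 1]
             for a in range(3) for b in range(3))
    return math.hypot(gx, gy)

# Dark left half, bright right half: a vertical edge down the middle
image = [[0, 0, 0, 255, 255, 255] for _ in range(6)]
on_edge = gradient_at(image, 3, 2)    # straddles the intensity step
off_edge = gradient_at(image, 3, 4)   # inside the flat bright region
```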

The tremendous development in remote sensing technology in the recent past has opened up new challenges in defence applications. One important area of such applications is target detection. This paper describes both classical and newly developed approaches to detecting targets using remotely sensed digital images. The classical approach includes statistical classification methods and image processing techniques. The new approach deals with a relatively new sensor technology, namely synthetic aperture radar (SAR) systems, and fast-developing tools such as neural networks and multisource data integration for analysis and interpretation. With SAR images, it is possible to detect targets, or features of a target, that would otherwise not be detectable. Neural networks and multisource data integration tools also have great potential for analysing and interpreting remote sensing data for target detection.

The NucleoLink surface is a physically modified, thermostable, optically clear resin. It allows the covalent binding of 5'-phosphorylated oligonucleotides. Target DNA amplification by polymerase chain reaction (PCR) is accomplished by asymmetric amplification on the covalently immobilized primer that develops into immobilized amplicons. A DNA fragment of bovine leukemia virus is used as a model system for the detection of immobilized amplicons by ELISA-like techniques. Covalently bound oligonucleotides are also utilized as capture probe in the hybridization-based signal amplification for detection of an infectious organism.

A novel inductive technique for the detection of microcantilever displacement for sensing applications is presented. We highlight the basic structure and evaluate its characteristics with the aid of modeling and simulation. Results generated by numerical simulations using ANSOFT are compared with those obtained from an equivalent circuit model using PSPICE. There are indications that the sensitivity of the inductive cantilever is about one order of magnitude higher as compared to piezoresistive silicon can...

We report on a unique detection methodology using the Berkeley Visible Image Tube (BVIT) mounted on the 10 m Southern African Large Telescope (SALT) to search for laser pulses originating in communications from advanced extraterrestrial (ET) civilizations residing on nearby Earth-like planets located within their habitable zones. The detection technique assumes that ET communicates through high-powered pulsed lasers with pulse durations on the order of 5 nanoseconds, the signals thereby being brighter than that of the host star within this very short period of time. Our technique turns down the gain of the optically sensitive photon-counting microchannel plate detector such that ~30 photons are required in a 5 ns window to generate an imaged event. Picking a priori targets with planets in the habitable zone substantially reduces the false alarm rate. Interplanetary communication by optical masers was first postulated by Schwartz and Townes in 1961. Under the assumption that ET has access to a 10 m class telescope operated as a transmitter, we could detect lasers with a power similar to that of the Livermore Laboratory laser (~1.8 MJ per pulse) out to a distance of ~1000 pc. In this talk we present the results of 2400 seconds of BVIT observations on SALT of the star Wolf 1061, which is known to harbor an Earth-sized exoplanet located in the habitable zone. At this distance (4.3 pc), BVIT on SALT could detect a 48 joule per pulse laser, now commercially available as a tabletop device.
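
The windowed-threshold idea described above can be sketched directly: photon arrival timestamps are binned into 5 ns windows and any window collecting ~30 or more photons is flagged as a candidate laser pulse. The timestamps below are simulated, and the function and constants are illustrative, not the BVIT pipeline.

```python
# Candidate pulse search: bin photon arrivals into 5 ns windows and flag
# windows whose count reaches the ~30 photon threshold described above.

WINDOW_NS = 5.0
THRESHOLD = 30

def candidate_pulses(arrival_times_ns):
    """Return start times (ns) of 5 ns windows holding >= THRESHOLD photons."""
    counts = {}
    for t in arrival_times_ns:
        w = int(t // WINDOW_NS)
        counts[w] = counts.get(w, 0) + 1
    return sorted(w * WINDOW_NS for w, c in counts.items() if c >= THRESHOLD)

# Sparse stellar background plus one dense burst near t = 500 ns
background = [float(t) for t in range(0, 1000, 40)]   # ~1 photon / 40 ns
burst = [500.0 + 0.1 * k for k in range(35)]          # 35 photons in 3.5 ns
flagged = candidate_pulses(background + burst)
```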

This research presents ultrasonic techniques for baseline-free damage detection in structures in the context of structural health monitoring (SHM). Conventional SHM methods compare signals obtained from the pristine condition of a structure (baseline signals) with those from the current state, and relate certain changes in the signal characteristics to damage. While this approach has been successful in the laboratory, there are certain drawbacks of depending on baseline signals in real field applications. Data from the pristine condition are not available for most existing structures. Even if they are available, operational and environmental variations tend to mask the effect of damage on the signal characteristics. Most important, baseline measurements may become meaningless while assessing the condition of a structure after an extreme event such as an earthquake or a hurricane. Such events may destroy the sensors themselves and require installation of new sensors at different locations on the structure. Baseline-free structural damage detection can broaden the scope of SHM in the scenarios described above. A detailed discussion on the philosophy of baseline-free damage detection is provided in Chapter 1. Following this discussion, the research questions are formulated. The organization of this document and the major contributions of this research are also listed in this chapter. Chapter 2 describes a fully automated baseline-free technique for notch and crack detection in plates using a collocated pair of piezoelectric wafer transducers for measuring ultrasonic signals. Signal component corresponding to the damage induced mode-converted Lamb waves is extracted by processing the originally measured ultrasonic signals. The damage index is computed as a function of this mode-converted Lamb wave signal component. An over-determined system of Lamb wave measurements is used to find a least-square estimate of the measurement errors. This error estimate serves as the

The future of robotics predicts that robots will integrate themselves more every day with human beings and their environments. To achieve this integration, robots need to acquire information about the environment and its objects. There is a great need for algorithms that provide robots with this sort of skill, from locating the objects needed to accomplish a task to treating those objects as information about the environment. This paper presents a way to provide mobile robots with the ability to detect objects for semantic navigation, using current trends in robotics in a way that can be exported to other platforms. Two methods to detect objects are proposed, contour detection and a descriptor-based technique, and the two are combined to overcome their respective limitations. Finally, the code is tested on a real robot to prove its accuracy and efficiency.

Oscillation or instability is a situation that must be avoided for reliable hybrid DC/DC converters. A real-time electronic measurement technique was developed to detect catastrophic oscillations at an early stage in hybrid DC/DC converters. It is capable of identifying low-level oscillation and determining the degree of the oscillation at a unique frequency for every individual model of converter without disturbing its normal operation. This technique was developed specifically for space-used hybrid DC/DC converters, but it is also suitable for most commercial and military switching-mode power supplies. It is a weak-electronic-signal detection technique that detects hybrid DC/DC converter oscillation, which presents as a specific noise signal at the power input pins. It is based on the principles of feedback control loop oscillation and RF signal modulation, and is realized using signal power spectral analysis. On the power spectrum, the channel power amplitude at the characteristic frequency (CPcf) and the channel power amplitude at the switching frequency (CPsw) are chosen as oscillation level indicators. If the converter is stable, the CPcf is a very small pulse and the CPsw is a larger, clear, single pulse. At an early stage of oscillation, the CPcf increases to a certain level and the CPsw shows a small pair of sideband pulses around it. If the converter oscillates, the CPcf reaches a higher level and the CPsw shows more high-level sideband pulses. A comprehensive stability index (CSI) is adopted as a quantitative measure to accurately assign a degree of stability to a specific DC/DC converter. The CSI is a ratio of normal and abnormal power spectral density, and can be calculated using specified and measured CPcf and CPsw data. The novel and unique feature of this technique is the use of the channel power amplitudes at the characteristic frequency and switching frequency to evaluate stability and identify oscillations at an early stage without interfering with a DC/DC converter s
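
The channel-power indicators above amount to measuring spectral power in single frequency bins. As a sketch (not the authors' implementation), the Goertzel algorithm computes one bin's power without a full FFT; the frequencies and test signal are illustrative.

```python
# Goertzel recursion: power of a single DFT bin, used here to mimic the
# CPsw / CPcf channel-power indicators on a simulated input-ripple signal.
import math

def goertzel_power(samples, freq, fs):
    """Squared DFT magnitude of `samples` at the bin nearest `freq`."""
    n = len(samples)
    k = int(round(n * freq / fs))
    coeff = 2 * math.cos(2 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2

FS = 1e6                # sampling rate, Hz (assumed)
SWITCHING = 100e3       # CPsw bin: converter switching frequency
CHARACTERISTIC = 15e3   # CPcf bin: loop characteristic frequency

# Stable case: ripple contains only the switching tone, so the power at
# the characteristic frequency is negligible.
ripple = [math.sin(2 * math.pi * SWITCHING * k / FS) for k in range(1000)]
cp_sw = goertzel_power(ripple, SWITCHING, FS)
cp_cf = goertzel_power(ripple, CHARACTERISTIC, FS)
```

An oscillating converter would raise `cp_cf` and add sidebands around the switching bin, which is exactly the signature the CSI quantifies.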

The teratogenic and carcinogenic effects of phthalate esters on living beings have been proven in toxicology studies. These ubiquitous food and environmental pollutants pose a great danger to the human race due to their extensive use as plasticizers in the consumer product industry. Contemporary detection techniques used for phthalates require a high level of skill, expensive equipment and longer analysis times than the presented technique. The presented research work introduces a real-time, non-invasive detection technique using a new type of silicon-substrate-based planar interdigital (ID) sensor fabricated on the basis of thin-film micro-electromechanical system (MEMS) semiconductor device fabrication technology. Electrochemical impedance spectroscopy (EIS) was used in conjunction with the fabricated sensor to detect phthalates in deionized water. Various concentrations of di(2-ethylhexyl) phthalate (DEHP), from as low as 2 ppb up to 2 ppm in deionized water, were detected distinctly using the new planar ID sensor-based EIS sensing system. The dip testing method was used to obtain the conductance and dielectric properties of the bulk samples. A Parylene C polymer coating was used as a passivation layer on the surface of the fabricated sensor to reduce the influence of Faradaic currents. In addition, the inherent dielectric properties of the coating enhanced the sensitivity of the capacitive-type sensor. An electrochemical spectrum analysis algorithm was used to model the experimentally observed impedance spectrum and deduce a constant phase element (CPE) equivalent circuit to analyse the kinetic processes taking place inside the electrochemical cell. A curve fitting technique was used to extract the values of the circuit components and explain the experimental results on theoretical grounds. The sensor performance was tested by adding DEHP to an energy drink at concentrations above and below the minimal risk level (MRL) limit set by the ATSDR (Agency for Toxic Substances and Disease Registry).
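The CPE equivalent-circuit step can be illustrated with a short sketch. The Randles-style topology, the component values and the probe frequencies below are hypothetical stand-ins, not the circuit actually fitted in the study:

```python
def cpe_impedance(omega, q, n):
    """Constant phase element: Z = 1 / (Q * (j*omega)**n); n = 1 is an ideal capacitor."""
    return 1.0 / (q * (1j * omega) ** n)

def cell_impedance(omega, r_s, r_ct, q, n):
    """Illustrative Randles-like cell: solution resistance r_s in series with
    (charge-transfer resistance r_ct in parallel with a CPE)."""
    z_cpe = cpe_impedance(omega, q, n)
    return r_s + (r_ct * z_cpe) / (r_ct + z_cpe)

# At low frequency the cell looks like r_s + r_ct; at high frequency the CPE
# shorts out r_ct and only r_s remains.
z_low = cell_impedance(0.01, r_s=100, r_ct=10_000, q=1e-6, n=0.9)
z_high = cell_impedance(1e6, r_s=100, r_ct=10_000, q=1e-6, n=0.9)
print(abs(z_low) > abs(z_high))   # prints True
```

Curve fitting in practice would adjust r_s, r_ct, q and n so that this model matches the measured spectrum across all frequencies.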

RDBMS is the heart of both OLTP and OLAP types of applications. For both types of applications, thousands of queries expressed in SQL are executed on a daily basis. All commercial DBMS engines capture various attributes about these executed queries in system tables. These queries need to conform to best practices and need to be tuned to ensure optimal performance. While we use checklists, and often tools to enforce them, a black-box technique that profiles the queries and detects outliers is not employed for a summary-level understanding. This is the motivation of the paper, as such a technique not only points out inefficiencies built into the system, but also has the potential to reveal evolving best practices and inappropriate usage. This can certainly reduce latency in information flow and improve utilization of hardware and software capacity. In this paper we start by formulating the problem. We explore four outlier detection techniques. We apply these techniques over a rich corpus of production queries ...
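The abstract does not name its four outlier detection techniques, but one common choice for query metrics pulled from system tables is the interquartile-range rule, sketched here on invented per-query elapsed times:

```python
def iqr_outliers(values, k=1.5):
    """Flag values outside [Q1 - k*IQR, Q3 + k*IQR] -- one of many possible techniques."""
    xs = sorted(values)
    def quantile(q):
        pos = q * (len(xs) - 1)
        lo = int(pos)
        frac = pos - lo
        return xs[lo] + frac * (xs[min(lo + 1, len(xs) - 1)] - xs[lo])
    q1, q3 = quantile(0.25), quantile(0.75)
    iqr = q3 - q1
    lo_bound, hi_bound = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if v < lo_bound or v > hi_bound]

# Hypothetical per-query elapsed times (ms) captured by the DBMS engine.
elapsed_ms = [12, 15, 11, 14, 13, 16, 12, 950]
print(iqr_outliers(elapsed_ms))   # → [950]
```

The one 950 ms query stands out against the tightly clustered majority, which is exactly the "summary-level understanding" the paper argues for.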

Cavitation phenomena are known for their destructive capacity in hydraulic machinery and are caused by a pressure decrease followed by an implosion when the cavitation bubbles encounter an adverse pressure gradient. A helical vortex appears in the turbine diffuser cone at partial flow rate operation and can cavitate in its core. Cavity volumes and vortex frequencies vary with the under-pressure level. If the vortex frequency comes close to one of the eigenfrequencies of the turbine, a resonance phenomenon may occur; the unsteady fluctuations can be amplified and lead to significant damage to the turbine and the hydraulic circuit. Conventional cavitation vortex detection techniques are based on passive devices (pressure sensors or accelerometers). Limited sensor bandwidths and low frequency response limit the vortex detection and characterization information provided by the passive techniques. In order to go beyond these techniques and develop a new active one that removes these drawbacks, previous work in the field has shown that techniques based on acoustic signals, using signal content adapted to the particular hydraulic situation, can be more robust and accurate. The cavitation vortex effects in the water flow profile downstream of the hydraulic turbine runner are responsible for modifications of the signal content. Basic signal techniques use narrow-band signals traveling inside the flow from an emitting transducer to a receiving one (active sensors). Emitting wide-band signals in the flow during the onset and development of the vortex embeds changes in the received signals. Signal processing methods are used to estimate the onset and evolution of cavitation. Tests done in a reduced-scale facility showed that, with increasing flow rate, the signal-vortex interaction is seen as modifications of the received signal's high-order statistics and bandwidth. Wide-band acoustic transducers have a higher dynamic range than mechanical elements; the system's reaction time ...

Faults such as misalignment, rotor cracks and rotor-to-stator rub can exist collectively in rotor bearing systems. It is an important task for rotor dynamics personnel to monitor and detect faults in rotating machinery. In this paper, the rotor startup vibrations are utilized to solve the fault identification problem using time-frequency techniques. Numerical simulations are performed through finite element analysis of the rotor bearing system with individual faults and collective combinations of the faults mentioned above. Three signal processing tools, namely the Short Time Fourier Transform (STFT), the Continuous Wavelet Transform (CWT) and the Hilbert Huang Transform (HHT), are compared to evaluate their detection performance. The effect of added noise (varying signal-to-noise ratio, SNR) on the three time-frequency techniques is presented. The comparative study focuses on detecting the least possible level of induced fault and on the computational time consumed. The computation time consumed by HHT is much less than that of CWT-based diagnosis. However, for noisy data CWT is preferred over HHT. To identify fault characteristics using wavelets, a procedure to adjust the resolution of the mother wavelet is presented in detail. Experiments are conducted to obtain the run-up data of a rotor bearing setup for diagnosis of shaft misalignment and rotor-stator rubbing faults.
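As a minimal illustration of the first of the three tools, an STFT can be computed with a plain windowed FFT. The chirp parameters and window sizes below are arbitrary stand-ins for run-up vibration data, not the rotor signals used in the paper:

```python
import numpy as np

def stft_mag(x, fs, win=256, hop=128):
    """Magnitude STFT via Hann-windowed FFT frames (minimal sketch)."""
    w = np.hanning(win)
    frames = [x[i:i + win] * w for i in range(0, len(x) - win + 1, hop)]
    mags = np.abs(np.fft.rfft(np.array(frames), axis=1))
    freqs = np.fft.rfftfreq(win, d=1.0 / fs)
    return freqs, mags

# A run-up-like chirp: instantaneous frequency grows with time, roughly as a
# rotor speeds up during startup.
fs = 2048
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * (20 * t + 40 * t ** 2))   # sweeps upward from 20 Hz
freqs, mags = stft_mag(x, fs)
first_peak = freqs[np.argmax(mags[0])]
last_peak = freqs[np.argmax(mags[-1])]
print(first_peak < last_peak)   # dominant frequency rises across the run-up
```

CWT and HHT would resolve the same sweep with different time-frequency trade-offs, which is precisely what the paper's comparison quantifies.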

This work presents the theory and a numerical validation study in support of a novel method for a priori identification of fatigue-critical regions, with the aim of accelerating durability design in large FEM problems. The investigation is placed in the context of modern full-body structural durability analysis, where a computationally intensive dynamic solution may be required to identify areas with potential for fatigue damage initiation. The early detection of fatigue-critical areas can drive a simplification of the problem size, leading to a considerable improvement in solution time and model handling while allowing the critical areas to be processed in higher detail. The proposed technique is applied to a real-life industrial case in a comparative assessment with established practices. Synthetic damage prediction quantification and visualization techniques allow for a quick and efficient comparison between methods, outlining potential application benefits and boundaries.

Alterations in neurotransmission have been implicated in numerous neurodegenerative and neuropsychiatric disorders, including Alzheimer disease, Parkinson disease, epilepsy, and schizophrenia. Unfortunately, few techniques support the measurement of real-time changes in neurotransmitter levels over multiple days, as is essential for ethologic and pharmacodynamic testing. Microdialysis is commonly used for these research paradigms, but its poor temporal and spatial resolution makes this technique inadequate for measuring the rapid dynamics (milliseconds to seconds) of fast signaling neurotransmitters, such as glutamate and acetylcholine. Enzymatic microelectrode arrays (biosensors) coupled with electrochemical recording techniques have demonstrated fast temporal resolution (less than 1 s), excellent spatial resolution (micron-scale), low detection limits (≤200 nM), and minimal damage (50 to 100 μm) to surrounding brain tissue. Here we discuss the benefits, methods, and animal welfare considerations of using platinum microelectrodes on a ceramic substrate for enzyme-based electrochemical recording techniques for real-time in vivo neurotransmitter recordings in both anesthetized and awake, freely moving rodents.

Users and organizations find it continuously challenging to deal with distributed denial of service (DDoS) attacks. The security engineer works to keep a service available at all times by dealing with intruder attacks. The intrusion-detection system (IDS) is one of the solutions for detecting and classifying any anomalous behavior. The IDS should always be updated with the latest intruder attack deterrents to preserve the confidentiality, integrity and availability of the service. In this paper, a new dataset is collected because there were no common datasets containing modern DDoS attacks in different network layers, such as SIDDoS and HTTP Flood. This work incorporates three well-known classification techniques: Multilayer Perceptron (MLP), Naïve Bayes and Random Forest. The experimental results show that MLP achieved the highest accuracy rate (98.63%).
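One of the three classifiers can be sketched in a few lines. The toy flow features (packets per second, mean packet size) and labels below are invented for illustration and are not drawn from the collected dataset:

```python
import math
from collections import defaultdict

class GaussianNB:
    """Tiny Gaussian Naive Bayes -- a stand-in for the Naive Bayes classifier used."""
    def fit(self, X, y):
        groups = defaultdict(list)
        for row, label in zip(X, y):
            groups[label].append(row)
        self.stats, self.priors = {}, {}
        for label, rows in groups.items():
            stats = []
            for col in zip(*rows):
                mu = sum(col) / len(col)
                var = max(sum((v - mu) ** 2 for v in col) / len(col), 1e-9)
                stats.append((mu, var))
            self.stats[label] = stats
            self.priors[label] = len(rows) / len(X)
        return self

    def predict(self, x):
        def log_like(label):
            ll = math.log(self.priors[label])
            for v, (mu, var) in zip(x, self.stats[label]):
                ll += -0.5 * math.log(2 * math.pi * var) - (v - mu) ** 2 / (2 * var)
            return ll
        return max(self.priors, key=log_like)

# Hypothetical flows: label 1 = flood traffic (many small packets), 0 = benign.
X = [[10, 500], [12, 480], [11, 520], [900, 60], [950, 64], [880, 58]]
y = [0, 0, 0, 1, 1, 1]
clf = GaussianNB().fit(X, y)
print(clf.predict([920, 61]), clf.predict([11, 505]))   # → 1 0
```

MLP and Random Forest would be trained and scored on the same feature matrix, and the paper's result is simply that MLP wins this comparison on their dataset.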

The identification of the exact location of the heatings that occur in often inaccessible locations several hundred meters deep in goaf areas is key to allowing effective control measures to be taken. The radon technique provides the only solution for remotely locating underground spontaneous combustion (sponcom) from the surface. The results from the Chaili mine indicate that a total area of 66 000 m² was surveyed and the exact locations of spontaneous combustion were detected. This has enabled the successful implementation of various control measures against spontaneous combustion.

Objective: To study the application of the DNA microarray technique for screening and identifying multiple food-borne pathogens. Methods: Oligonucleotide probes were designed with Clustal X and Oligo 6.0 at the conserved regions of specific genes of multiple food-borne pathogens, and then validated by bioinformatic analyses. The 5' end of each probe was modified with an amino group and 10 poly-T, and the optimized probes were synthesized and spotted on aldehyde-coated slides. The bacterial DNA template incubated with Klenow enzyme was amplified by arbitrarily primed PCR, and PCR products incorporating aminoallyl-dUTP were coupled with fluorescent dye. After hybridization of the purified PCR products with the DNA microarray, the hybridization image and fluorescence intensity analysis were acquired with ScanArray and GenePix Pro 5.1 software. A series of detection conditions, such as the arbitrarily primed PCR and microarray hybridization, were optimized. The specificity of this approach was evaluated with DNA from 16 different bacteria, and the sensitivity and reproducibility were verified with DNA from 4 food-borne pathogens. Samples of mixed bacterial DNA and simulated water samples of Shigella dysenteriae were tested. Results: Nine different food-borne bacteria were successfully discriminated under the same conditions. The sensitivity for genomic DNA was 10²-10³ pg/μl, and the coefficient of variation (CV) for the reproducibility of the assay was less than 15%. The corresponding specific hybridization maps of the mixed bacterial DNA samples were obtained, and the detection limit for the simulated water sample of Shigella dysenteriae was 3.54×10⁵ cfu/ml. Conclusions: The DNA microarray detection system based on arbitrarily primed PCR can be employed for effective detection of multiple food-borne pathogens, and this assay may offer a new high-throughput platform for detecting bacteria.

Discharge of oil into the sea is, among technological hazards, one of the most dangerous for the maritime environment. In recent years maritime transport and exploitation of marine resources have continued to increase; as a result, tanker accidents are nowadays increasingly frequent, continuously threatening maritime security and safety. Satellite remote sensing can contribute in multiple ways, in particular for early warning and real-time (or near real-time) monitoring. Several satellite techniques exist, mainly based on the use of SAR (Synthetic Aperture Radar) technology, which are able to recognise, with sufficient accuracy, oil spills discharged into the sea. Unfortunately, such methods cannot be profitably used for real-time detection, because of the low observational frequency assured by present satellite platforms carrying SAR sensors (the mean repetition rate is on the order of 30 days). On the other hand, the potential of optical sensors aboard meteorological satellites has not yet been fully exploited, and no reliable techniques have so far been developed for this purpose. The main limit of the proposed techniques lies in the "fixed threshold" approach, which makes such techniques difficult to implement without operator supervision and, generally, without independent information on the oil spill presence that could drive the choice of the best threshold. A different methodological approach (RAT, Robust AVHRR Techniques) was proposed by Tramutoli (1998) and has already been successfully applied to several natural and environmental emergencies related to volcanic eruptions, forest fires and seismic activity. In this paper its extension to near real-time detection and monitoring of oil spills by means of NOAA-AVHRR (Advanced Very High Resolution Radiometer) records will be described. Briefly, the RAT approach is an automatic change-detection scheme that considers a satellite image as a space-time process, described at each place (x,y) and time t, by the value of ...

AIM: To evaluate the value of postprocessing techniques for CT colonography, including multiplanar reformation (MPR), virtual colonoscopy (VC), shaded surface display (SSD) and Raysum, in the detection of colorectal carcinomas. METHODS: Sixty-four patients with colorectal carcinoma underwent volume scanning with spiral CT. MPR, VC, SSD and Raysum images were obtained using the four postprocessing techniques on a workstation. The results were comparatively analyzed according to the circumferential extent, lesion length and pathology pattern of the colorectal carcinomas. All diagnoses were proved pathologically and surgically. RESULTS: The accuracy of the circumferential extent of colorectal carcinoma determined by MPR, VC, SSD and Raysum was 100.0%, 82.8%, 79.7% and 79.7%, respectively; there was a significant statistical difference between MPR and VC. The consistency rate for lesion length was 89.1%, 76.6%, 95.3% and 100.0%, respectively; there was a statistical difference between VC and SSD. The accuracy of discriminating pathology pattern was 81.3%, 92.2%, 71.9% and 71.9%, respectively; there was a statistical difference between VC and SSD. MPR could accurately determine the circumferential extent of colorectal carcinoma, Raysum could determine lesion length more precisely than SSD, and VC was helpful in discriminating pathology patterns. CONCLUSION: MPR, VC, SSD and Raysum each have advantages and disadvantages in the detection of colorectal carcinoma; used in combination, these methods can disclose the lesion more accurately.

Microseismic monitoring systems are generally installed in areas of induced seismicity caused by human activity. Induced seismicity results from changes in the state of stress which may occur as a result of excavation within the rock mass in mining (i.e., rockbursts), and from changes in hydrostatic pressures and rock temperatures (e.g., during fluid injection or extraction) in oil exploitation, dam construction or fluid disposal. Microseismic monitoring systems determine event locations and important source parameters such as attenuation, seismic moment, source radius, static stress drop, peak particle velocity and seismic energy. An essential part of the operation of a microseismic monitoring system is the reliable detection of microseismic events. In the absence of reliable, automated picking techniques, operators rely upon manual picking. This is time-consuming, costly and, in the presence of background noise, very prone to error. The techniques described in this paper not only permit the reliable identification of events in cluttered signal environments; they have also enabled the authors to develop reliable automated event picking procedures. This opens the way to using microseismic monitoring as a cost-effective production/operations procedure. It has been the experience of the authors that in certain noisy environments, the seismic monitoring system may trigger on and subsequently acquire substantial quantities of erroneous data, due to the high energy content of the ambient noise. Digital filtering techniques need to be applied to the microseismic data so that the ambient noise is removed and event detection simplified. The monitoring of seismic acoustic emissions is a continuous, real-time process, and it is desirable to implement digital filters which can be designed in the time domain and in real time, such as the Kalman filter. This paper presents a real-time Kalman filter which removes the statistically describable background noise from the recorded ...
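A scalar Kalman filter of the kind mentioned can be sketched in a few lines. The constant-level signal model and the noise variances below are illustrative assumptions, not the filter design used by the authors:

```python
import random

def kalman_1d(measurements, q=1e-4, r=0.5):
    """Scalar Kalman filter: constant-level state model, process noise q,
    measurement noise r. Returns the filtered estimates."""
    x, p = measurements[0], 1.0
    out = []
    for z in measurements:
        p += q                      # predict: state unchanged, uncertainty grows
        k = p / (p + r)             # Kalman gain
        x += k * (z - x)            # update with the innovation
        p *= (1 - k)
        out.append(x)
    return out

# A constant "signal" buried in describable (Gaussian) background noise.
random.seed(0)
truth = 3.0
noisy = [truth + random.gauss(0, 0.7) for _ in range(200)]
filtered = kalman_1d(noisy)
err_raw = sum(abs(z - truth) for z in noisy) / len(noisy)
err_filt = sum(abs(x - truth) for x in filtered[50:]) / len(filtered[50:])
print(err_filt < err_raw)   # the filter suppresses the noise
```

The real application would use a state model matched to the microseismic waveform statistics, but the predict/update recursion is the same.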

Objective: To investigate the effect of Quyu Xiaoban Capsule (祛瘀消斑, QYXB) on the regressive treatment of atherosclerosis (AS) with the acoustic densitometry (AD) technique. Methods: Eighty patients with AS were randomly divided into two groups: the trial group was treated with QYXB plus conventional medicine, and the control group was treated with conventional medicine alone. Normal arterial wall and different types of atherosclerotic plaques were examined with the AD technique before treatment and 10 months later. Results: The corrected averages of intimal echo intensity (AIIc%) were elevated in both groups but without significant difference; the AIIc% of fatty plaques increased in both groups, and the post-treatment value was significantly higher than the pre-treatment value in the trial group (68.12±5.54 vs 61.43±5.37, P<0.05). The increment rate of AIIc% in the trial group was significantly higher than that in the control group (10.9±5.1% vs 2.5±5.5%, P<0.05). Conclusion: QYXB can stabilize the atherosclerotic plaque by increasing its acoustic density. The acoustic densitometry technique can differentiate different histological plaques and monitor the histological changes of plaques during treatment.

Patients with severe chronic obstructive pulmonary disease (COPD) often exhale along the same flow-volume curve during quiet breathing as they do during the forced expiratory vital capacity manoeuvre, and this has been taken as an indicator of expiratory flow limitation at rest (EFLT). EFLT, namely the attainment of maximal expiratory flow during tidal expiration, occurs when an increase in transpulmonary pressure causes no increase in expiratory flow. EFLT leads to small airway injury and promotes dynamic pulmonary hyperinflation, with concurrent dyspnoea and exercise limitation. In fact, EFLT occurs commonly in COPD patients (mainly in Global Initiative for Chronic Obstructive Lung Disease stages III and IV), in whom the latter symptoms are common, but it is not exclusive to COPD, since it can also be detected in other pulmonary and nonpulmonary diseases such as asthma, acute respiratory distress syndrome, heart failure and obesity. The existing physiological techniques for assessing EFLT are reviewed in the present work. Among the currently available techniques, negative expiratory pressure has been validated in a wide variety of settings and disorders. Consequently, it should be regarded as a simple, noninvasive, practical and accurate technique.

Metal foams are interesting materials with many potential applications. They are characterized by a cellular structure: the foamed metals or metal alloys contain gas voids in the material. Their particular lightweight structure and physical, chemical and mechanical properties make them suitable for a wide range of industrial applications in different sectors. For industrial applications, metal foams offer attractive combinations of low density, high stiffness-to-weight ratio, good energy absorption and vibration damping capacity that cannot be obtained with other materials. The control of the foaming process and the characterization of the metal foam are important issues in order to obtain a product with good properties and guarantee the quality of a mechanical component. The characterization and control of mechanical components and sandwich panels manufactured with metal foams require the assessment of the defects present in this material, such as large pores or imperfections, which are responsible for deteriorating the mechanical performance. Therefore, specific methods of non-destructive testing are required, both in the manufacturing process and during the life of the component. In this work, ultrasonic transmission techniques developed for the detection of defects associated with the manufacturing process of aluminum foams are proposed. These techniques were used on plates and structures of different thicknesses and geometries formed from this material. The ultrasonic transmission measurements were carried out both with low-frequency air-coupled transducers and with higher-frequency transducers, focused and unfocused, in contact and in immersion. To validate the results, the ultrasonic images obtained were compared with radiographic images of the foam.

This paper presents a novel algorithm built around k-means clustering performed on remote sensing images. The field of Remote Sensing is very wide, and its techniques and applications cover both data acquisition methods and data processing procedures; it is also a fast-developing field in all of these respects. Remote Sensing plays a very important role in understanding the natural and human processes affecting the earth's minerals. The k-means clustering technique is used for segmentation or feature selection of passive and active, imaging and non-imaging Remote Sensing, on airborne or satellite platforms, from monochromatic to hyperspectral. Here we concentrate on images taken on or above the surface of the earth, to which the proposed algorithm is applied to detect minerals, like Giacomo, that exist on the surface of the earth. Our experimental results demonstrate that our technique can improve the computational speed of the direct k-means algorithm by one to two orders of magnitude in the total number of distance calculations and the overall time.
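The segmentation core of such a pipeline is plain k-means on pixel intensities. The sketch below is the direct (unaccelerated) algorithm on a made-up 1-D intensity list, not the paper's accelerated variant or its imagery:

```python
import random

def kmeans(pixels, k=2, iters=20, seed=0):
    """Minimal 1-D k-means on pixel intensities (the basic segmentation step)."""
    rng = random.Random(seed)
    centers = rng.sample(pixels, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in pixels:
            # assign each pixel to its nearest center
            clusters[min(range(k), key=lambda i: abs(p - centers[i]))].append(p)
        # recompute each center as its cluster mean (keep old center if empty)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# Hypothetical band intensities: dark background vs a bright mineral signature.
pixels = [10, 12, 11, 13, 9, 200, 205, 198, 210, 202]
print(kmeans(pixels, k=2))   # two centers, one per intensity cluster
```

The speedup the paper claims comes from pruning distance calculations in exactly this assignment loop, which dominates the direct algorithm's cost.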

The "PRimed IN Situ labeling" (PRINS) method is an interesting alternative to in situ hybridization for chromosomal detection. In this procedure, chromosome labeling is performed by in situ annealing of specific oligonucleotide primers, followed by primer elongation by a Taq polymerase in the presence of labeled nucleotides. Using this process, we have developed a simple and semi-automatic method for rapid in situ detection of human chromosome 21. The reaction was performed on a programmable temperature cycler, with a chromosome 21-specific oligonucleotide primer. Different samples of normal and trisomic lymphocytes and amniotic fluid cells were used to test the method. Specific labeling of chromosome 21 was obtained in both metaphases and interphase nuclei in a 1-hour reaction. The use of an oligonucleotide primer for in situ labeling overcomes the need for complex preparation of specific DNA probes. The present results demonstrate that PRINS may be a simple and reliable technique for rapidly detecting aneuploidies. 18 refs., 1 fig.

A large number of watermarking methods are available in the literature. These methods differ in visibility, capacity, and robustness. In watermarking, robustness against attacks is the most challenging issue, and desynchronization attacks are the most serious problem facing the watermarking process: traditional correlation methods fail in watermark detection. To date there is no widely used algorithm for overcoming desynchronization attacks. In this paper, we introduce a new algorithm for handling watermark desynchronization attacks. The watermark embedding and detection models are introduced, and these models are related to the attacker model by presenting four attacking scenarios. We show the effect of each attack scenario on bit rate, signal distortion, and robustness. We conclude that the attacker cannot distort a large part of the watermark. We therefore suggest using a probabilistic embedding model combined with the longest common substring technique; this combination is effective against the desynchronization attacks. Results show that the proposed algorithm is robust against the attacking scenarios. Moreover, the watermark can still be detected even if only 5% of the watermark is recovered.
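The longest common substring step can be illustrated with the classic dynamic-programming solution. The bit strings below are invented; how the surviving run is scored against the embedded watermark is the paper's contribution, not shown here:

```python
def longest_common_substring(a, b):
    """Classic O(len(a)*len(b)) DP longest common substring."""
    best, best_end = 0, 0
    prev = [0] * (len(b) + 1)
    for i in range(1, len(a) + 1):
        cur = [0] * (len(b) + 1)
        for j in range(1, len(b) + 1):
            if a[i - 1] == b[j - 1]:
                cur[j] = prev[j - 1] + 1          # extend the matching run
                if cur[j] > best:
                    best, best_end = cur[j], i
        prev = cur
    return a[best_end - best:best_end]

# A desynchronization attack shifts/crops the embedded bits; the surviving run
# can still be re-aligned against the original watermark.
original = "1011001110100110"
recovered = "XX0111010011YY"   # shifted fragment with corrupted edges
print(longest_common_substring(original, recovered))   # → 0111010011
```

Because matching tolerates arbitrary offsets, this is a natural resynchronization primitive where plain correlation fails.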

Non-destructive evaluation techniques based on eddy currents (EC) are widely used for quality control of castings in many industries. The principle of EC detection consists in using a suitable inductive coil to generate the currents via a variable magnetic field, and measuring their effects with one or several sensors. These effects result from the interaction between the induced magnetic field and the excited conductive material. A local variation of the physical properties or geometry of the tested sample, due to a singularity or a flaw, causes a modification of the EC distribution, thus enabling detection. In order to optimize the defect-revealing capacity of EC-based probes, accurate modelling of the problem is essential. Such a model can be used to simulate the EC distribution under different circumstances and to analyze the EC sensitivity to the various parameters involved. In this work, the EC modelling is performed using the finite element method. Using a B-scan strategy, detection of a small defect having the shape of an open cavity is shown to be correctly indicated via monitoring variations of the induced voltage in the receiver coil.

This work was developed in the Applied Physics Department at the University of Vigo, within the line of research based on the treatment of water degraded by pollutants through the use of microalgae, reducing greenhouse gas emissions through the absorption of CO2 in the process and reusing the biomass as biofuel. Remote sensing techniques have contributed to a great extent to the development of oil pollution monitoring systems. However, the available detection methods, mainly designed for spaceborne and airborne long-distance inspection, are too expensive and complex to be used operationally by relatively unskilled personnel. In the framework of the DEOSOM project (European AMPERA project), an innovative water monitoring method was proposed, in two steps: early oil spill detection using a portable shipborne laser-induced fluorescence LIDAR (LIF/LIDAR), and analysis of suspicious water samples in the laboratory using the Spectral Fluorescent Signature (SFS) technique. This work is focused on the second technique. The system aims to optimize the production of microalgae for biofuel and contaminant cleaning applications and was developed and tested in photo-bioreactors at the University of Vigo within the EnerBioAlgae project (SUDOE). In this project, the SFS technique was used as a diagnostic tool employing the fluorescence analyzer INSTANT-SCREENER M53UVC. The Spectral Fluorescence Signature (SFS) technique is based on the fluorescence properties of compounds. The fluorescence intensity of a sample is measured at different excitation and emission wavelengths to produce a 3-dimensional fluorescence matrix, which can also be presented as a 2-dimensional color image where the color shows the intensity of the fluorescence. These matrices offer qualitative and quantitative information, since they can be useful for the identification of different substances from their characteristic excitation and emission fluorescence spectra. They also ...

Plagiarism is the act of taking someone else's information or ideas and claiming them as your own; basically, it reproduces existing information in a modified form. It has become a serious issue in every field of education, and various techniques and tools have been developed to detect it. There are various approaches to plagiarism detection, such as text matching, copy-paste detection and grammar-based methods. This paper proposes a new method, implemented in a program, in which we use a text set to identify the copied parts by comparing them with multiple existing files. Here we apply the concept of a machine learning technique, namely k-NN, which helps us identify whether a paper is plagiarized or not.
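The k-NN idea can be sketched with bag-of-words cosine similarity as the distance. The corpus, labels and threshold below are invented for illustration and are not the paper's text set or its actual feature representation:

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between bag-of-words vectors of two texts."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    na = math.sqrt(sum(v * v for v in va.values()))
    nb = math.sqrt(sum(v * v for v in vb.values()))
    return dot / (na * nb) if na and nb else 0.0

def knn_plagiarism(query, corpus, k=3):
    """Flag the query as plagiarized if a majority of its k nearest labelled
    neighbours are 'copied'. Corpus entries are (text, label) pairs."""
    neighbours = sorted(corpus, key=lambda d: cosine(query, d[0]), reverse=True)[:k]
    votes = sum(1 for _, label in neighbours if label == "copied")
    return votes > k // 2

# Hypothetical labelled reference files.
corpus = [
    ("the quick brown fox jumps over the lazy dog", "copied"),
    ("a quick brown fox jumped over a lazy dog", "copied"),
    ("entirely unrelated discussion of kalman filters", "original"),
    ("completely different text about neural networks", "original"),
]
print(knn_plagiarism("the quick brown fox jumps over a lazy dog", corpus))  # → True
```

A real system would use more robust features (n-grams, stemming) so that light paraphrasing, not just verbatim copying, still lands near the copied neighbours.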

In clinical practice, Atrial Fibrillation (AF) is the most common and critical cardiac arrhythmia encountered. The treatment that can ensure permanent AF removal is catheter ablation, in which cardiologists destroy the affected cardiac muscle cells with RF or laser energy. In this procedure it is necessary to know exactly from which part of the heart the AF triggers originate. Various signal processing algorithms provide a strong tool to track AF sources. This study proposes signal processing techniques that can be exploited for the characterization, analysis and source detection of AF signals. These algorithms are applied to Electrocardiogram (ECG) and intracardiac signals, which contain important information that allows the analysis of anatomic and physiologic aspects of the whole cardiac muscle.

Breast cancer is the most common cancer in women and the second leading cause of cancer deaths worldwide. Primary prevention in the early stages of the disease is complex, as the causes remain largely unknown. However, some typical signatures of this disease, such as masses and microcalcifications appearing on mammograms, can be used to improve early diagnostic techniques, which is critical for women's quality of life. X-ray mammography is the main test used for screening and early diagnosis, and its analysis and processing are key to improving breast cancer prognosis. As masses and benign glandular tissue typically appear with low contrast and are often very blurred, several computer-aided diagnosis schemes have been developed to support radiologists and internists in their diagnosis. In this article, an approach is proposed to effectively analyze digital mammograms based on texture segmentation for the detection of early-stage tumors. The proposed algorithm was tested on several images taken from the Digital Database for Screening Mammography for cancer research and diagnosis, and it was found to be well suited to distinguishing masses and microcalcifications from the background tissue using morphological operators, and then extracting them through machine learning techniques and a clustering algorithm for intensity-based segmentation.

This paper describes an innovative use of photogrammetric detection techniques to experimentally estimate the structural/inertial properties of helicopter rotor blades. The identification algorithms for the evaluation of mass and flexural stiffness distributions are an extension of the ones proposed by Larsen, whereas the procedure for determining torsional properties (stiffness and shear center position) is based on the Euler-Prandtl beam theory. These algorithms rely on measurements performed through photogrammetric detection, which requires the collection of digital photos allowing the identification of the 3D coordinates of labeled points (markers) on the structure through the correlation of 2D pictures. The displacements are evaluated by comparing the positions of the markers in the loaded and reference configurations. Since the applied loads are known, the structural characteristics can be obtained directly from the measured displacements. The accuracy of the proposed identification algorithms was first verified by comparison with numerical and experimental data, and the algorithms were then applied to the structural characterization of two main rotor blades designed for ultra-light helicopter applications.

In recent years, ultrasonic guided waves have gained attention for reliable testing and characterization of metals and composites. Guided wave modes are excited and detected by PZT (lead zirconate titanate) transducers in either transmission or reflection mode. In this study, guided waves are excited and detected in transmission mode, and the phase change of the propagating wave modes is recorded. In most other studies reported in the literature, the change in received signal strength (amplitude) is investigated with varying degrees of damage, whereas in this study the change in phase is correlated with the extent of damage. Feature extraction techniques are used to extract phase and time-frequency information. The main advantage of this approach is that the bonding condition between the transducer and the specimen does not affect the phase, while it can affect the strength of the recorded signal. Therefore, if the specimen is undamaged but the transducer-specimen bond has deteriorated, the received signal strength is altered but the phase remains the same, and false-positive damage predictions can thus be avoided.

Glaucoma is a generic name for a group of diseases that cause progressive optic neuropathy and vision loss due to degeneration of the optic nerve. Optic nerve cells act as transducers, converting light entering the eye into electrical signals for visual processing in the brain. The main risk factors for glaucoma are elevated intraocular pressure exerted by the aqueous humour, a family history of glaucoma (hereditary), and diabetes. Glaucoma damages the eye whether intraocular pressure is high, normal or below normal, and it causes peripheral vision loss. There are different types of glaucoma, and some occur suddenly, so early detection is essential for minimizing vision loss. An increased cup-to-disc area ratio is the significant change during glaucoma. Diagnosis of glaucoma is based on measurement of intraocular pressure by tonometry, visual field examination by perimetry, and measurement of the cup-to-disc area ratio from colour fundus images. In this paper, different signal processing techniques for the detection and classification of glaucoma are discussed.
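Once the cup and disc have been segmented from a fundus image, the screening decision reduces to a simple ratio check. The sketch below assumes pixel areas from a prior segmentation step; the 0.5 cut-off is an illustrative assumption, not a diagnostic standard:

```python
def cup_to_disc_ratio(cup_area, disc_area):
    """Ratio of optic cup area to optic disc area, both in pixels,
    taken from a segmented colour fundus image."""
    if disc_area <= 0:
        raise ValueError("disc area must be positive")
    return cup_area / disc_area

def glaucoma_suspect(cup_area, disc_area, cdr_threshold=0.5):
    """Flag an image as glaucoma-suspect when the CDR exceeds a threshold.
    The threshold here is hypothetical, for illustration only."""
    return cup_to_disc_ratio(cup_area, disc_area) > cdr_threshold
```

A real system would combine this with tonometry and perimetry results rather than rely on the CDR alone.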

Colonic polyps appear as elliptical protrusions on the inner wall of the colon. Curvature-based features for colonic polyp detection have proved successful in several computer-aided diagnostic CT colonography (CTC) systems. Simple thresholds are set on those features to create initial polyp candidates; sophisticated classification schemes are then applied to these candidates to reduce false positives. Two objective functions, the number of missed polyps and the false positive rate, need to be minimized when setting those thresholds. These two objectives conflict, and it is usually difficult to optimize both by gradient search. In this paper, we used a multiobjective evolutionary method, the Strength Pareto Evolutionary Algorithm (SPEA2), to optimize those thresholds. SPEA2 incorporates the concept of Pareto dominance and applies genetic techniques to evolve individual solutions towards the Pareto front. The SPEA2 algorithm was applied to colon CT images from 27 patients, each having a prone and a supine scan. There were 40 colonoscopically confirmed polyps, resulting in 72 positive detections in the CTC reading. The results obtained by SPEA2 were compared with those obtained by our previous system, in which an appropriate value was set for each threshold by a histogram examination method. With sensitivity kept the same as that of our previous system, the SPEA2 algorithm reduced the false positive rate by 76.4%, from an average of 55.6 to 13.3 false positives per data set. With the false positive rate kept the same for both systems, SPEA2 increased sensitivity by 13.1%, from 53 to 61 of the 72 ground-truth detections.
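The Pareto-dominance concept at the heart of SPEA2 is easy to state in code. The sketch below extracts the non-dominated front from a set of (missed polyps, false positive rate) pairs; the candidate values are invented for illustration and are not the study's data:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimisation):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Return the non-dominated subset of the candidate solutions."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o != s)]

# hypothetical threshold settings scored on the two conflicting objectives:
# (missed polyps, false positives per data set)
candidates = [(2, 55.6), (2, 13.3), (5, 8.0), (3, 13.3), (6, 9.0)]
front = pareto_front(candidates)
```

SPEA2 itself adds fitness assignment, an external archive and genetic variation on top of this dominance test; the front extraction is only the selection criterion.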

Active infrared thermography methods are known to possess good fault detection capabilities for defects in materials compared with conventional passive thermal infrared imaging techniques. However, the reliability of the technique has been under scrutiny. This paper proposes the lock-in thermography technique for the detection and estimation of artificial subsurface defect size and depth, with uncertainty measurement.
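Lock-in thermography recovers the amplitude and phase of the thermal response at the modulation frequency. A minimal digital lock-in (an illustrative sketch, not the authors' processing chain) mixes the sampled signal with sine/cosine references and averages:

```python
import math

def lock_in(signal, fs, f_ref):
    """Digital lock-in detection: mix the signal with quadrature references
    at the modulation frequency f_ref and average over full periods to
    recover amplitude and phase of that frequency component."""
    n = len(signal)
    i = sum(s * math.cos(2 * math.pi * f_ref * k / fs)
            for k, s in enumerate(signal)) * 2 / n   # in-phase component
    q = sum(s * math.sin(2 * math.pi * f_ref * k / fs)
            for k, s in enumerate(signal)) * 2 / n   # quadrature component
    return math.hypot(i, q), math.atan2(q, i)

fs, f = 1000.0, 10.0   # sample rate and modulation frequency, Hz
# synthetic detector signal: 0.5 amplitude, +0.3 rad phase lead
sig = [0.5 * math.cos(2 * math.pi * f * k / fs + 0.3) for k in range(1000)]
amp, phase = lock_in(sig, fs, f)   # amp ≈ 0.5, phase ≈ -0.3
```

In thermography the phase image, being largely insensitive to surface emissivity variations, is what carries the defect-depth information.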

Irradiation of chestnuts has recently been considered as an alternative treatment to fumigation to reduce the considerable amount of the product normally lost during the post-harvest period. The treatment is allowed in countries such as Korea and, in view of a possible extension to European countries, to permit the legal controls as required by directive 1999/2/EC [European Parliament and Council Directive 1999/2/EC, on the approximation of the laws of the Member States concerning foods and food ingredients treated with ionising radiation. Official Journal of the European Communities. L 66/16 of 13.3.1999] and meet consumer consensus, reliable methods for detecting irradiated chestnuts have to be proposed. The aim of the present work was to test the efficacy of the European Standards EN 13751, EN 1788, EN 1787 and EN 13708 in detecting irradiated chestnuts. For this purpose, six sets of "Montella" chestnuts, a typical Italian variety recognized as a PGI (protected geographical indication), non-irradiated and irradiated at different doses in the 0.1-1 kGy range, were analysed by thermoluminescence (TL), photo-stimulated luminescence (PSL) (screening and calibrated PSL) and ESR techniques. PSL and TL analysis results revealed the low luminescence sensitivity of the chestnuts. Nevertheless, PSL screening data were in the intermediate band above the negative threshold (at all doses except the lowest one) and TL analysis led to correct positive classifications even at the lowest dose tested (0.15 Gy). On the contrary, no radio-induced ESR signal could be registered with the irradiated samples of chestnut shell or pulp.

Although the angle random walk (ARW) of a fiber optic gyroscope (FOG) is well modeled and identified before the FOG is integrated into the high-accuracy attitude control system of a satellite, aging and unexpected failures can affect the performance of the FOG after launch, resulting in variation of the ARW coefficient. The ARW coefficient can therefore be regarded, in some sense, as an indicator of the "state of health" of the FOG for diagnosis. The Allan variance method can be used to estimate the ARW coefficient of a FOG; however, it requires a large amount of data to be stored, and the procedure of fitting slope lines for estimation is laborious. To overcome these barriers, a weighted nonlinear state-space model that directly models the ARW was established for the FOG. A neural extended Kalman filter algorithm was then implemented to estimate and track the variation of the ARW in real time. Experimental results show that the proposed approach can effectively detect the state of the FOG, and the proposed technique avoids the need to store large amounts of data.
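The baseline Allan variance computation that the paper seeks to avoid can be sketched in a few lines: average the rate series over non-overlapping clusters of m samples and take half the mean squared difference of adjacent cluster averages. This is a minimal illustration, not the paper's estimator:

```python
def allan_variance(rate, m):
    """Non-overlapping Allan variance of a gyro rate series for one
    cluster size m (in samples). On a log-log Allan deviation plot,
    angle random walk appears as a slope of -1/2."""
    # cluster averages over non-overlapping windows of m samples
    means = [sum(rate[i:i + m]) / m for i in range(0, len(rate) - m + 1, m)]
    diffs = [(means[k + 1] - means[k]) ** 2 for k in range(len(means) - 1)]
    return sum(diffs) / (2 * len(diffs))
```

For a deterministic alternating series the result is exact: `allan_variance([1, -1]*8, 1)` gives 2.0, while averaging over pairs (`m=2`) cancels the alternation entirely. The storage burden is visible here: a full Allan plot needs the whole series retained for every cluster size, which is what motivates the recursive filter approach.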

This study proposes a method for measuring mechanical impedance using noncontact laser ultrasound. The measurement of mechanical impedance has been of great interest in nondestructive testing (NDT) and structural health monitoring (SHM), since mechanical impedance is sensitive even to small structural defects. Conventional impedance measurements, however, have been based on electromechanical impedance (EMI) using contact-type piezoelectric transducers, whose performance is degraded by (a) Curie temperature limitations, (b) electromagnetic interference, and (c) bonding-layer effects. This study aims to tackle the limitations of conventional EMI measurement by utilizing laser-based mechanical impedance (LMI) measurement. The LMI response, which is equivalent to a steady-state ultrasound response, is generated by shooting a pulsed laser beam at the target structure, and is acquired by measuring the out-of-plane velocity with a laser vibrometer. The formation of the LMI response is observed through thermo-mechanical finite element analysis. The feasibility of applying the LMI technique for damage detection is experimentally verified using a pipe specimen in a high-temperature environment.

In this paper we investigated different detection techniques, namely direct detection, coherent heterodyne detection and coherent homodyne detection, in an FMCW LIDAR system using the Optisystem package. A model for the target, the propagation channel and the various detection techniques was developed in Optisystem, and a comparative study among the detection techniques for FMCW LIDAR systems was then performed analytically and by simulation using the developed model. The performance of direct, heterodyne and homodyne detection for the FMCW LIDAR system was calculated and simulated in Optisystem, and the simulated performance was checked against results from a MATLAB simulator. The results show that direct detection is sensitive to the intensity of the received electromagnetic signal and has the advantage of low system complexity over the other detection architectures, at the expense of thermal noise being the dominant noise source and relatively poor sensitivity. In addition, much higher detection sensitivity can be achieved using coherent optical mixing, as performed by heterodyne and homodyne detection.

Hyperspectral remote sensing is an emerging, multidisciplinary field with diverse applications in Earth observation. Nowadays, spectral remote sensing techniques allow presymptomatic monitoring of changes in the physiological state of plants with high spectral resolution. Hyperspectral leaf reflectance and chlorophyll fluorescence have proved highly suitable for identifying growth anomalies of cultivated plants that result from environmental changes and different stress factors. Hyperspectral technologies can find a place in many scientific areas, as well as in monitoring plant status and functioning to help make timely management decisions. This research aimed to detect the presence of viral infection in young pepper plants (Capsicum annuum L.) caused by Cucumber Mosaic Virus (CMV) using hyperspectral reflectance and fluorescence data, and to assess the effect of some growth regulators on the development of the disease. In Bulgaria, CMV is one of the most widespread pathogens, causing the biggest economic losses in vegetable crop production. Leaf spectral reflectance and fluorescence data were collected by a portable fibre-optic spectrometer in the spectral ranges 450-850 nm and 600-900 nm. A greenhouse experiment with pepper plants of two cultivars, Sivria (sensitive to CMV) and Ostrion (resistant to CMV), was conducted. The plants were divided into six groups. The first group consisted of healthy (control) plants. At the 4-6 expanded leaf growth stage, the second group was inoculated with CMV. The other four groups were treated with growth regulators: Spermine, MEIA (beta-monomethyl ester of itaconic acid), BTH (benzo(1,2,3)thiadiazole-7-carbothioic acid S-methyl ester) and Phytoxin. On the next day, the pepper plants of these four groups were inoculated with CMV. The viral concentrations in the plants were determined by the serological method DAS-ELISA. Statistical, first-derivative and cluster analyses were applied, and several vegetation indices were…

Region duplication forgery detection is a special type of forgery detection approach and a widely studied research topic in digital image forensics. In copy-move forgery, a specific area is copied and then pasted into another region of the image. Due to the availability of sophisticated image processing tools, it has become very hard to detect forgery with the naked eye, and the forged region of an image often yields no visual clues. To make the tampering more robust, various transformations such as scaling, rotation, illumination changes, JPEG compression, noise addition, gamma correction, and blurring are applied. So there is a need for a method that performs efficiently in the presence of all such attacks. This paper presents a detection method based on speeded-up robust features (SURF) and hierarchical agglomerative clustering (HAC). SURF detects the keypoints and their corresponding features. From these sets of keypoints, grouping is performed on the matched keypoints by HAC, which reveals the copied and pasted regions.
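The HAC grouping step can be sketched with a plain single-linkage agglomerative clusterer over matched keypoint coordinates. This is an illustrative toy (the paper's feature extraction uses SURF, which is omitted here); the point coordinates are invented:

```python
def single_linkage_clusters(points, cut_distance):
    """Greedy single-linkage agglomerative clustering: repeatedly merge the
    two closest clusters until the nearest pair is farther apart than
    cut_distance. Cluster distance = minimum pairwise point distance."""
    clusters = [[p] for p in points]

    def dist(a, b):
        return min(((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
                   for x1, y1 in a for x2, y2 in b)

    while len(clusters) > 1:
        i, j = min(((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
                   key=lambda ij: dist(clusters[ij[0]], clusters[ij[1]]))
        if dist(clusters[i], clusters[j]) > cut_distance:
            break
        clusters[i] += clusters.pop(j)   # j > i, so pop(j) is safe
    return clusters

# matched keypoints forming two spatial groups: the copied region
# and its pasted clone elsewhere in the image
pts = [(0, 0), (1, 0), (0, 1), (50, 50), (51, 50), (50, 51)]
groups = single_linkage_clusters(pts, cut_distance=5.0)
```

Two well-separated groups of matched keypoints are exactly the signature of a copy-move pair: one cluster is the source region, the other the duplicate.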

Underwater target detection is investigated by combining active polarization imaging and optical correlation-based approaches. Experiments were conducted in a glass tank filled with tap water with diluted milk or seawater and containing targets of arbitrary polarimetric responses. We found that target estimation obtained by imaging with two orthogonal polarization states always improves detection performances when correlation is used as detection criterion. This experimental study illustrates the potential of polarization imaging for underwater target detection and opens interesting perspectives for the development of underwater imaging systems.

Wavelength modulation technique (WMT) and active intracavity technique (ACIT) are first introduced in this paper; they are used to realize concentration detection of methane and acetylene, respectively. When ACIT is combined with the wavelength sweep technique (WST), the detection sensitivity for acetylene is sharply enhanced; when ACIT is combined with both WST and WMT, the sensitivity is enhanced further.

A unique and customizable machine-vision and image-data-processing technique has been developed for use in automated identification of cells that are optimal for patch clamping. [Patch clamping (in which patch electrodes are pressed against cell membranes) is an electrophysiological technique widely applied for the study of ion channels, and of membrane proteins that regulate the flow of ions across the membranes. Patch clamping is used in many biological research fields such as neurobiology, pharmacology, and molecular biology.] While there exist several hardware techniques for automated patch clamping of cells, very few of those techniques incorporate machine vision for locating cells that are ideal subjects for patch clamping. In contrast, the present technique is embodied in a machine-vision algorithm that, in practical application, enables the user to identify good and bad cells for patch clamping in an image captured by a charge-coupled-device (CCD) camera attached to a microscope, within a processing time of one second. Hence, the present technique can save time, thereby increasing efficiency and reducing cost. The present technique involves the utilization of cell-feature metrics to accurately make decisions on the degree to which individual cells are "good" or "bad" candidates for patch clamping. These metrics include position coordinates (x,y) in the image plane, major-axis length, minor-axis length, area, elongation, roundness, smoothness, angle of orientation, and degree of inclusion in the field of view. The present technique does not require any special hardware beyond commercially available, off-the-shelf patch-clamping hardware: a standard patch-clamping microscope system with an attached CCD camera, a personal computer with an image-data-processing board, and some experience in utilizing image-data-processing software are all that are needed. A cell image is first captured by the microscope CCD camera and image-data-processing board, then the image…
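A few of the named cell-feature metrics can be computed from a binary segmentation mask using image moments. The sketch below (a toy illustration, not the described system) derives centroid, area and elongation; the sample blob is invented:

```python
import math

def cell_metrics(mask):
    """Compute simple cell-candidate metrics from a binary mask (list of
    rows): centroid position, area, and elongation from the eigenvalues
    of the second-order central moment matrix."""
    pts = [(x, y) for y, row in enumerate(mask) for x, v in enumerate(row) if v]
    area = len(pts)
    cx = sum(x for x, _ in pts) / area
    cy = sum(y for _, y in pts) / area
    mxx = sum((x - cx) ** 2 for x, _ in pts) / area
    myy = sum((y - cy) ** 2 for _, y in pts) / area
    mxy = sum((x - cx) * (y - cy) for x, y in pts) / area
    spread = math.hypot(mxx - myy, 2 * mxy)   # eigenvalue gap
    major = (mxx + myy + spread) / 2          # variance along major axis
    minor = (mxx + myy - spread) / 2          # variance along minor axis
    elongation = math.sqrt(major / minor) if minor > 0 else float("inf")
    return {"area": area, "centroid": (cx, cy), "elongation": elongation}

# a 4x2 rectangular "cell": elongated, hence a poorer patch-clamp candidate
blob = [[1, 1, 1, 1],
        [1, 1, 1, 1]]
m = cell_metrics(blob)
```

A decision rule would then score candidates on these metrics, e.g. preferring round (elongation near 1), fully-in-frame blobs of plausible area.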

Phishing is a criminal scheme to steal users' personal data and other credential information. It is a fraud that acquires a victim's confidential information, such as passwords, bank account details, credit card numbers, and financial usernames, which can later be misused by an attacker. In this research paper, we propose a new anti-phishing algorithm, which we call the ObURL Detection Algorithm. The ObURL Detection Algorithm detects URL obfuscation phishing attacks and provides multilayer security against internet fraud. It can inspect hyperlinks, the content of a hyperlink's destination URL, iFrames in email, input forms, input forms in iFrame source URLs, and iFrames within iFrames; multiple tests are then performed on the collected data, such as a DNS test, IP address test, URL-encode test, shortened-URL test, blacklist/whitelist test, and URL pattern matching test. Our experiments verified that the ObURL Detection Algorithm is effective in detecting both known and unknown URL obfuscation phishing attacks.
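Some of the individual URL tests are straightforward to sketch with the standard library. The following is an illustrative approximation of three of the named checks, not the ObURL algorithm itself; the shortener list and the `@`-sign test are assumptions:

```python
import ipaddress
from urllib.parse import urlparse

# illustrative shortener list; a real system would use a maintained feed
SHORTENERS = {"bit.ly", "tinyurl.com", "goo.gl", "t.co"}

def ip_address_test(url):
    """Phishers often replace the hostname with a raw IP address."""
    host = urlparse(url).hostname or ""
    try:
        ipaddress.ip_address(host)
        return True
    except ValueError:
        return False

def shortened_url_test(url):
    """Shortened URLs hide the true destination."""
    return (urlparse(url).hostname or "").lower() in SHORTENERS

def at_sign_test(url):
    """Text before '@' in a URL is userinfo, a classic obfuscation:
    http://good.com@evil.com actually goes to evil.com."""
    return "@" in urlparse(url).netloc

def suspicious(url):
    return ip_address_test(url) or shortened_url_test(url) or at_sign_test(url)
```

The full algorithm layers these with DNS lookups, blacklist/whitelist checks and pattern matching, so a single failed test is a signal rather than a verdict.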

We describe a continuously operating scanning X-ray imaging system developed for landmine detection based on the backscatter X-ray principle; detection is thus performed from the same side as the source. The source operates at 120 kVp and 3 mA. To study the physics of Compton X-ray backscattering, the photon transport factor, backscatter factor (BSF) and backscatter probability (BSP) were simulated with Monte Carlo calculations using the generalized particle transport program MCNP. Based on the results of the Monte Carlo analyses, a mine detection system has been designed. It potentially has a low false alarm rate and a high detection probability, and provides a direct imaging facility.

This paper presents a tumor detection algorithm for mammograms. The proposed system focuses on two problems: how to detect tumors as suspicious regions with very weak contrast against their background, and how to extract features that categorize tumors. The tumor detection method follows the scheme of (a) mammogram enhancement, (b) segmentation of the tumor area, (c) extraction of features from the segmented tumor area, and (d) classification with an SVM. Enhancement can be defined as conversion of the image quality to a better and more understandable level. The mammogram enhancement procedure includes filtering, a top-hat operation and the DWT, after which contrast stretching is used to increase the contrast of the image. Segmentation of mammogram images plays an important role in improving the detection and diagnosis of breast cancer; the most common segmentation method used is thresholding. The features are extracted from the segmented breast area. The next stage includes…
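The contrast stretching step of the enhancement stage is a simple linear remapping of the observed intensity range onto the full output range. A minimal sketch (illustrative, with an invented low-contrast sample):

```python
def contrast_stretch(pixels, out_min=0, out_max=255):
    """Linear contrast stretching: map the observed intensity range onto
    the full output range, making low-contrast regions easier to segment."""
    lo, hi = min(pixels), max(pixels)
    if hi == lo:
        return [out_min] * len(pixels)   # flat image: nothing to stretch
    scale = (out_max - out_min) / (hi - lo)
    return [round(out_min + (p - lo) * scale) for p in pixels]

# a faint region occupying only 20 grey levels out of 256
faint = [100, 102, 104, 110, 120]
stretched = contrast_stretch(faint)   # now spans the full 0..255 range
```

After stretching, a simple threshold separates the suspicious region far more reliably than it would on the original narrow-range data.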

National Aeronautics and Space Administration — We propose to develop a new and innovative method for the detection and classification of emboli flowing into the brain through Carotid arteries, specifically for...

Objective: To establish a rapid and effective method to detect rotavirus. Methods: ELISA and one-step RT-PCR were used to test 196 clinical samples from the Xi'an area. Results: Compared with the ELISA method, one-step RT-PCR was more sensitive and specific (P < 0.05). Conclusion: One-step RT-PCR is a simple, rapid, sensitive and specific method for clinical and epidemiological studies of rotavirus.
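The reported significance comparison (P < 0.05) between two assays is typically done with a Pearson chi-square test on a 2×2 contingency table. The counts below are hypothetical, chosen only to illustrate the computation; the abstract does not report the underlying table:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 contingency table
    [[a, b], [c, d]] comparing two detection methods; 3.84 is the
    critical value for P = 0.05 at 1 degree of freedom."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den if den else 0.0

# hypothetical positive/negative counts for RT-PCR vs ELISA on 196 samples each
chi2 = chi_square_2x2(90, 106,    # method A: positives, negatives
                      60, 136)    # method B: positives, negatives
significant = chi2 > 3.84         # i.e. P < 0.05
```

With small expected counts a Yates continuity correction or Fisher's exact test would be preferred; the plain statistic is shown for clarity.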

Detection of cracks in stainless steel pipe images is performed using a contrast stretching technique. The technique is based on an image filter built on mathematical morphology that can expose the cracks. The cracks are highlighted and noise is removed efficiently while still retaining the edges. An automated crack detection system with a camera platform has been successfully implemented. We compare crack extraction, in terms of quality measures, with Otsu's threshold technique and another technique (Iyer and Sinha, 2005). The algorithm is able to achieve good results and performs better than these other techniques.
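The Otsu baseline used in the comparison selects the threshold that maximises between-class variance of the intensity histogram. A self-contained sketch (the test image is an invented toy, not pipe data):

```python
def otsu_threshold(pixels, levels=256):
    """Otsu's method: scan all thresholds t and keep the one maximising
    the between-class variance w0*w1*(mu0 - mu1)^2 of the histogram."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    w0 = sum0 = 0
    best_t, best_var = 0, -1.0
    for t in range(levels):
        w0 += hist[t]                     # class 0: intensities <= t
        if w0 == 0:
            continue
        w1 = total - w0                   # class 1: intensities > t
        if w1 == 0:
            break
        sum0 += t * hist[t]
        mu0, mu1 = sum0 / w0, (sum_all - sum0) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# toy image: dark crack pixels against a bright steel background
img = [20] * 10 + [200] * 90
t = otsu_threshold(img)
```

Pixels at or below the returned threshold are then labelled as crack candidates before the morphological clean-up described in the abstract.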

Machine learning approaches are good at solving problems for which little information is available. In most cases, software-domain problems are characterized as a learning process that depends on various circumstances and changes accordingly. A predictive model is constructed using machine learning approaches, and modules are classified into defective and non-defective. Machine learning techniques help developers retrieve useful information after classification and enable them to analyse data from different perspectives, and they have proven useful for software bug prediction. This study used publicly available data sets of software modules and provides a comparative performance analysis of different machine learning techniques for software bug prediction. Results showed that most of the machine learning methods performed well on the software bug data sets.

On megaparsec scales the Universe is permeated by an intricate filigree of clusters, filaments, sheets and voids, the Cosmic Web. For the understanding of its dynamical and hierarchical history it is crucial to identify objectively its complex morphological components. One of the most characteristic aspects is that of the dominant underdense Voids, the product of a hierarchical process driven by the collapse of minor voids in addition to the merging of large ones. In this study we present an objective void finder technique which involves a minimum of assumptions about the scale, structure and shape of voids. Our void finding method, the Watershed Void Finder (WVF), is based upon the Watershed Transform, a well-known technique for the segmentation of images. Importantly, the technique has the potential to trace the existing manifestations of a void hierarchy. The basic watershed transform is augmented by a variety of correction procedures to remove spurious structure resulting from sampling noise. This study c...
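The watershed idea behind the WVF can be illustrated in one dimension: assign every sample of a density field to the local minimum reached by steepest descent, so each basin corresponds to a void and the ridge between basins to a wall. A toy sketch with an invented density field (the real WVF works on a 3-D density reconstruction with noise-correction steps):

```python
def watershed_1d(density):
    """Toy watershed transform on a 1-D density field: label every sample
    with the index of the local minimum reached by steepest descent.
    Each distinct label is one basin ('void'); ridges separate basins."""
    n = len(density)

    def descend(i):
        while True:
            nbrs = [j for j in (i - 1, i + 1) if 0 <= j < n]
            best = min(nbrs, key=lambda j: density[j])
            if density[best] >= density[i]:
                return i          # local minimum: the basin's seed
            i = best

    return [descend(i) for i in range(n)]

# two density minima separated by a high-density ridge -> two basins
field = [5, 2, 1, 2, 6, 9, 6, 3, 2, 4]
labels = watershed_1d(field)
```

The ridge at the density maximum (value 9) splits the domain into two basins, mirroring how the WVF traces void boundaries along the cosmic-web walls.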

The most commonly used islanding detection techniques may be classified as active and passive. Passive techniques do not perturb the system but have larger non-detection zones, whereas active techniques have smaller non-detection zones but perturb the system. In this paper, a new hybrid technique is proposed to solve this problem. An average rate of voltage change (passive technique) is used to initiate a real power shift (active technique), which changes the real power of the distributed generation (DG) when the passive technique cannot clearly discriminate between islanding…

In large-scale biometric authentication systems such as US-VISIT (USA), ten-print scanners that capture four fingerprints simultaneously are used. In traditional systems, specific hand types (left or right) are indicated by the operator, but automatic hand-type detection is difficult due to hand rotation and the opening and closing of fingers. In this paper, we evaluated features extracted from hand images (captured by a general optical scanner) that are considered effective for detecting hand type. Furthermore, we extended this knowledge to real fingerprint images and evaluated the hand-type detection accuracy, obtaining an accuracy of about 80% with only three fingers (index, middle and ring).

Airbags are subjected to strict quality control in order to ensure passengers' safety. The quality of the fabric and sewing thread influences the final product, and therefore sewing defects must be detected early and accurately so that the defective item can be removed from production. Airbag seam assemblies can take various forms, using linear and circular primitives, with threads of different colours and length densities, creating lockstitch or double-thread chainstitch. The paper presents a framework for the automatic detection of defects occurring during the airbag sewing stage. Defect types such as skipped stitch, missed stitch, or superimposed seam, for lockstitch and two-thread chainstitch, are detected and marked. Using image processing methods, the proposed framework follows the seam path and determines whether the colour pattern of the considered stitches is valid.

A neutron diagnostic experimental apparatus has been tested for nondestructive verification of sealed munitions. Designed to potentially satisfy a significant number of van-mobile requirements, this equipment is based on an easy to use industrial sealed tube neutron generator that interrogates the munitions of interest with 14 MeV neutrons. Gamma ray spectra are detected with a high purity germanium detector, especially shielded from neutrons and gamma ray background. A mobile shell holder has been used. Possible configurations allow the detection, in continuous or in pulsed modes, of gamma rays from neutron inelastic scattering, from thermal neutron capture, and from fast or thermal neutron activation. Tests on full scale sealed munitions with chemical simulants show that those with chlorine (old generation materials) are detectable in a few minutes, and those including phosphorus (new generation materials) in nearly the same time.

X-ray mammography is the only imaging method currently available with any proven efficacy for screening to detect early-stage, clinically occult breast cancer. Sonography has a limited role in the differentiation of cystic from solid masses and as a guide for aspiration and preoperative localization of selected breast lesions. Computed tomography has a more limited role to determine the spatial orientation of a lesion detected only in the lateral mammographic position. All other imaging methods should be considered experimental at this time.

Weakly Interacting Massive Particles (WIMPs) are a leading candidate for the dark matter that is observed to constitute ~25% of the total mass-energy density of the Universe. The direct detection of relic WIMPs (those produced during the early moments of the Universe's expansion) is at the forefront of active research areas in particle astrophysics, with numerous international experimental collaborations pursuing this goal. This paper presents an overview of the theoretical and practical considerations common to the design and operation of direct detection experiments, as well as their unique features and capabilities.

In spite of the growing omnipresence of surveillance cameras, not much is known by the general public about their background. While many disciplines have scrutinised the techniques and effects of surveillance, the object itself remains somewhat of a mystery. A design typology of surveillance cameras…

Recently, unstable trinucleotide repeats have been shown to be the etiologic factor in several neuropsychiatric diseases, and they may play a similar role in other disorders. To our knowledge, a method that detects expanded trinucleotide sequences with the opportunity for direct localization and cloning has not been achieved. We have developed a set of hybridization-based methods for direct detection of unstable DNA expansion. Our analysis of myotonic dystrophy patients that possess different degrees of (CTG)n expansion, versus unaffected controls, has demonstrated the identification of the trinucleotide instability site without any prior information regarding genetic map location. High-stringency modified Southern blot hybridization with a PCR-generated trinucleotide repeat probe allowed us to detect the DNA fragment containing the expansion in myotonic dystrophy patients. The same probe was used for fluorescent in situ hybridization and several regions of (CTG)n/(CAG)n repeats in the human genome were detected, including the myotonic dystrophy locus on chromosome 19q. These strategies can be applied to directly clone genes involved in disorders caused by unstable DNA.

Signal detection is a challenging task for regulatory and intelligence agencies. Subject matter experts in those agencies analyze documents, generally containing narrative text in a time bound manner for signals by identification, evaluation and confirmation, leading to follow-up action e.g., recalling a defective product or public advisory for…

Recently, wireless sensor networks providing fine-grained spatio-temporal observations have become one of the major monitoring platforms for geo-applications. Alongside data acquisition, outlier detection is essential in geosensor networks to ensure data quality, secure monitoring and reliable de…

We describe the design and development of an infrared detection system which detects the onset of carbonization of fluoropolymers in the presence of up to 60 watts of 1.06 micrometer laser energy. This system is used to shut down a therapeutic laser system before significant damage is done to the laser delivery device and the patient. Black-body radiation emitted from the diffusion tip is transmitted backwards through the same optical fiber as the therapeutic wavelength. Using a high-power 1.06 micrometer laser mirror at 45 degrees, most of the 1.06 micrometer light is reflected while the black-body radiation is passed to a holographic notch filter, which further filters the signal. Still more filtering was needed before the 1.1 to 2 micrometer signal could be detected in the presence of the therapeutic light using an extended indium gallium arsenide photodetector. There was still a significant detected offset, which increased with laser power and necessitated a means of automatically nulling the offset for different laser power settings. The system is designed to be used with any unmodified laser system. It interfaces directly to, or in series with, most common external safety interlocks and can be used with various diffusing tips, probes or bare fibers.

Alterations in attention are known to modify the excitability of underlying cortical structures and thus the activity recorded during non-invasive electroencephalography (EEG). Brain-computer interface systems for neuromodulation are based on reliable detection of intended movements from continuous EEG...

With the development of human-computer interaction, lip reading technology has become a focus in the multimodal technology field. However, detecting and locating lips accurately is difficult because of variation in lip contours between people, varied illumination conditions, head movements and other factors. Building on existing lip detection and location methods, we propose a method that extracts the lip contour from facial images using an adaptive chromatic filter based on lip color. The method is insensitive to illumination: an appropriate chromatic lip filter is obtained by analyzing the overall face color and the clustering statistics of lip color. We propose a combined method that preprocesses the face image, including correcting the rotation angle of the face and improving image contrast, and then analyzes the clustering characteristics of skin and lip color in the lip region to obtain an adaptive chromatic filter that makes the lips prominent in the facial image. This method copes with varied illumination and inclined faces. Experiments showed that coarse detection of the lip region improves detection and location accuracy, laying a good foundation for subsequent lip feature extraction and lip tracking.
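
A minimal sketch of the kind of chromatic filtering the abstract describes: score each pixel by a simple redness measure so lip pixels stand out from skin. The transform and threshold below are illustrative assumptions, not the paper's actual adaptive filter.

```python
def lip_likelihood(pixel):
    """Score a pixel's 'lip-ness' from its RGB color.

    Uses the common heuristic that lip pixels have a higher
    red-to-green ratio than surrounding skin; the exact transform
    here is illustrative, not the paper's filter.
    """
    r, g, b = pixel
    denom = r + g + b
    if denom == 0:
        return 0.0
    # Normalized redness minus greenness emphasizes lips over skin.
    return (r - g) / denom

def lip_mask(image, threshold=0.15):
    """Binary mask of likely lip pixels for a row-major RGB image."""
    return [[lip_likelihood(px) > threshold for px in row] for row in image]

skin = (200, 160, 140)   # typical skin tone: red and green both high
lip = (180, 90, 100)     # lips: red clearly above green
image = [[skin, lip], [lip, skin]]
mask = lip_mask(image)
```

An adaptive version, as in the abstract, would set the threshold from clustering statistics of the face rather than fixing it in advance.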

Recent years have seen a significant increase in borehole microseismic data acquisition programs associated with unconventional reservoir developments such as hydraulic fracturing programs for shale oil and gas. The data so acquired are used for hydraulic fracture monitoring and diagnostics; therefore, the quality of the data in terms of resolution and accuracy has a significant impact on its value to the industry. Borehole microseismic data acquired in such environments typically suffer from propagation effects due to the presence of thin interbedded shale layers, as well as noise and interference effects. Moreover, acquisition geometry has a significant impact on detectability across portions of the sensor array. Our work focuses on developing a robust first-arrival detection and pick-selection workflow for both P and S waves specifically designed for such environments. We introduce a novel workflow for refinement of picks with immunity to significant noise artifacts and applicability to data with very low signal-to-noise ratio, provided some accurate picks have already been made. This workflow uses a multi-step hybrid detection and classification routine built on a neural network based autopicker for initial picking and an evolutionary algorithm for pick refinement. We highlight the results from an actual field case study, including multiple examples demonstrating immunity to noise, and compare the effectiveness of the workflow with two contemporary autopicking routines without the application of the shared detection/refinement procedure. Finally, we use a windowed waveform cross-correlation based uncertainty estimation method for potential quality control purposes. While the workflow was developed to work with the neural network based autopicker, it can be used with any other traditional autopicker and provides significant improvements in pick detection across seismic gathers.
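
The workflow above is compared against traditional autopicking routines; the classical baseline in this area is an STA/LTA (short-term average over long-term average) trigger. The window lengths and threshold below are illustrative, not values from the study.

```python
def sta_lta_pick(trace, sta_len=3, lta_len=10, threshold=3.0):
    """Return the first sample index where the short-term average (STA)
    of signal energy exceeds `threshold` times the long-term average (LTA).

    A classical baseline autopicker; window lengths and threshold are
    illustrative, not tuned values from the study.
    """
    energy = [x * x for x in trace]
    for i in range(lta_len, len(trace) - sta_len + 1):
        lta = sum(energy[i - lta_len:i]) / lta_len
        sta = sum(energy[i:i + sta_len]) / sta_len
        if lta > 0 and sta / lta >= threshold:
            return i
    return None  # no arrival detected

# Quiet noise followed by a stronger arrival at sample 20; the trigger
# fires slightly early because the STA window spans the onset.
trace = [0.1] * 20 + [1.0] * 10
pick = sta_lta_pick(trace)
```

A neural-network picker, as in the abstract, replaces the fixed energy-ratio rule with a learned onset detector but plays the same role in the workflow.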

Biological microarrays with different proteins and different protein concentrations are detected without external labeling by an oblique-incidence reflectivity difference (OIRD) technique. Initial experimental results reveal that the intensities of the OIRD signals can distinguish between different proteins and different protein concentrations. The OIRD technique promises feasible applications in the life sciences for label-free, high-throughput detection.

Monocyte-specific antigens are relevant in renal and bone marrow transplantation, but a reproducible monocyte-antigen system has not yet been recognized. In order to establish a sensitive test system with reproducible results in monocyte serology, three different monocyte cytotoxicity techniques were compared. In our hands, the two-colour fluorescence test on post-Ficoll total leukocyte suspensions fulfilled the criteria. This technique was used to screen sera from multiparous women and renal transplant recipients for the presence of monocyte-specific antibodies. By testing sera on cells from individuals who were HLA-compatible with the serum donors, anti-HLA reactions were excluded. Several promising sera containing monocyte-specific antibodies were identified, indicating the success of our approach.

Image segmentation, commonly known as partitioning of an image, is one of the intrinsic parts of any image processing technique. In this image processing step, the digital image of choice is segregated into sets of pixels on the basis of predefined and preselected measures or standards. Many algorithms have been proposed for segmenting a digital image. This paper presents a general review of such algorithms.
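
As one concrete example of the algorithms such reviews cover, Otsu's method picks a global threshold that maximizes the between-class variance of the gray-level histogram:

```python
def otsu_threshold(pixels, levels=256):
    """Gray level that maximizes between-class variance (Otsu, 1979).

    `pixels` is a flat list of integer gray values in [0, levels).
    """
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    total_sum = sum(i * h for i, h in enumerate(hist))

    best_t, best_var = 0, -1.0
    w_bg, sum_bg = 0, 0.0
    for t in range(levels):
        w_bg += hist[t]          # background pixel count (values <= t)
        if w_bg == 0:
            continue
        w_fg = total - w_bg      # foreground pixel count
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        mean_bg = sum_bg / w_bg
        mean_fg = (total_sum - sum_bg) / w_fg
        var_between = w_bg * w_fg * (mean_bg - mean_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Two well-separated intensity clusters: the threshold lands at the
# upper edge of the dark cluster, splitting the image cleanly.
pixels = [10] * 50 + [200] * 50
t = otsu_threshold(pixels)
```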

Bladder cancer is the fourth most common malignant neoplasm in men and the eighth in women. Bladder pathology is usually investigated visually by cystoscopy. In this technique, biopsies are obtained from the suspected area and, after the needed procedure, the diagnostic information can be taken. This is a relatively difficult procedure and is associated with discomfort for the patient and with morbidity. Therefore, electrical impedance spectroscopy (EIS), a minimally invasive screening technique, can be used to separate malignant from nonmalignant areas in the urinary bladder. The feasibility of adapting this technique to screen for bladder cancer and abnormalities during cystoscopy has been explored and compared with histopathological evaluation of urinary bladder lesions. Ex vivo studies were carried out using a total of 30 measured points from malignant areas and 100 measured points from non-malignant areas of patients' bladders, with biopsy reports matched to the electrical impedance measurements. In all measurements, the impedivity of the malignant area of bladder tissue was significantly higher than the impedivity of the non-malignant area of this tissue (P < 0.005).

Phishing is the combination of social engineering and technical exploits designed to convince a victim to provide personal information, usually for the monetary gain of the attacker. Phishing has become the most popular practice among criminals on the Web. Phishing attacks are becoming more frequent and sophisticated, and their impact is drastic and significant since they can involve identity theft and financial losses. Phishing scams have become a problem for online banking and e-commerce users. In this paper we propose a novel approach to detect phishing attacks. We implemented a prototype web browser which can be used as an agent and processes each arriving email for phishing attacks. Using email data collected over a period of time, we demonstrate that our approach is able to detect more phishing attacks than existing schemes.
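
The abstract does not spell out the detection features; as a hedged illustration, a minimal lexical scorer of the kind often used against phishing email might look like this (the rules and phrases are assumptions, not the paper's agent):

```python
import re

def phishing_score(email_text):
    """Count simple lexical red flags often used in phishing detection.

    The rules below are illustrative heuristics, not the classifier
    from the paper.
    """
    score = 0
    # Raw IP addresses in links instead of domain names.
    if re.search(r'https?://\d{1,3}(?:\.\d{1,3}){3}', email_text):
        score += 1
    # Urgency phrases that pressure the victim to act.
    if re.search(r'\b(verify your account|urgent|suspended)\b', email_text, re.I):
        score += 1
    # Requests for credentials.
    if re.search(r'\b(password|credit card|ssn)\b', email_text, re.I):
        score += 1
    return score

legit = "Meeting notes attached; see http://example.com/agenda"
phish = ("URGENT: your account is suspended. Verify your password "
         "at http://192.168.13.7/login")
```

A real agent would combine many such signals with URL reputation and learned weights rather than a fixed rule count.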

A brief literature review on immunoassay of yeast cell wall antigens is given. Special attention is paid to extracellular, thermostable yeast antigens (EPS), which are released into the growth medium by many yeast species. The EPS of Saccharomyces cerevisiae and of Stephanoascus ciferrii (syn. Candida ciferrii) could be specifically and sensitively detected by a sandwich ELISA, using an IgG raised in rabbits immunized with the EPS of these yeasts. The EPS ELISAs of the three basidiomycetous yeasts tested were not specific; that of Geotrichum candidum was genus-specific but not sensitive. The EPS of Zygosaccharomyces bailii could be detected in a highly specific competitive ELISA but not in a sandwich ELISA or a latex agglutination test.

Networking technologies are expanding rapidly to meet worldwide communication requirements, and the rapid growth of network technologies and the pervasiveness of communications pose serious security issues. In this paper, we aim to develop an integrated network defense system with situation awareness capabilities to present useful information to human analysts. In particular, we implement a prototypical system that includes both distributed passive and active network sensors and traffic visualization features, such as 1D, 2D and 3D network traffic displays. To detect attacks effectively, we also implement algorithms to transform real-world IP address data into images, study the patterns of attacks, and apply both a discrete wavelet transform (DWT) based scheme and a statistics-based scheme to detect attacks. Through an extensive simulation study, our data validate the effectiveness of the implemented defense system.
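
As a sketch of the DWT-based idea, a one-level Haar transform concentrates abrupt traffic changes in the detail band, whose energy can then be thresholded. The threshold and traces below are illustrative, not the paper's settings.

```python
def haar_step(signal):
    """One level of the Haar DWT: returns (approximation, detail) coefficients."""
    root2 = 2 ** 0.5
    approx = [(signal[i] + signal[i + 1]) / root2
              for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / root2
              for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def dwt_anomaly(signal, threshold=5.0):
    """Flag a traffic trace whose Haar detail-band energy exceeds `threshold`.

    Abrupt changes (e.g., the start of a flood) concentrate energy in the
    detail coefficients; the threshold here is an illustrative assumption.
    """
    _, detail = haar_step(signal)
    return sum(d * d for d in detail) > threshold

steady = [10.0] * 8                                        # flat traffic
burst = [10.0, 10.0, 10.0, 80.0, 10.0, 10.0, 10.0, 10.0]   # sudden spike
```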

As a platform-independent software system, a Web service is designed to offer interoperability among diverse and heterogeneous applications. With the introduction of service composition in Web service creation, the various message interactions among atomic services result in a problem resembling the feature interaction problem in the telecommunications area. This article defines the problem as feature interaction in Web services and proposes a model checking-based detection method. In the method, the Web service description is translated into Promela, the input language of the model checker SPIN (Simple Promela Interpreter), and specific properties, expressed as linear temporal logic (LTL) formulas, are formulated according to our classification of feature interactions. SPIN is then used to check these properties to detect feature interactions in Web services.

In this paper, a fast and robust voltage sag detection algorithm, named VPS2D, is introduced. Using the DSOGI, the algorithm creates a virtual positive-sequence voltage and monitors the fundamental voltage component of each phase. After calculating the aggregate value in the αβ reference frame, the algorithm can rapidly identify the start and the end of symmetric and asymmetric voltage sags, even in the presence of grid harmonics. Simulation and experimental results are given to validate the proposed algorithm.
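
The DSOGI and the virtual positive-sequence construction are not reproduced here; the following sketch shows only the underlying αβ (Clarke) aggregation and a magnitude-based sag check. The 90% limit is the common definitional boundary for sags, not necessarily the paper's setting.

```python
import math

def clarke(va, vb, vc):
    """Amplitude-invariant Clarke transform: phase values -> (alpha, beta)."""
    v_alpha = (2.0 / 3.0) * (va - 0.5 * vb - 0.5 * vc)
    v_beta = (2.0 / 3.0) * (math.sqrt(3) / 2.0) * (vb - vc)
    return v_alpha, v_beta

def sag_detected(va, vb, vc, nominal=1.0, sag_threshold=0.9):
    """Flag a sag when the aggregate alpha-beta magnitude drops below
    90% of nominal (a common definitional limit for voltage sags)."""
    v_alpha, v_beta = clarke(va, vb, vc)
    return math.hypot(v_alpha, v_beta) < sag_threshold * nominal

def phases(amplitude, theta):
    """Balanced three-phase sample at electrical angle `theta` (radians)."""
    return (amplitude * math.cos(theta),
            amplitude * math.cos(theta - 2 * math.pi / 3),
            amplitude * math.cos(theta + 2 * math.pi / 3))

# For a balanced system the alpha-beta magnitude equals the phase
# amplitude, so a 50% sag is flagged at any sampling instant.
sagged = sag_detected(*phases(0.5, 0.7))
```

Because the αβ magnitude of a balanced set is constant over the cycle, this check responds within a sample rather than waiting for an RMS window, which is the speed advantage the abstract highlights.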

With the increase in financial accounting fraud in the current economic scenario, financial accounting fraud detection has become an emerging topic of great importance for academia, research and industry. Financial fraud is a deliberate act, contrary to law, rule or policy, committed with intent to obtain unauthorized financial benefit through intentional misstatement or omission of amounts, deceiving users of financial statements, especially investors and creditors. Data mining tec...

Systematic measurements were carried out on the possible use of elastically backscattered Pu-Be neutrons combined with the thermal neutron reflection method for the identification of land mines and illicit drugs via the detection of H, C, N, and O as their major constituent elements. While our present results show that these methods are capable of indicating anomalies in bulky materials and observing the major elements, determination of the exact atom fractions needs further investigation.

The paper summarizes the growth regularity and related factors of collapse columns in Duongpang Coal Mine, introduces the applicability of roadway exploration, drilling and geophysical prospecting methods, and expounds how to select an economical and rapid exploration method, according to the characteristics of each method and differing geological conditions, for detecting the location, shape, size and water-bearing properties of a collapse column. 4 figs.

In this paper, we focus on cases where the plume is large (relative to the image), and provide a method for handling this scenario. The method we develop… the locations of the events, the operation in (11) is a convolution of a binary image with a filter function h. To get an estimate of the probability… background statistics, including the mean and covariance. Diffuse plumes with a large spatial extent are particularly difficult to detect in single-image…

Down syndrome is the most commonly occurring chromosomal condition; one in every 691 babies in the United States is born with it. Patients with Down syndrome have an increased risk of heart defects, respiratory and hearing problems, and early detection of the syndrome is fundamental for managing the disease. Clinically, facial appearance is an important indicator in diagnosing Down syndrome, and it paves the way for computer-aided diagnosis based on facial image analysis. In this study, we propose a novel method to detect Down syndrome from photographs for computer-assisted, image-based facial dysmorphology. Geometric features based on facial anatomical landmarks and local texture features based on the Contourlet transform and local binary patterns are investigated to represent facial characteristics. A support vector machine classifier is then used to discriminate normal and abnormal cases; accuracy, precision and recall are used to evaluate the method. The comparison among the geometric, local texture and combined features was performed using leave-one-out validation. Our method achieved 97.92% accuracy with high precision and recall for the combined features; the detection results were higher than using only geometric or texture features. These promising results indicate that our method has the potential for automated assessment of Down syndrome from simple, noninvasive imaging data.

Microarray technology is a powerful genomic tool for simultaneously studying and analyzing the behavior of thousands of genes. The analysis of images obtained from this technology plays a critical role in the detection and treatment of diseases. The aim of the current study is to develop an automated system for analyzing data from microarray images in order to detect cancerous cases. The proposed system consists of three main phases, namely image processing, data mining, and detection of the disease. The image processing phase performs operations such as refining image rotation, gridding (locating genes) and extracting raw data from images; the data mining phase includes normalizing the extracted data and selecting the most effective genes. Finally, cancerous cells are recognized from the extracted data. To evaluate the performance of the proposed system, a microarray database is employed that includes breast cancer, myeloid leukemia and lymphoma data from the Stanford Microarray Database. The results indicate that the proposed system is able to identify the type of cancer from the data sets with accuracies of 95.45%, 94.11%, and 100%, respectively.

Super-thin clouds with optical depth smaller than ~0.3 exist globally and have a significant effect on satellite remote sensing of surface temperature and atmospheric composition, but are extremely difficult to detect with satellite instruments. In this presentation, we report a novel method for detecting cloud particles in the atmosphere by measuring the polarized sunlight from the Earth-atmosphere system (Sun et al., 2014; Sun et al., 2015). We examined solar radiation backscattered from clouds with both satellite data and a radiative-transfer model. A distinct feature was found in the angle of linear polarization of solar radiation that is scattered from clouds at near-backscattering angles. The dominant electric field from the clear-sky Earth-atmosphere system is nearly parallel to the Earth's surface at these scattering angles. However, when clouds are present, this electric field can rotate significantly away from the parallel direction. Our modeling results suggest that this polarization feature can be used to detect super-thin cirrus clouds having an optical depth of only ~0.06 and super-thin liquid water clouds having an optical depth of only ~0.01. Such clouds are too thin to be sensed by any current passive satellite instrument. This method could improve the detection of super-thin clouds and have a tremendous impact on the remote sensing of clouds, aerosols, sea surface temperature and atmospheric composition gases, as well as on climate modeling. It also has the potential to become the basis of an innovative satellite mission to advance Earth observation from space and improve scientific understanding of all clouds and cloud-aerosol interactions. References: Wenbo Sun, Gorden Videen, and Michael I. Mishchenko, "Detecting super-thin clouds with polarized sunlight," Geophys. Res. Lett. 41, doi:10.1002/2013GL058840 (2014); Wenbo Sun, Rosemary R. Baize, Gorden Videen, Yongxiang Hu, and Qiang Fu, "A method to retrieve super-thin cloud optical depth over ocean background with polarized…
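
The detection signature is a rotation of the angle of linear polarization (AoLP), which is computed from the Stokes parameters Q and U as AoLP = ½·atan2(U, Q). A minimal sketch with an illustrative rotation threshold (not a value from the cited studies):

```python
import math

def aolp_degrees(q, u):
    """Angle of linear polarization (degrees) from Stokes parameters Q and U."""
    return 0.5 * math.degrees(math.atan2(u, q))

def cloud_flag(q, u, clear_sky_aolp=0.0, rotation_limit=5.0):
    """Flag a scene whose AoLP has rotated more than `rotation_limit`
    degrees away from the clear-sky direction. The threshold is an
    illustrative placeholder, not a value from the cited studies."""
    return abs(aolp_degrees(q, u) - clear_sky_aolp) > rotation_limit

# Clear sky: polarization essentially aligned with the reference (U ~ 0).
clear = cloud_flag(0.10, 0.001)
# Cloud present: the electric field rotates, producing a sizable U component.
cloudy = cloud_flag(0.10, 0.05)
```

The appeal of the AoLP is that, as an angle ratio of Stokes parameters, it stays measurable even when the cloud's contribution to total radiance is tiny.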

An intrusion detection system (IDS) is regarded as the second line of defense against network anomalies and threats, and plays an important role in network security. Many techniques are used to design IDSs for specific scenarios and applications, and artificial intelligence techniques are widely used for threat detection. This paper presents a critical study of genetic algorithm, artificial immune, and artificial neural network (ANN) based IDS techniques used in wireless sensor netw...

Low-energy neutrino physics and astrophysics has been one of the most active fields of particle physics research over the past two decades, achieving important and sometimes unexpected results which have paved the way for a bright future of further exciting studies. The methods, techniques and technologies employed in the construction of the many experiments that have acted as important players in this area of investigation have been crucial to reaching the many accumulated physics successes. The topic covered in this review is thus a description of the main features of the set of methodologies underlying the design, construction and operation of low-energy neutrino detectors.

Lorentz force eddy current testing (LET) is a novel nondestructive testing technique which is preferably applied to the identification of internal defects in nonmagnetic moving conductors. LET is compared, under similar testing conditions, with classical eddy current testing (ECT). Numerical FEM simulations have been performed to analyze the measurements as well as the identification of internal defects in nonmagnetic conductors. The results are compared with measurements to test the feasibility of defect identification. Finally, the use of LET measurements to estimate the electrical conductivity of the conductors under test is described as well.

Agarose isoelectric focusing followed by blotting with nitrocellulose, nylon or polyvinylidene difluoride membranes, and immunochemical detection of cerebrospinal fluid IgG with various combinations of antisera, was evaluated. Polyvinylidene difluoride proved to be an easy-to-handle and reliable membrane for protein blotting. Among the immunochemical visualization reactions, the most sensitive employed biotinylated goat anti-human IgG followed by a streptavidin-colloidal gold conjugate and silver enhancement in 20% w/v urea, allowing a sensitivity of less than 1 picogram of IgG per band.

This paper briefly reviews previous work on the negative selection algorithm and its shortcomings. With a view to the real problem to be solved, the authors put forward two assumptions, based on which a new immune algorithm, the multi-level negative selection algorithm, is developed. In essence, compared with Forrest's negative selection algorithm, it enhances detector generation efficiency. The algorithm integrates the clonal selection process into the negative selection process for the first time. After careful analysis, the algorithm was applied to network intrusion detection and achieved good results.
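
For reference, Forrest's classic negative selection scheme, which the paper builds on, can be sketched as follows using r-contiguous-bits matching. The multi-level and clonal refinements are not reproduced, and all parameters are illustrative:

```python
import random

def r_contiguous_match(a, b, r):
    """True if bit strings a and b agree on at least r contiguous positions."""
    run = 0
    for x, y in zip(a, b):
        run = run + 1 if x == y else 0
        if run >= r:
            return True
    return False

def generate_detectors(self_set, n_detectors, length, r, rng):
    """Classic (Forrest-style) negative selection: keep random candidates
    that match no 'self' string. The multi-level/clonal refinements from
    the paper are not reproduced here."""
    detectors = []
    while len(detectors) < n_detectors:
        candidate = ''.join(rng.choice('01') for _ in range(length))
        if not any(r_contiguous_match(candidate, s, r) for s in self_set):
            detectors.append(candidate)
    return detectors

def is_anomalous(sample, detectors, r):
    """A sample is anomalous if any detector matches it."""
    return any(r_contiguous_match(sample, d, r) for d in detectors)

rng = random.Random(0)
self_set = ['00000000', '00000001']
detectors = generate_detectors(self_set, 5, 8, 4, rng)
```

The inefficiency the paper targets is visible here: candidates are generated blindly and discarded whenever they match self, which wastes work as the self set grows.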

Principles and techniques of some neutron-based methods used to identify antipersonnel landmines (APMs) are discussed. New results have been achieved in the field of neutron reflection, transmission, scattering and reaction techniques. Some conclusions are as follows: The neutron hand-held detector is suitable for the observation of the anomaly caused by a DLM2-like sample in different soils with a scanning speed of 1 m²/1.5 min; the reflection cross section of thermal neutrons rendered the determination of the equivalent thickness of different soil components possible; a simple method was developed for the determination of the thermal neutron flux perturbation factor needed for multi-elemental analysis of bulky samples; unfolded spectra of elastically backscattered neutrons using broad-spectrum sources render the identification of APMs possible; the knowledge of leakage spectra of different source neutrons is indispensable for the determination of the differential and integrated reaction rates and, through them, the dimension of the interrogated volume; the precise determination of the C/O atom fraction requires investigations of the angular distribution of the 6.13 MeV gamma-ray emitted in the ¹⁶O(n,n'γ) reaction. These results, in addition to the identification of landmines, render the improvement of non-intrusive neutron methods possible.

Distributed collections are made of metadata entries that contain references to artifacts not controlled by the collection curators. These collections often support only limited forms of change; for digital distributed collections, primarily the creation and deletion of additional resources. However, there exists a class of digital collection that undergoes additional kinds of change. These collections consist of resources that are distributed across the Internet and brought together via hyperlinking, and resources in them can be expected to change as time goes on. Part of the difficulty in maintaining these collections is determining whether a changed page is still a valid member of the collection. Others have tried to address this by defining a maximum allowed threshold of change; however, these methods treat change as a potential problem and treat web content as static despite its intrinsic dynamism. Instead, we acknowledge change on the web as a normal part of a web document's life and determine the difference between what a maintainer expects a page to do and what it actually does. In this work we evaluate options for extractors and analyzers from a suite of techniques against a human-generated ground-truth set of blog changes. The results of this work show a statistically significant improvement over traditional threshold techniques for our collection.
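
A traditional threshold technique of the kind the paper argues against can be sketched with shingle-based Jaccard similarity (the shingle size and change threshold below are illustrative):

```python
def shingles(text, k=3):
    """Set of k-word shingles from whitespace-tokenized text."""
    words = text.split()
    return {' '.join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def still_member(original, current, max_change=0.5):
    """Traditional threshold test: the page stays in the collection while
    its similarity to the archived copy remains above 1 - max_change."""
    return jaccard(shingles(original), shingles(current)) >= 1.0 - max_change

archived = "open source tools for digital preservation of web collections"
minor_edit = "open source tools for digital preservation of large web collections"
rewrite = "buy cheap watches online today best prices guaranteed free shipping"
```

The weakness the paper identifies is visible in the design: any sufficiently large edit is treated as invalidating, even when the page still serves its expected purpose.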

Four different laser-based techniques were applied to study physical and chemical characteristics of biomolecules and dye molecules: hole burning spectroscopy, single molecule spectroscopy, time-resolved coherent anti-Stokes Raman spectroscopy and laser-induced fluorescence microscopy. Results from hole burning and single molecule spectroscopy suggested that two antenna states (C708 and C714) of photosystem I from the cyanobacterium Synechocystis PCC 6803 are connected by effective energy transfer, with a corresponding energy transfer time of ~6 ps. In addition, results from hole burning spectroscopy indicated that the chlorophyll dimer of the C714 state has a large distribution of dimer geometries. Direct observation of vibrational peaks and their evolution for coumarin 153 in the electronic excited state was demonstrated using fs/ps CARS, a variation of time-resolved coherent anti-Stokes Raman spectroscopy. In three different solvents, methanol, acetonitrile and butanol, a vibrational peak related to the stretch of the carbonyl group exhibits different relaxation dynamics. Laser-induced fluorescence microscopy, along with biomimetic containers (liposomes), allows the measurement of the enzymatic activity of individual alkaline phosphatase molecules from bovine intestinal mucosa without potential interference from glass surfaces. The result showed a wide distribution of enzyme reactivity; protein structural variation is one of the major reasons responsible for this highly heterogeneous behavior.

In the frame of the French trans-governmental R&D program against chemical, biological, radiological, nuclear and explosives (CBRN-E) threats, CEA is studying the detection of Special Nuclear Materials (SNM) by neutron interrogation with fast neutrons produced by an associated-particle sealed-tube neutron generator. The deuterium-tritium fusion reaction produces an alpha particle and a 14 MeV neutron almost back to back, allowing neutron emission to be tagged both in time and direction with an alpha-particle position-sensitive sensor embedded in the generator. Fission prompt neutrons and gamma rays induced by the tagged neutrons are detected in coincidence with plastic scintillators. This paper presents numerical simulations performed with the MCNP-PoliMi Monte Carlo computer code and with post-processing software developed with the ROOT data analysis package. False coincidences due to neutron and photon scattering between adjacent detectors (cross talk) are filtered out to increase the selectivity between nuclear and benign materials. Accidental coincidences, which are not correlated to an alpha particle, are also taken into account in the numerical model, as well as counting statistics and the time-energy resolution of the data acquisition system. Such realistic calculations show that relevant quantities of SNM (a few kg) can be distinguished from cargo and shielding materials in 10 min acquisitions. First laboratory tests of the system under development in CEA laboratories are also presented.

A wireless sensor network is a set of distributed sensor nodes, randomly deployed in a geographical area to capture climatic changes such as temperature, humidity and pressure. Within wireless networking, a MANET (Mobile Ad-Hoc Network) is a self-configurable network: a collection of wireless mobile nodes that dynamically move from one location to another, without a static structure. MANETs are subject to both active and passive attacks, and securing wireless networks is much more difficult than securing wired ones; in the last few years researchers have faced many security and attack issues in MANETs. Attacks found in MANETs include packet dropping, black-hole, denial of service, wormhole and packet modification attacks, all of which can access data without permission during communication. To counter these attacks and secure data communication, intrusion detection systems are used. This paper presents a survey of different kinds of attacks on MANETs and wireless sensor networks, intended to help young researchers implement new hybrid algorithms for secure intrusion detection in MANETs.

Effective identification of lung cancer at an initial stage is an important and crucial aspect of image processing, and several data mining methods have been used to detect lung cancer early. In this paper, an approach is presented that diagnoses lung cancer at an initial stage using CT scan images in DICOM (DCM) format. One of the key challenges is to remove white Gaussian noise from the CT scan image, which is done using a non-local means filter; Otsu's thresholding is used to segment the lung. Textural and structural features are extracted from the processed image to form a feature vector. Three classifiers, namely SVM, ANN and k-NN, are applied for the detection of lung cancer and the determination of disease severity (stage I or stage II), and the classifiers are compared with respect to quality attributes such as accuracy, sensitivity (recall), precision and specificity. The results show that SVM achieves the highest accuracy of 95.12%, while ANN achieves 92.68% on the given data set and k-NN shows the lowest accuracy of 85.37%. The SVM classifier, at 95.12% accuracy, helps patients take remedial action on time and reduces the mortality rate of this deadly disease.
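
Of the three classifiers, k-NN is simple enough to sketch in full. The feature vectors below are synthetic stand-ins for the textural and structural features described, not data from the study:

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Majority vote among the k nearest training points (Euclidean).

    `train` is a list of (feature_vector, label) pairs; the feature values
    here are synthetic stand-ins for the textural/structural features
    described in the paper.
    """
    neighbors = sorted(train, key=lambda fv_y: math.dist(fv_y[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

def accuracy(train, test, k=3):
    """Fraction of test pairs whose predicted label matches the true one."""
    correct = sum(knn_predict(train, fv, k) == y for fv, y in test)
    return correct / len(test)

# Synthetic 2-D features: stage I clusters near (1, 1), stage II near (5, 5).
train = [((1.0, 1.1), 'I'), ((0.9, 1.0), 'I'), ((1.2, 0.8), 'I'),
         ((5.0, 5.1), 'II'), ((4.8, 5.0), 'II'), ((5.2, 4.9), 'II')]
test = [((1.1, 1.0), 'I'), ((4.9, 5.2), 'II')]
acc = accuracy(train, test)
```

SVM and ANN replace the distance-vote rule with a learned decision boundary, which is why they outperform k-NN on the paper's less cleanly separated real features.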

The problem of reliable detection of life-threatening situations in the neurosurgical patient undergoing treatment in the ICU is still far from reaching a satisfactory solution, although several methods of clinical and instrumental evaluation have recently been developed for the early detection of oncoming signs of danger. Continuous monitoring of intracranial pressure (ICP) provides neurosurgeons with valuable information about the current condition of the patient. However, it is increasingly felt that traditional methods of extracting information from the ICP signal have reached their natural limits, mostly because of difficulties in fitting the appropriate mathematical model to this non-linear and non-stationary process. Successful implementations of artificial neural networks in many medical tasks have encouraged the application of this method of ICP processing. Two problems are considered: the prediction of trends in ICP, and recognition of the configuration of unfavourable symptoms likely to signal danger for the neurosurgical patient. The construction of neural network predictors of ICP trends is based on wavelet pre-processing of the original signal. The approach to the second task involves pre-processing of the ICP with spectral and statistical methods and classification of the extracted features of the current signal on an arbitrarily selected scale of danger.

Thin-layer chromatography (TLC) is a broadly applicable separation technique. It offers many advantages over high performance liquid chromatography (HPLC): it is easily adapted for two-dimensional separation, for "whole-column" detection, and for handling multiple samples. However, because its detection techniques have developed more slowly than those of HPLC, TLC has not received the attention it deserves. Exploring new detection techniques is therefore very important to the development of TLC. The principal aim of this dissertation is to present a new detection method for TLC -- indirect fluorometric detection. This detection technique is universally applicable, sensitive, nondestructive, and simple, and is described in detail in Sections 1 through 5. Sections 1 and 3 describe the indirect fluorometric detection of anions and nonelectrolytes in TLC. In Section 2, a detection method for cations based on fluorescence quenching of ethidium bromide is presented. In Section 4, a simple and instructive TLC experiment is designed in which three different fluorescence detection principles are used for the determination of caffeine, saccharin, and sodium benzoate in beverages. A laser-based indirect fluorometric detection technique for TLC is developed in Section 5. Section 6 differs from Sections 1 through 5: an ultrasonic effect on the separation of DNA fragments in agarose gel electrophoresis is investigated.

Positron annihilation spectroscopy (PAS) is one of the nuclear techniques used in materials science. The present measurements are used to study the behavior of defect concentration in the 7075 alloy, one of the most important aluminium alloys. It has been shown that positrons can become trapped at imperfections in solids and that their mean lifetime can be influenced by changes in the concentration of such defects. No changes were observed in the mean lifetime values after the saturation of defect concentration. The mean lifetime and trapping rates were studied for samples deformed by up to 58.3%. The defect concentration ranges from 10^15 to 10^18 cm^-3 at thickness reductions from 2.3 to 58.3%. The dislocation density varies from 10^8 to 10^11 cm/cm^3.

Electromyography (EMG) signals can be used for clinical/biomedical applications, Evolvable Hardware Chip (EHW) development, and modern human computer interaction. EMG signals acquired from muscles require advanced methods for detection, decomposition, processing, and classification. The purpose of this paper is to illustrate the various methodologies and algorithms for EMG signal analysis that provide efficient and effective ways of understanding the signal and its nature. We further point out some of the hardware implementations using EMG, focusing on applications related to prosthetic hand control, grasp recognition, and human computer interaction. A comparative study is also given to show the performance of various EMG signal analysis methods. This paper provides researchers a good understanding of EMG signals and their analysis procedures. This knowledge will help them develop more powerful, flexible, and efficient applications. PMID:16799694
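As a minimal illustration of the processing stage, the sketch below computes a few time-domain features commonly used before EMG classification (mean absolute value, RMS, zero crossings, waveform length); the synthetic 50 Hz burst and the threshold value are assumptions for the example only:

```python
import numpy as np

def emg_features(signal, threshold=0.01):
    """Common time-domain EMG features used ahead of a classifier."""
    mav = np.mean(np.abs(signal))                     # mean absolute value
    rms = np.sqrt(np.mean(signal ** 2))               # root mean square
    # zero crossings, gated by a small noise threshold
    crossings = np.sum((signal[:-1] * signal[1:] < 0) &
                       (np.abs(signal[:-1] - signal[1:]) > threshold))
    wl = np.sum(np.abs(np.diff(signal)))              # waveform length
    return {"MAV": mav, "RMS": rms, "ZC": int(crossings), "WL": wl}

# synthetic burst: a 50 Hz oscillation standing in for a muscle contraction
t = np.linspace(0, 1, 1000, endpoint=False)
burst = 0.5 * np.sin(2 * np.pi * 50 * t)
feats = emg_features(burst)
```

Feature vectors like this one are what the surveyed classifiers operate on.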

Background Pemphigus is part of a group of blistering diseases that affect the skin and mucous membranes. Reflecting its autoimmune origin, autoantibodies develop in pemphigus that are directed toward cell surface components of keratinocytes. However, some data cannot be explained by autoantibodies alone, such as the lack of a relationship between autoantibody levels and the severity of clinical manifestations, treatment resistance, the presence of inflammatory infiltrates, and the potential occurrence of apoptosis as determinants of vesicle formation. Objective To examine the presence of apoptosis in pemphigus vulgaris by the TUNEL technique. Methods In this cross-sectional study, we selected 15 paraffin-embedded tissues from subjects who were diagnosed with pemphigus vulgaris by hematoxylin and eosin staining. The samples were subjected to the TUNEL assay and examined under an Olympus BX61 fluorescence microscope. Positivity was categorized dichotomously, and the statistical analysis was performed using the chi-square test. Results Positivity was observed in basal layer cells in 14 (93.3%) cases. In 13 (86.7%) of the positive cases, positivity was noted in the spinosum and granular layers that formed the blister roof, and in 12 cases (80%), positive acantholytic cells were observed. Conclusions TUNEL positivity was observed in pemphigus vulgaris, implicating apoptosis in the pathophysiology of this condition, which can help guide the development of apoptotic blockers as therapeutics. PMID:27438195

This paper describes state-of-the-art digital filtering techniques that are part of GEANA, an automatic data analysis software used for the GERDA experiment. The discussed filters include a novel, nonlinear correction method for ballistic deficits, which is combined with one of three shaping filters: a pseudo-Gaussian, a modified trapezoidal, or a modified cusp filter. The performance of the filters is demonstrated with a 762 g Broad Energy Germanium (BEGe) detector, produced by Canberra, that measures γ-ray lines from radioactive sources in an energy range between 59.5 and 2614.5 keV. At 1332.5 keV, together with the ballistic deficit correction method, all filters produce a comparable energy resolution of ~1.61 keV FWHM. This value is superior to those measured by the manufacturer and those found in publications with detectors of a similar design and mass. At 59.5 keV, the modified cusp filter without a ballistic deficit correction produced the best result, with an energy resolution of 0.46 keV. It is observed that the loss in resolution by using a constant shaping time over the entire energy range is small when using the ballistic deficit correction method.
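A bare-bones version of a trapezoidal shaping filter, one of the three shapers mentioned, can be written as a recursive double moving-sum; the pulse shape, rise, and flat-top lengths below are illustrative, and a real preamplifier trace would additionally need pole-zero (decay) correction, which this sketch omits:

```python
import numpy as np

def trapezoidal_filter(x, rise, flat):
    """Symmetric trapezoidal shaper (double moving-sum form).

    rise: ramp length in samples; flat: flat-top length in samples.
    Works on step-like pulses; exponentially decaying traces need an
    additional pole-zero correction not shown here.
    """
    k, l = rise, rise + flat
    xp = np.pad(x, (k + l, 0))                 # zero history before the trace
    d = xp[k + l:] - xp[l:-k] - xp[k:-l] + xp[:-(k + l)]
    return np.cumsum(d) / k                    # flat top normalized to pulse height

# a noiseless unit step standing in for a detector pulse at sample 50
pulse = np.zeros(200)
pulse[50:] = 1.0
shaped = trapezoidal_filter(pulse, rise=20, flat=10)
```

The output ramps up over 20 samples, holds a flat top of 10 samples at the pulse amplitude (where the energy would be sampled), and returns to baseline.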

Deterioration of asphalt road pavements is inevitable throughout their life cycle. Several types of deterioration take place on these surfaces, such as surface defects and deformations. One of the most common asphalt defects is cracking. Fatigue, transverse, longitudinal, reflective, edge, block, and slippage cracking can be observed anywhere in the world. Monitoring and preventative/periodic maintenance of these types of wear are two very important actions that must take place to avoid "costly" solutions. This paper aims to introduce the spectral characteristics of uncracked (healthy) and cracked asphalt surfaces, from which a new asphalt crack index can be derived. This is performed through remote sensing applications in the area of asphalt pavements. Multispectral images can be processed using the index to enhance crack marks on asphalt surfaces. Ground spectral signatures were acquired from both uncracked and cracked asphalted areas of Cyprus (Limassol). Separability indices can be used to identify the optimum wavelength regions that best distinguish uncracked and cracked asphalt surfaces. The results revealed that the spectral sensitivity for the enhancement of cracked asphalt was detected using the Euclidean, Mahalanobis, and Cosine Distance Indices in the Vis range (approximately at 450 nm) and in the SWIR 1 range (approximately at 1750 nm).
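The three separability indices named above reduce to short vector formulas; the reflectance values and per-band covariance below are hypothetical placeholders, not the Cyprus measurements:

```python
import numpy as np

def euclidean_sep(a, b):
    return float(np.linalg.norm(a - b))

def cosine_distance(a, b):
    return float(1.0 - np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def mahalanobis_sep(a, b, cov):
    d = a - b
    return float(np.sqrt(d @ np.linalg.inv(cov) @ d))

# hypothetical mean reflectances of healthy vs cracked asphalt in the two
# wavelength regions highlighted by the study (~450 nm and ~1750 nm)
healthy = np.array([0.12, 0.25])
cracked = np.array([0.08, 0.15])
cov = np.diag([0.0004, 0.0009])   # assumed per-band variance of ground spectra

e = euclidean_sep(healthy, cracked)
c = cosine_distance(healthy, cracked)
m = mahalanobis_sep(healthy, cracked, cov)
```

The wavelength pair giving the largest index values would be the "optimum region" in the study's sense.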

Formestane (4-hydroxy-androstenedione) is an aromatase inhibitor prohibited in sports and included, since 2004, in the list of prohibited substances updated yearly by the World Anti-Doping Agency (WADA). Since endogenous production of formestane has been described, anti-doping laboratories must use isotope ratio mass spectrometry (IRMS) to establish an exogenous origin before issuing an adverse analytical finding. The described IRMS methods for formestane detection are time-consuming, usually requiring two consecutive liquid chromatographic sample purifications to obtain final extracts of adequate purity before mass spectrometric analysis. After establishing a procedure for determining the origin of formestane by IRMS without the need for derivatization, integrated into the laboratory's overall analytical strategy for pseudo-endogenous steroids, formestane metabolites were analyzed by gas chromatography-mass spectrometry (GC-MS) and gas chromatography-tandem mass spectrometry (GC-MS/MS) to investigate whether other biomarkers of formestane abuse could be integrated to avoid time-consuming and expensive IRMS confirmations. From the metabolic studies performed, the inclusion of 3β,4α-dihydroxy-5α-androstan-17-one (4α-hydroxy-epiandrosterone) in routine GC-MS procedures has proved diagnostic, reducing the number of unnecessary confirmations of the endogenous origin of formestane.

This paper reports how Electro-Optics (EO) technologies such as thermal and hyperspectral [1-3] imaging methods can be used for the remote detection of stress. Emotional or physical stresses induce a surge of adrenaline in the blood stream under the command of the sympathetic nervous system, which cannot be suppressed by training. The onset of this elevated level of adrenaline triggers a number of physiological chain reactions in the body, such as dilation of the pupil and an increased feed of blood to muscles. The capture of physiological responses, specifically the increase of blood volume near the pupil, has been reported in Pavlidis's pioneering thermal imaging work [4-7], which showed a remarkable increase of skin temperature in the periorbital region at the onset of stress. Our data show that other areas such as the forehead, neck, and cheek also exhibit elevated skin temperatures depending on the type of stressor. We have also observed thermal patterns due to physical exercise that are very similar to those induced by other physical stressors, apparently in contradiction to Pavlidis's work [8]. Furthermore, we have found patches of elevated temperature in the forehead forming patterns characteristic of the type of stressor, depending on whether it is physical or emotional in origin. These stress-induced thermal patterns are quite distinct from those resulting from high fever.

The Laser Interferometer Space Antenna (LISA) mission will detect gravitational waves from galactic and extragalactic sources, most importantly those involving supermassive black holes. The primary goal of this project is to investigate stability and robustness issues associated with LISA interferometry. We specifically propose to study systematic errors arising from: optical misalignments, optical surface errors, thermal effects and pointing tolerances. This report covers the first fiscal year of the grant, from January 1st to December 31st 1999. We have employed an optical modeling tool to evaluate the effect of misplaced and misaligned optical components. Preliminary results seem to indicate that positional tolerances of one micron and angular tolerances of 0.6 millirad produce no significant effect on the achievable contrast of the interference pattern. This report also outlines research plans for the second fiscal year of the grant, from January 1st to December 31st 2000. Since the work under NAG5-6880 has gone more rapidly than projected, our test bed interferometer is operational, and can be used for measurements of effects that cause beam motion. Hence, we will design, build and characterize a sensor for measuring beam motion, and then install it. We are also planning a differential wavefront sensor based on a quadrant photodiode as a first generation sensor.

We have developed a novel instrument called the Exoplanet Tracker (ET) that can measure precise differential radial velocities, as well as barycentric radial velocities. ET is installed at the Kitt Peak 2.1 meter telescope and uses a Michelson interferometer in series with a medium resolution spectrograph. This instrument allows stellar radial velocities to be measured precisely without the use of a high resolution spectrograph. This allows the instrument to be very efficient in collecting light from the telescope. ET can achieve a radial velocity precision of 5-10 m s^-1 over a 10 day observing run. A survey for extrasolar planets using the ET instrument has led to the detection of radial velocity variability for the star HD102195. Using photometry, CaII HK measurements, and precision radial velocities we demonstrate that these radial velocity variations are caused by a giant planet in a 4.11 day orbit around HD102195. A prototype monolithic interferometer has also been built for the ET instrument and is capable of delivering precise radial velocities. A large multi-object radial velocity instrument based on the ET instrument has been built and installed at the wide field Sloan 2.5 m telescope. This instrument, called the W. M. Keck Exoplanet Tracker, is capable of obtaining precise radial velocities for 59 stars simultaneously. Over the next few years this multi-object instrument will be used to conduct an All Sky ExoPlanet Survey capable of efficiently searching thousands of stars for planets.

Long QT syndrome (LQTS) is a cardiac disorder with an abnormality of cardiac rhythm associated with sudden death, especially in younger, apparently healthy individuals. If no clear cause of death is detectable during a comprehensive coroner's inquest (autopsy-negative cases), LQTS and other heritable arrhythmia syndromes must be considered. Molecular genetic screening for mutations in the associated genes can help to establish the cause of death and to protect affected family members. Genetic testing for LQTS, currently performed mainly by sequencing, is still very expensive and time consuming. With this study we present a rapid and affordable method for simultaneous screening of some of the most common mutations associated with LQTS, focused on the KCNQ1 and KCNH2 genes. Using SNaPshot minisequencing, a total of 58 mutations were analyzed in four multiplex assays, which were successfully established and optimized. Comparison with samples previously analyzed by direct sequencing showed concordance. Furthermore, autopsy-negative cases were tested, but no mutations were observed in any of the specimens. The presented method is well suited for LQTS mutation screening. Extension to further mutations and population-based investigation of mutation frequencies should be the aim of prospective studies.

The detection limit is one of the most important performance parameters for bioanalytical techniques. Here we present a generic method to estimate the detection limit of biomolecular assays based on a step-by-step analysis of the assay procedure. Enzyme-linked immunosorbent assay (ELISA) is used here as an example; however, much of the information presented in this article may be applied to other types of biomolecular assays and analytical techniques. A clear understanding of what affects the detection limit can help researchers to evaluate different bio-analytical techniques properly, and to design better strategies to optimize and achieve the best analytical performance.
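The standard step-by-step estimate behind such a detection limit is the blank mean plus three standard deviations, converted to concentration through the calibration slope; all numbers below are hypothetical:

```python
import numpy as np

# replicate blank (zero-analyte) absorbance readings, hypothetical values
blanks = np.array([0.052, 0.048, 0.050, 0.055, 0.047, 0.051, 0.049, 0.053])

# signal-domain detection limit: blank mean + 3 standard deviations
lod_signal = blanks.mean() + 3 * blanks.std(ddof=1)

# convert to concentration via an assumed linear calibration y = slope*x + intercept
slope = 0.020                 # absorbance per ng/mL (hypothetical)
intercept = blanks.mean()     # blank absorbance as the intercept
lod_conc = (lod_signal - intercept) / slope   # ng/mL
```

Anything that inflates the blank variability (pipetting error, nonspecific binding in an ELISA) directly worsens this figure, which is why the step-by-step analysis the article describes matters.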

We have recorded time-resolved LII signals from a laminar ethylene diffusion flame over a wide range of laser fluences at 532 nm. We have performed these experiments using an injection-seeded Nd:YAG laser with a pulse duration of 7 ns. The beam was spatially filtered and imaged into the flame to provide a homogeneous spatial profile. These data were used to aid in the development of a model, which will be used to test the validity of the LII technique under varying environmental conditions. The new model describes the heating of soot particles during the laser pulse and the subsequent cooling of the particles by radiative emission, sublimation, and conduction. The model additionally includes particle heating by oxidation, accounts for the likelihood of particle annealing, and incorporates a mechanism for nonthermal photodesorption, which is required for good agreement with our experimental results. In order to investigate the fast photodesorption mechanism in more detail, we have recorded LII temporal profiles using a regeneratively amplified Nd:YAG laser with a pulse duration of 70 ps to heat the particles and a streak camera with a temporal resolution of ~65 ps to collect the signal. Preliminary results confirm earlier indications of a fast mechanism leading to signal decay rates of much less than a nanosecond. Parameters to which the model is sensitive include the initial soot temperature, the temperature of the ambient gas, and the partial pressure of oxygen. In order to narrow the model uncertainties, we have developed a source of soot that allows us to determine and control these parameters. Soot produced by a burner is extracted, diluted, and cooled in a flow tube, which is equipped with a Scanning Mobility Particle Sizer (SMPS) for characterization of the aggregates.

In immersive virtual environments (IVEs), users can control their virtual viewpoint by moving their tracked head and walking through the real world. Usually, movements in the real world are mapped one-to-one to virtual camera motions. With redirection techniques, the virtual camera is manipulated by applying gains to user motion so that the virtual world moves differently than the real world. Thus, users can walk through large-scale IVEs while physically remaining in a reasonably small workspace. In psychophysical experiments with a two-alternative forced-choice task, we have quantified how much humans can unknowingly be redirected on physical paths that are different from the visually perceived paths. We tested 12 subjects in three different experiments: (E1) discrimination between virtual and physical rotations, (E2) discrimination between virtual and physical straightforward movements, and (E3) discrimination of path curvature. In experiment E1, subjects performed rotations with different gains, and then had to choose whether the visually perceived rotation was smaller or greater than the physical rotation. In experiment E2, subjects chose whether the physical walk was shorter or longer than the visually perceived scaled travel distance. In experiment E3, subjects estimated the path curvature when walking a curved path in the real world while the visual display showed a straight path in the virtual world. Our results show that users can be turned physically about 49 percent more or 20 percent less than the perceived virtual rotation, distances can be downscaled by 14 percent and upscaled by 26 percent, and users can be redirected on a circular arc with a radius greater than 22 m while they believe that they are walking straight.
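The gain manipulation described above amounts to scaling tracked motion before it drives the virtual camera; the sketch below applies rotation and curvature gains using threshold values approximated from the reported results (rotation gains of roughly 0.67 and 1.25, a 22 m curvature radius), which are illustrative rather than exact:

```python
import math

def apply_rotation_gain(physical_yaw_deg, gain):
    """Virtual yaw produced when a rotation gain scales tracked head rotation."""
    return gain * physical_yaw_deg

def curvature_radius(curvature_gain):
    """Radius of the physical arc implied by a curvature gain (rad per meter)."""
    return math.inf if curvature_gain == 0 else 1.0 / curvature_gain

# rotation-gain detection thresholds approximated from the reported results:
# physical rotation ~49% more (gain 1/1.49 ~= 0.67) or ~20% less (gain 1.25)
physical_turn = 90.0
virtual_low = apply_rotation_gain(physical_turn, 0.67)   # user turns more than seen
virtual_high = apply_rotation_gain(physical_turn, 1.25)  # user turns less than seen

# curvature: an arc of radius >= 22 m still feels like a straight walk
radius = curvature_radius(1.0 / 22.0)
```

A redirected-walking controller would keep its applied gains inside this undetectable band while steering the user away from workspace boundaries.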

The possibility of applying the Evpatoria RT-70 planetary radar to space debris research was tested in two trial experiments on targets in geostationary (GEO) and highly elliptical (HEO) orbits in 2001. The RT-70 has a 200 kW continuous-power transmitter at 6 cm wavelength, which was used for the radio location of planets. Therefore only a bistatic radar system can be realized for orbital object measurements. The receiving antennas (Bear Lakes RT-64, Svetloe RT-32, Noto RT-32, Torun RT-32, and Urumqi RT-25) used standard VLBI equipment for recording the echo signals, because they lack specialized radar apparatus. This multi-antenna configuration allows the classic radar data to be complemented with VLBI measurements: radar resolves range and radial velocity, while VLBI provides angle and angular rate. Moreover, VLBI radar may be a tool for 3D measurements: combining a radar map with a VLBI image can yield a "radio holography" picture of the investigated object. Seven GEO objects were detected in the May session and four GEO plus two HEO objects in the December session. Uncontrolled axial rotation with a period of 5-20 s was established for the GEO targets. The first results of processing carried out in Russia and Canada are presented. It is planned to finalize the VLBI radar method and to start regular observations under the international program of optical and radar monitoring of the near-Earth space environment, partially supported by INTAS-01-0669, RFBR-02-02-17568, and RFBR-02-02-3108.

Although the outcome of cancer patients after cytotoxic chemotherapy is related to diverse mechanisms, multidrug resistance (MDR) to chemotherapeutic drugs due to cellular P-glycoprotein (Pgp) or multidrug-resistance associated protein (MRP) is the most important factor in chemotherapy failure. A large number of pharmacologic compounds, including verapamil, quinidine, tamoxifen, cyclosporin A, and quinolone derivatives, have been reported to overcome MDR. Single photon emission computed tomography (SPECT) and positron emission tomography (PET) are available for the detection of Pgp- and MRP-mediated transport. 99mTc-MIBI and other 99mTc radiopharmaceuticals are substrates for Pgp and MRP, and have been used in clinical studies for tumor imaging and to visualize blockade of Pgp-mediated transport after modulation of the Pgp pump. Colchicine, verapamil, and daunorubicin labeled with 11C have been evaluated for the quantification of Pgp-mediated transport with PET in vivo and reported to be feasible substrates with which to image Pgp function in tumors. Leukotrienes are specific substrates for MRP, and N-[11C]acetyl-leukotriene E4 provides an opportunity to study MRP function non-invasively in vivo. SPECT and PET pharmaceuticals have been used successfully to evaluate the pharmacologic effects of MDR modulators. Imaging of MDR and its reversal with bioluminescence in living animals is also being evaluated for future clinical trials. We describe recent advances in molecular imaging of MDR and review recent publications regarding the feasibility of SPECT and PET imaging to study the functionality of MDR transporters in vivo.

Bifidobacteria are one of the most important bacterial groups found in the gastrointestinal tract of humans. Medical and food industry researchers have focused on bifidobacteria because of their health-promoting properties. Researchers have historically relied on classic phenotypic approaches (culture and biochemical tests) for detection and identification of bifidobacteria. Those approaches still have value for the identification and detection of some bifidobacterial species, but they are often labor-intensive and time-consuming and can be problematic in differentiating closely related species. Rapid, accurate, and reliable methods for detection, identification, and characterization of bifidobacteria in a mixed bacterial population have become a major challenge. The advent of nucleic acid-based molecular techniques has significantly advanced isolation and detection of bifidobacteria. Diverse nucleic acid-based molecular techniques have been employed, including hybridization, target amplification, and fingerprinting. Certain techniques enable detection, characterization, and identification at the genus, species, and strain levels, whereas others allow typing of species or strains of bifidobacteria. In this review, an overview of the methodological principle, technical complexity, and application of various nucleic acid-based molecular techniques for detection, identification, and characterization of bifidobacteria is presented. Advantages and limitations of each technique are discussed, and significant findings based on particular techniques are also highlighted.

Detecting sharp illumination variations such as shadow boundaries is an important problem for radiosity methods. Such illumination variations are captured using a nonuniform mesh that refines the areas exhibiting high illumination gradients. Nonuniform meshing techniques like discontinuity meshing o

National Aeronautics and Space Administration — In this paper we propose ν-Anomica, a novel anomaly detection technique that can be trained on huge data sets with much reduced running time compared to the...

With an upsurge in financial accounting fraud in the current economic scenario, financial accounting fraud detection (FAFD) has become an emerging topic of great importance for academia, research, and industry. The failure of organizations' internal auditing systems to identify accounting fraud has led to the use of specialized procedures to detect financial accounting fraud, collectively known as forensic accounting. Data mining techniques are providing great aid in financial accounting fraud detection, since dealing with the large data volumes and complexities of financial data is a big challenge for forensic accounting. This paper presents a comprehensive review of the literature on the application of data mining techniques for the detection of financial accounting fraud and proposes a framework for data mining-based accounting fraud detection. The systematic and comprehensive literature review of the data mining techniques applicable to financial accounting fraud detection may provide a foundation for future research in this field. The findings of this review show that data mining techniques such as logistic models, neural networks, Bayesian belief networks, and decision trees have been applied most extensively to provide primary solutions to the problems inherent in the detection and classification of fraudulent data.
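One of the models listed (a logistic classifier) can be demonstrated end to end in a few lines; the two financial ratios, cluster means, and labels below are fabricated toy values, not results from the reviewed studies:

```python
import numpy as np

def train_logistic(X, y, lr=0.5, epochs=2000):
    """Plain gradient-descent logistic regression (one of the mined models)."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted fraud probability
        w -= lr * (X.T @ (p - y)) / len(y)
        b -= lr * np.mean(p - y)
    return w, b

# toy ratios (hypothetical features): receivables/sales and debt/equity,
# where fraudulent statements tend to show inflated values
rng = np.random.default_rng(1)
honest = rng.normal([0.3, 0.8], 0.05, size=(50, 2))
fraud = rng.normal([0.6, 1.4], 0.05, size=(50, 2))
X = np.vstack([honest, fraud])
y = np.r_[np.zeros(50), np.ones(50)]

w, b = train_logistic(X, y)
pred = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(float)
accuracy = float(np.mean(pred == y))
```

Real FAFD data is far noisier and heavily class-imbalanced, which is why the review compares several model families rather than relying on one.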

In recent times, news from the business world has been dominated by financial statement fraud. A financial statement becomes fraudulent if it incorporates false information introduced intentionally by management. This paper applies data mining techniques such as CART, the naïve Bayesian classifier, and genetic programming to identify companies that issue fraudulent financial statements. Each of these techniques is applied to a dataset of 114 companies. CART outperforms all other techniques in fraud detection.

We present a new cavity enhanced, continuous wave spectroscopic technique for the detection of weak atomic and molecular transitions. Differential Cavity Mode Spectroscopy (DCMS) measures the difference in absorption between two adjacent cavity longitudinal modes to yield a highly sensitive, yet relatively simple, cavity enhanced spectroscopic technique. In addition this relative absorption measurement is, to first order, independent of both laser frequency noise and cavity acoustic noise. Here we present both a theoretical description of this new technique and an initial experimental demonstration.

In light of the emergence of statistical analysis techniques based on data mining in education sciences, and the potential they offer to detect non-trivial information in large databases, this paper presents a procedure used to detect factors linked to academic achievement in large-scale assessments. The study is based on a non-experimental,…

Detection of pollutant gases is important in environmental and pollution monitoring and is widely applicable in the mining and petrochemical industries. Fiber optical spectrum absorption (FOSA) at near-IR wavelengths is widely used in gas detection because of its inherent advantages, and it has attracted considerable attention; several types and methods exist within FOSA. The wavelength modulation technique (WMT) is one of them and improves gas detection sensitivity dramatically. This technique can be realized by detecting the intensity of the second-harmonic component signal. Intra-cavity laser spectroscopy (ICLS) is an alternative technique for high-sensitivity absorption measurement. With an absorber placed directly within the laser cavity, a short absorption cell can be transformed into a high-sensitivity system, although the practical sensitivity is clearly less than the theoretical value. The authors have worked in these fields and obtained remarkable progress. With broadband reflectors instead of FBGs as cavity mirrors, together with the wavelength sweep technique (WST), several absorption spectra of the detected gas can be collected, and the detection sensitivity can be enhanced sharply by averaging the results of each spectrum, with an acetylene sensitivity below 100 ppm. When ICLS is combined with WST and WMT, the detection sensitivity for acetylene can be enhanced further, to below 75 ppm. By using FBGs as wavelength references, the absorption wavelength of the detected gas is obtained, which can be used to realize gas recognition. The system can be integrated into a fiber intelligent sensing network.

CHEF electrophoresis was used as a technique to detect radiation-induced DNA breakage, with special emphasis on biologically relevant X-ray doses (0-10 Gy). Fluorescence detection of DNA fragments using a sensitive image analysis system was directly compared with conventional scintillation counting of

Staining 2,205 macroscopically negative blood cultures with acridine orange after 6 to 17 h of inoculation and incubation was as sensitive as an early subculture in detecting positive blood cultures. Of the 179 positive blood cultures, 30 (16.8%) were detected by acridine orange alone, 19 (10.6%) were detected by early subculture alone, 84 (46.9%) were detected by both techniques, and 46 (25.7%) were not detected by either method. The latter group includes cultures that became positive after ...

Detection of Earth surface changes is essential for monitoring regional climate, snow avalanche hazard analysis, and energy balance studies related to air temperature irregularities. Geographic Information Systems (GIS) enable such research activities to be carried out through change detection analysis. From this viewpoint, different change detection algorithms have been developed for land-use land-cover (LULC) regions. Among them, change vector analysis (CVA) has a proven capability of extracting maximum information, in terms of the overall magnitude of change and the direction of change, between multispectral bands from multi-temporal satellite data sets. Over the past two to three decades, many effective CVA-based change detection techniques, e.g., improved change vector analysis (ICVA), modified change vector analysis (MCVA), and change vector analysis in posterior-probability space (CVAPS), have been developed to overcome difficulties that exist in traditional CVA. Moreover, many integrated techniques have been proposed as effective LULC change detection tools, such as cross correlogram spectral matching (CCSM) based CVA, CVA using enhanced principal component analysis (PCA) and the inverse triangular (IT) function, hyper-spherical direction cosine (HSDC), and median CVA (m-CVA). This paper comprises a comparative analysis of CVA-based change detection techniques such as CVA, MCVA, ICVA, and CVAPS. It also summarizes the relevant integrated CVA techniques along with their characteristics, features, and shortcomings. Based on experimental outcomes, the CVAPS technique was evaluated as having greater potential than other CVA techniques to evaluate the overall transformed information over three different MODerate resolution Imaging Spectroradiometer (MODIS) satellite data sets of different regions. Results of this study are expected to be potentially useful for more accurate analysis of LULC changes which will, in turn
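The core CVA quantities, per-pixel change magnitude and direction, can be sketched directly; the two-band images and the threshold below are toy values, not MODIS data:

```python
import numpy as np

def change_vector_analysis(t1, t2):
    """Per-pixel change magnitude and direction between two band stacks.

    t1, t2: arrays of shape (bands, rows, cols) from two acquisition dates.
    """
    diff = t2.astype(float) - t1.astype(float)
    magnitude = np.sqrt(np.sum(diff ** 2, axis=0))
    # direction: angle of the change vector in the first two bands
    direction = np.arctan2(diff[1], diff[0])
    return magnitude, direction

# two hypothetical 2-band images; one quadrant changes between dates
t1 = np.zeros((2, 4, 4))
t2 = np.zeros((2, 4, 4))
t2[0, :2, :2] = 3.0   # band-1 increase
t2[1, :2, :2] = 4.0   # band-2 increase

mag, ang = change_vector_analysis(t1, t2)
changed = mag > 1.0   # a simple magnitude threshold flags changed pixels
```

The variants the paper compares (MCVA, ICVA, CVAPS) differ mainly in how they transform the bands before this step and how they choose the threshold.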

The formalin-Tween sedimentation method was compared with formalin-ether sedimentation for parasite detection. Of a total of 297 fecal specimens examined, 72.1% were positive. The formalin-Tween technique was effective for detecting helminths, particularly Ascaris lumbricoides, Trichuris trichiura, and hookworm eggs; however, it was less capable of detecting protozoa. The method is simple, inexpensive, less time-consuming, and highly sensitive in detecting parasitic infection, pa...

Active thermography techniques have the capability of inspecting a broad area simultaneously. By evaluating the phase difference between a defective area and the healthy area, the technique indicates the qualitative location and size of the defect. Previous work developed defect detection methods using a variety of materials and test specimens. In this study, the proposed lock-in technique is verified on artificial specimens that have subsurface defects of different sizes and depths. Finally, the defect detection capability was evaluated by comparing the phase image and the amplitude image according to the size and depth of the defects.
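Lock-in evaluation is commonly implemented with a four-point formula; the sketch below assumes four thermal samples per modulation period (a standard textbook evaluation, not necessarily the exact implementation used in this study):

```python
import math

def lock_in(samples):
    """Four-point lock-in evaluation for one pixel.

    samples: four thermal readings S0..S3 taken at quarter-period
    intervals of the modulated heating.
    """
    s0, s1, s2, s3 = samples
    phase = math.atan2(s1 - s3, s0 - s2)          # radians
    amplitude = 0.5 * math.hypot(s1 - s3, s0 - s2)
    return phase, amplitude

# A defective pixel lags a healthy reference pixel in phase.
phase_ok, _ = lock_in([1.0, 0.0, -1.0, 0.0])      # in-phase response
phase_bad, _ = lock_in([0.0, 1.0, 0.0, -1.0])     # quarter-period lag
phase_contrast = phase_bad - phase_ok
```

A defect map is then obtained by thresholding `phase_contrast` across the image, since phase is largely insensitive to uneven surface heating.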

The nuclear quadrupole resonance (NQR) method has been applied to Heroin Base (HB) to find an optimised multi-pulse technique for its effective detection. Experimental results of applying the proposed spin-locking multi-pulse (SLMP) technique to nitrogen-14 NQR in this sample are presented and convincingly demonstrate a path towards efficient detection. Detection using a sequence of this character could be achieved over real-world scan volumes for the screening of goods. All experiments were carried out at room temperature.

The study describes basic network attacks such as denial of service, the modus operandi of attackers using attacks of this type, and techniques for detecting DoS/DDoS/DRDoS attacks in queueing networks. For the detection of DDoS attacks, a method is proposed for estimating the probability that an arbitrary request is lost while passing through the queueing network. The architecture of a DDoS-attack detection system was developed and a program implementation constructed. The proposed technique yields an adequate estimate of the request-loss frequency when the queueing network is in the stationary mode.
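The abstract does not give its loss-probability formula; one standard way to estimate the probability that an arriving request is lost in a stationary queueing system is the M/M/1/K blocking probability, sketched here with illustrative arrival and service rates:

```python
def mm1k_loss(lam, mu, k):
    """Probability that an arriving request is lost in an M/M/1/K queue.

    lam: arrival rate, mu: service rate, k: buffer capacity.
    """
    rho = lam / mu
    if rho == 1.0:
        return 1.0 / (k + 1)
    return (1.0 - rho) * rho ** k / (1.0 - rho ** (k + 1))

# Loss grows sharply with offered load: a surge in arrivals (a DDoS
# symptom) pushes the loss probability far above the stationary baseline.
baseline = mm1k_loss(lam=5.0, mu=10.0, k=10)
attack = mm1k_loss(lam=20.0, mu=10.0, k=10)
```

Monitoring such a loss estimate against its stationary baseline is one way a detector of this kind can flag an attack in progress.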

Introduction: Although proximal dental caries are very common, clinical examinations cannot detect them all. Panoramic radiography has been widely used in dentistry for both diagnosis and screening. This study aimed to investigate and compare the efficacy of two digital panoramic radiography techniques in the diagnosis of proximal caries. Methods: A total of 60 patients referred to a dental radiology center, all of whom had complete dentition and bitewing radiographs, were included. The patients were randomly divided into two groups of 30. CR images were obtained for the first group and DR images for the second. Images were obtained from the distal of the third tooth to the distal of the eighth. Bitewing images were compared with CR and DR images regarding the detection of caries. The kappa index and chi-squared statistics were employed to analyze the results. Results: There was a high agreement rate between bitewing images and both CR (kappa = 0.775) and DR (kappa = 0.762) images in detecting caries. No significant difference was shown between the CR and DR techniques in the detection of caries (0.543). However, DR and CR images are not efficient enough to be prescribed as the sole imaging technique for detecting proximal caries. Conclusion: DR and CR techniques could be good imaging techniques for the detection of dental caries as a companion to clinical examinations.
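The kappa agreement index used in this kind of comparison can be computed from paired readings; a minimal sketch with made-up ratings for ten surfaces (not the study's data):

```python
def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(ratings_a)
    cats = set(ratings_a) | set(ratings_b)
    p_observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement: product of each rater's marginal category rates.
    p_chance = sum(
        (ratings_a.count(c) / n) * (ratings_b.count(c) / n) for c in cats
    )
    return (p_observed - p_chance) / (1.0 - p_chance)

# Two readings of ten tooth surfaces: 1 = caries present, 0 = absent.
bitewing = [1, 1, 1, 0, 0, 0, 0, 0, 1, 1]
panoramic = [1, 1, 0, 0, 0, 0, 0, 0, 1, 1]
kappa = cohens_kappa(bitewing, panoramic)
```

Values near 0.775 (as reported for CR above) indicate substantial agreement beyond chance.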

Based on the transmitting theory of the 'smoke ring effect', the transient electromagnetic technique was used in coal mines to detect abnormal areas of aquiferous structures in both roofs and floors of coal seams and in front of excavated roadways. The TerraTEM transient electromagnetic equipment, newly developed in Australia, was used. Survey devices, working methods and techniques, as well as data processing and interpretation, are discussed systematically. In addition, the direction of the mini-wireframe-emitted electromagnetic wave of the full-space transient electromagnetic technique was verified by an underground borehole for water detection and drainage. The results indicate that this technique can detect both the horizontal and vertical development of abnormal water bodies to a certain depth below the floor of coal seams and can also detect abnormal, low-resistance water bodies within a certain distance of roofs. Furthermore, it can detect such abnormal bodies ahead of the excavated roadway front. Limited by the underground environment, the full-space transient electromagnetic technique can detect to a depth of only about 120 m. 7 refs., 7 figs.

Detecting land-use or land-cover changes is a challenging problem in image analysis. Change detection plays a fundamental role in most land-use or land-cover monitoring systems using remote-sensing techniques. The reliability of individual automatic change-detection algorithms is currently below operating requirements, considering the intrinsic uncertainty of a change-detection algorithm and the complexity of detecting changes in remote-sensing images. In particular, most of these algorithms are suited only to a specific image data source, study area, and research purpose. Only a few comprehensive change-detection methods that consider the reliability of the algorithm in different implementation situations have been reported. This study explores the advantages of combining several typical change-detection algorithms, a combination specifically designed for highly reliable change-detection tasks. Specifically, a fusion approach based on reliability is proposed for exclusive land-use or land-cover change detection. First, the reliability of each candidate algorithm is evaluated. Then, a fuzzy comprehensive evaluation is used to generate a reliable change-detection approach; this evaluation is a transformation between a one-way evaluation matrix and a weight vector computed using the reliability of each candidate algorithm. Experimental results reveal that the advantages of combining these distinct change-detection techniques are evident.
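One simple way to fuse candidate change maps by reliability, in the spirit of the weighted evaluation described above (the voting scheme and numbers here are illustrative, not the paper's exact fuzzy comprehensive evaluation):

```python
def fuse_change_maps(maps, reliabilities, threshold=0.5):
    """Fuse per-pixel binary change maps by reliability-weighted voting.

    maps: list of equal-length 0/1 change-flag lists, one per algorithm.
    reliabilities: one weight per algorithm, normalised here.
    """
    total = sum(reliabilities)
    weights = [r / total for r in reliabilities]
    fused = []
    for pixel in zip(*maps):
        score = sum(w * flag for w, flag in zip(weights, pixel))
        fused.append(1 if score > threshold else 0)
    return fused

# Three algorithms disagree on five pixels; the most reliable dominates.
maps = [[1, 1, 0, 0, 1],
        [1, 0, 0, 1, 1],
        [1, 0, 1, 0, 0]]
fused = fuse_change_maps(maps, reliabilities=[0.9, 0.7, 0.4])
```

A pixel is declared changed only when the reliability-weighted vote clears the threshold, so a weak algorithm cannot flag change on its own.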

Mass spectrometry-based proteomic approaches have been used to develop methodologies capable of detecting the abuse of protein therapeutics such as recombinant human erythropoietin and recombinant human growth hormone. Existing detection methods use antibody-based approaches that, although effective, suffer from long assay development times and specificity issues. The application of liquid chromatography with tandem mass spectrometry and selected reaction-monitoring-based analysis has demonstrated the ability to detect and quantify existing protein therapeutics in plasma. Furthermore, the multiplexing capability of selected reaction-monitoring analysis has also aided in the detection of multiple downstream biomarkers in a single analysis, requiring less sample than existing immunological techniques. The flexibility of mass spectrometric instrumentation has shown that the technique is capable of detecting the abuse of novel and existing protein therapeutics, and has a vital role in the fight to keep sports drug-free.

Quantitative detection of molecules of interest from biological and environmental samples in a rapid manner, particularly within a relevant concentration range, is imperative to the timely assessment of human diseases and environmental issues. In this work, we employed the microwave-accelerated bioassay (MAB) technique, which is based on the combined use of circular bioassay platforms and microwave heating, for rapid and quantitative detection of glial fibrillary acidic protein (GFAP) and Shiga-like toxin 1 (STX 1). The proof-of-principle use of the MAB technique with the circular bioassay platforms for the rapid detection of GFAP in buffer, based on colorimetric and fluorescence readouts, was demonstrated with a 900 W kitchen microwave. We also employed the MAB technique with a new microwave system (called the iCrystal system) for the detection of GFAP from mice with brain injuries and of STX 1 from a city water stream. Control bioassays included the commercially available gold-standard bioassay kits run at room temperature. Our results show that the lower limit of detection (LLOD) of the colorimetric and fluorescence-based bioassays for GFAP was decreased ~1000-fold using the MAB technique and our circular bioassay platforms as compared to the commercially available bioassay kits. The overall bioassay time for GFAP and STX 1 was reduced from 4 h using commercially available bioassay kits to 10 min using the MAB technique.

Human papillomaviruses (HPV), double-stranded DNA viruses, cause many mucocutaneous diseases, benign or malignant, ranging from common warts to malignancies of the upper aerodigestive tract and the anogenital sphere. The diagnosis of HPV infection relies primarily on detection of the viral genome by molecular methods, given the difficulty of routinely cultivating these viruses. The current trend in cervical cancer screening is to improve sensitivity with new methods and to propose new algorithms for diagnostic and therapeutic decisions. The development of liquid-based cytology facilitates cytologic diagnosis and molecular assays from the same sample. There are two main types of HPV detection methods used on uterine cervical samples: signal amplification methods (liquid-phase hybridization techniques) and target amplification methods (gene amplification techniques such as the polymerase chain reaction [PCR]). Genotyping techniques have also been developed; they are based on an amplification step followed by hybridization with type-specific probes. In addition to detection, genotyping techniques allow quantitative measurement of viral DNA for each HPV genotype, and thus monitoring of changes in viral load over time. Another approach relies on the detection of messenger RNA (mRNA) of the HPV E6 and E7 oncogenes, which appears to be a relevant marker for identifying and monitoring women at risk of progression to a precancerous lesion or cervical cancer.

Chemotherapy-induced cardiotoxicity has become a significant public health issue. Left ventricular ejection fraction is routinely used to monitor cardiotoxicity but fails to detect subtle alterations in cardiac function. Improvements in the measurement of left ventricular ejection fraction, physical or pharmacologic stressors, and novel cardiac functional indices may be useful in the detection of cardiotoxicity. Improvements in the detection and therapy of cancer have led to the emergence of chemotherapy-induced cardiotoxicity. New echocardiographic techniques may be useful in identifying patients undergoing chemotherapy who could benefit from alternative cancer treatments, thereby decreasing the incidence of cardiotoxicity.

Reactive impinging flow (RIF) is a novel quality-control method for defect detection (i.e., reductions in Pt catalyst loading) in gas-diffusion electrodes (GDEs) on weblines. The technique uses infrared thermography to detect the temperature of a nonflammable (<4% H2) reactive mixture of H2/O2 in N2 impinging and reacting on a Pt catalytic surface. In this paper, GDE defects of different sizes (with catalyst-loading reductions of 25, 50, and 100%) are detected at various webline speeds (3.048 and 9.144 m min-1) and gas flowrates (32.5 or 50 standard L min-1). Furthermore, a model is developed and validated for the technique, and it is subsequently used to optimize operating conditions and explore the applicability of the technique to a range of defects. The model suggests that detection can be improved by reacting more of the impinging H2, which can be accomplished by placing blocking substrates on the top, the bottom, or both sides of the GDE; placing a substrate on both sides results in a factor-of-four increase in the temperature differential, which is needed for smaller-defect detection. Overall, the RIF technique is shown to be a promising route for in-line, high-speed, large-area detection of GDE defects on moving weblines.

Reported for the first time are receiver operating characteristic (ROC) curves constructed to describe the performance of a sorbent-coated disk, planar solid-phase microextraction (PSPME) unit for non-contact sampling of a variety of volatiles. The PSPME unit is coupled to ion mobility spectrometers (IMSs) for the detection of volatile chemical markers associated with the presence of smokeless powders, model systems of explosives containing diphenylamine (DPA), 2,4-dinitrotoluene (2,4-DNT) and nitroglycerin (NG) as the target analytes. The performance of the PSPME-IMS was compared with the widely accepted solid-phase microextraction (SPME) technique coupled to a GC-MS. A set of optimized sampling conditions for different-volume containers (1–45 L) with various sample amounts of explosives was studied in replicates (n = 30) to determine the true positive rates (TPR) and false positive rates (FPR) for the different scenarios. These studies were carried out in order to construct ROC curves for two IMS instruments (a bench-top and a field-portable system) and a bench-top GC-MS system in low- and high-clutter environments. Both static and dynamic PSPME sampling were studied, in which 10–500 mg quantities of smokeless powders were detected within 10 min of static sampling and 1 min of dynamic sampling.

The intrusion detection system (IDS) is an important network security tool for securing computer and network systems, able to detect and monitor network traffic data. Snort IDS is an open-source network security tool that searches for and matches rules against network traffic data in order to detect attacks and generate alerts. However, Snort IDS can detect only known attacks. We have therefore proposed a procedure for improving Snort IDS rules, based on the association-rules data mining technique, for the detection of network probe attacks. We employed the MIT-DARPA 1999 data set for the experimental evaluation. Since the traffic contains both normal and abnormal behavior patterns, the abnormal behavior data are detected by the Snort IDS. The experimental results showed that the proposed Snort IDS rules, based on data mining, detect network probe attacks more efficiently than the original Snort IDS rules, as well as the icmp.rules and icmp-info.rules of Snort IDS. The suitable parameters for the proposed Snort IDS rules are Min_sup set to 10% and Min_conf set to 100%, with eight variable attributes; as these parameters are applied, higher accuracy is achieved.
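Association-rule mining keeps a candidate rule only if its support and confidence clear the Min_sup and Min_conf thresholds; a minimal sketch on toy connection records (the attribute names and data are made up, not drawn from the MIT-DARPA set):

```python
def support(transactions, itemset):
    """Fraction of transactions containing every item of itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(transactions, antecedent, consequent):
    """Support of (antecedent U consequent) over support of antecedent."""
    joint = support(transactions, antecedent | consequent)
    return joint / support(transactions, antecedent)

# Toy connection records: attribute=value items observed per connection.
conns = (
    [{"proto=icmp", "flag=echo", "label=probe"}] * 2
    + [{"proto=tcp", "flag=syn", "label=normal"}] * 8
)
rule_lhs = {"proto=icmp", "flag=echo"}
rule_rhs = {"label=probe"}
sup = support(conns, rule_lhs | rule_rhs)
conf = confidence(conns, rule_lhs, rule_rhs)
keep = sup >= 0.10 and conf >= 1.0   # Min_sup = 10%, Min_conf = 100%
```

Rules that survive both thresholds can then be translated into Snort-style signatures matching the antecedent attributes.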

The aim of this study is to detect outliers using the coefficients of ordinal logistic regression (OLR) for the case of k category responses, where scores range from 1 (best) to 8 (worst). We detect them using the sum of moduli of the ordinal regression coefficients calculated by the jackknife technique. The technique is improved by scaling the regression coefficients to their means. The R language was used on a set of ordinal data from a reference distribution. Furthermore, we compare this approach with studentised residual plots of the jackknife technique for ANOVA (analysis of variance) and OLR. This study shows that the jackknife technique, along with proper scaling, may reveal outliers in ordinal regression reasonably well.
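The jackknife recomputes a statistic with each observation left out in turn; the observation whose removal shifts the estimate most is a natural outlier candidate. A minimal sketch using the mean as the statistic (the paper jackknifes OLR coefficient sums instead, and the data here are made up):

```python
def jackknife(values, stat):
    """Leave-one-out jackknife replicates of statistic `stat`."""
    return [stat(values[:i] + values[i + 1:]) for i in range(len(values))]

mean = lambda xs: sum(xs) / len(xs)

scores = [2, 3, 2, 3, 2, 3, 2, 50]        # one suspiciously bad score
replicates = jackknife(scores, mean)
full = mean(scores)
# The observation whose removal moves the estimate the most is the
# prime outlier candidate.
shifts = [abs(full - r) for r in replicates]
outlier_index = shifts.index(max(shifts))
```

Here dropping the last observation collapses the mean from 8.375 to about 2.43, so it is flagged; the same logic applies with regression coefficients as the statistic.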

Most railway accidents happen at railway crossings. Therefore, detecting humans or objects present in the risk area of a railway crossing, and thus preventing accidents, is an important task. In this paper, three strategies are used to detect the risk area of a railway crossing: (1) a terrain drop compensation (TDC) technique solves the problem of the concavity of railway crossings; (2) a linear regression technique predicts the position and length of an object from image processing; (3) a novel strategy called calculating local maximum Y-coordinate object points (CLMYOP) obtains the ground points of the object. In addition, image preprocessing is applied to filter out noise and improve the object detection. The experimental results demonstrate that our scheme is an effective and accurate method for the detection of railway crossing risk areas.
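The CLMYOP idea, taking for each image column the object pixel with the largest Y coordinate as a ground contact point, can be sketched as follows (toy mask; the convention that row 0 is the top of the image, so larger y is closer to the ground, is an assumption of this sketch):

```python
def clmyop(mask):
    """Local maximum Y-coordinate per column: the object's ground points.

    mask: 2D list of 0/1 object flags, row 0 at the top of the image,
    so a larger y means closer to the ground plane.
    """
    ground = {}
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            if v:
                ground[x] = max(ground.get(x, -1), y)
    return sorted(ground.items())   # [(x, ground_y), ...]

# A small object whose lower edge slopes down to the right.
mask = [
    [0, 1, 1, 0],
    [0, 1, 1, 1],
    [0, 0, 1, 1],
]
points = clmyop(mask)
```

Those per-column ground points are what gets tested against the crossing's risk-area boundary.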

The growth of internet technology has spread a large amount of data communication, and this communication is exposed to network threats and security issues that raise problems of data integrity and loss of data. To address these problems, some 20 years earlier Anderson developed a model of an intrusion detection system. Initially, intrusion detection systems worked by statistical analysis of the frequency of audit system logs. This approach was later improved by various researchers applying other techniques such as data mining, neural networks, and expert systems. Current research trends in intrusion detection use soft-computing approaches such as fuzzy logic, genetic algorithms, and machine learning. This paper discusses data mining and soft-computing methods for the purpose of intrusion detection. The KDDCUP99 dataset is used for the performance evaluation of these techniques.

An iDD temporal analysis technique, used to detect defects (faults) and fabrication variations in both digital and analog ICs by pulsing the power supply rails and analyzing the temporal data obtained from the resulting transient rail currents, is presented. A simple bias voltage on all inputs is required to excite the defects. Data from hardware tests supporting this technique are presented.

Security and distributed infrastructure are two of the most common requirements for big data software, but the security features of big data platforms are still immature. It is critical to identify, modify, test, and execute some of the existing security mechanisms before using them in the big data world. In this paper, we propose a novel intrusion detection technique that understands and works according to the needs of big data systems. Our proposed technique identifies program-level ano...

A technique was developed for the detection of antifungal activity of proteins after discontinuous polyacrylamide gel electrophoresis under native conditions. The antifungal activity is detected as growth inhibition zones in a homogeneous fungal lawn, grown in an agar layer spread on top of the polyacrylamide gel. The position of proteins with antifungal activity can be determined on a diffusion blot prepared from the same gel. The technique is illustrated for three antifungal plant proteins, i.e. alpha-purothionin, Urtica dioica agglutinin, and tobacco chitinase.

Low- and non-metallic landmines are among the most difficult subsurface targets to detect using geophysical techniques. Ground penetrating radar (GPR) performance at different field sites shows great success in detecting metallic landmines; however, significant limitations arise in the case of low- and non-metallic landmines. The electrical resistivity imaging (ERI) technique is tested as an alternative or confirmation technique for detecting metallic and non-metallic landmines in suspicious cleared areas. Electrical resistivity responses were obtained using forward modeling for metallic and non-metallic landmines buried in dry and wet environments, utilizing the common electrode configurations. Roughly all the utilized electrode arrays can locate the buried metallic and plastic mines correctly in dry and wet soil. The accuracy differs from one array to another based on the relative resistivity contrast with the host soil, the subsurface distribution of current and potential lines, and the amplitude of the noise in the data. The ERI technique proved to be a fast and effective tool for detecting non-metallic mines, especially in conductive environments, where the performance of the metal detector (MD) and GPR techniques shows great limitations.

For the first time, a direct-detection BOTDR is demonstrated for distributed dynamic strain sensing, incorporating a double-edge technique, a time-division multiplexing technique, and an upconversion technique. The double edges are realized by using the transmission and reflection curves of an all-fiber Fabry-Perot interferometer (FPI). Benefiting from the low loss of the fiber, the time-division multiplexing technique is performed to realize the double-edge technique using only a single-channel FPI and a single detector. In order to detect the weak spontaneous Brillouin backscattering signal efficiently, a fiber-coupled upconversion detector is adopted to upconvert the backscattering signal at 1548.1 nm to 863 nm, which is finally detected by a Si-APD. In the experiment, dynamic strain disturbance of up to 1.9 mε over 1.5 km of polarization-maintaining fiber is detected at a sampling rate of 30 Hz. An accuracy of 30 με and a spatial resolution of 0.6 m are realized.

This paper presents a method to detect air leakage in an air-conditioning compressor using image processing techniques. A quality air-conditioning compressor should have no air leakage. To test a compressor for leaks, air is pumped into it and the compressor is then submerged in a water tank. If air bubbles appear at the surface of the compressor, the leaking compressor is returned for maintenance. In this work, a new method is proposed to detect leakage and locate the leakage point quickly, accurately, and precisely. In a preprocessing procedure to detect the air bubbles, threshold and median-filter techniques are used. Connected-component labeling is used to detect the air bubbles, while blob analysis tracks groups of air bubbles across sequential images. Experiments tested the proposed algorithm's ability to determine the leakage point of an air-conditioning compressor, with the location of the leakage point presented as a coordinate point. The results demonstrate that the leakage point could be accurately detected during the process, with an estimation error of less than 5% relative to the real leakage point.
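The threshold and connected-component-labeling steps can be sketched as follows (toy frame and an illustrative threshold; the median filter and the blob tracking across sequential frames are omitted):

```python
def threshold(image, t):
    """Binarize a grayscale frame: 1 where brighter than t."""
    return [[1 if v > t else 0 for v in row] for row in image]

def connected_components(mask):
    """4-connected labeling via flood fill; returns blobs as pixel lists."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    blobs = []
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                stack, blob = [(y, x)], []
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    blob.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                blobs.append(blob)
    return blobs

# Synthetic frame: bright pixels are bubble candidates.
frame = [
    [0, 0, 0, 0, 0],
    [0, 9, 9, 0, 0],
    [0, 9, 9, 0, 0],
    [0, 0, 0, 0, 7],
]
blobs = connected_components(threshold(frame, 5))
largest = max(blobs, key=len)
# Report the leakage point as the centroid of the largest bubble blob.
cy = sum(p[0] for p in largest) / len(largest)
cx = sum(p[1] for p in largest) / len(largest)
```

In the full pipeline, persistent blobs rising from the same coordinate across sequential frames pinpoint the leak.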

Pulsed voltammetry has been used to detect and quantify glyphosate in buffered water in the presence of ammonium nitrate and humic substances. Glyphosate is the most widely used herbicide active ingredient in the world. It is a non-selective, broad-spectrum herbicide, but some of its health and environmental effects are still being discussed. Nowadays, glyphosate pollution in water is being monitored, but quantification techniques are slow and expensive. Glyphosate wastes are often detected in count...

Mobile traffic data has recently been used to characterize the urban environment in terms of urban fabric profiles. While showing promising results, the existing urban fabric detection solutions are built without a clear understanding of the detection process chain. In this paper, we distinguish and analyze the different steps common to all urban profiling techniques. By evaluating the impact of each step of the process, we are able to propose a new solution that outpe...

Biomarkers are an essential part of daily medical practice. Currently, biomarkers are used for both diagnostic and prognostic purposes. There are many approaches, e.g. ELISA, by which biomarker levels are detected in patient samples. However, all these approaches are laborious, time-consuming, and expensive. There is therefore a general need to explore new techniques which can overcome these drawbacks. Here, we present a preliminary study for detection of serum biomarkers by fluo...

[Extraction fragments from a target-detection report: a simulated scenario demonstrates the detection algorithm for multiple targets (target acceleration 5g, SCR = 0 dB); the proposed technique's algorithm is easier to implement, making the processing faster; the work also includes detailed detection algorithms for single and multiple targets, and kurtosis of a chirp signal for varying accelerations, including a target with an acceleration of 5g.]

Sensitivity of field tests (AgriStrip and Immunochromato), DAS-ELISA, two-step RT-PCR, and real-time RT-PCR for Plum pox virus (PPV) detection was tested in various tissues of apricot, peach, plum, and damson plum trees infected with isolates belonging to PPV-D, PPV-M, or PPV-Rec, the three strains present in Slovenia. Flowers of apricot and plum in full bloom proved to be a very good source for detection of PPV. PPV could be detected with all tested techniques in symptomatic parts of leaves in May and, with one exception, even at the beginning of August, but it was not detected in asymptomatic leaves using field tests, DAS-ELISA, and in part also molecular techniques. PPV was detected in only some of the samples of asymptomatic parts of symptomatic leaves and of stalks by field tests and DAS-ELISA. Infections were not detected in buds in August using field tests or DAS-ELISA. Field tests are useful for confirming PPV infection in symptomatic leaves, but for tissues without symptoms DAS-ELISA should be combined with or replaced by molecular techniques.

Pure cultures, including Desulfovibrio vulgaris and Methanococcus maripaludis, were combined with model oil samples and oil/diesel mixtures to optimize techniques for extracting signature lipids from oil, in support of investigations of microbial communities in oil deposit samples targeted for microbially enhanced hydrocarbon recovery. Several techniques were evaluated, including standard phospholipid extraction, ether-linked lipid extraction for detection of Archaea, and high-pressure extraction techniques. Recovery of lipids ranged from 50–80% compared with extraction of the pure culture. Extraction efficiency was evaluated by the use of internal standards. Field samples will also be tested for recovery of signature lipids with the optimized extraction techniques.

In this paper, we describe a novel technique based on the flipped-exponential (FE) Nyquist pulse method for reducing the peak-to-average power ratio (PAPR) in an optical direct-detection orthogonal frequency-division multiplexing (DD-OFDM) system. The technique involves proper selection of the FE Nyquist pulses for shaping the different subcarriers of the OFDM signal. We apply this technique to a DD-OFDM transmission system to significantly reduce PAPR. We also investigate the sensitivity of the received OFDM signal with strong nonlinearity in a standard single-mode fiber (SMF).
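PAPR itself is the ratio of peak to average power of the time-domain symbol. A minimal sketch using an explicit inverse DFT (the subcarrier vectors are illustrative: the second is a constant-amplitude Zadoff-Chu-style chirp used here only to show phase shaping lowering PAPR, not the paper's FE Nyquist pulses):

```python
import cmath
import math

def papr_db(freq_symbols):
    """PAPR (dB) of the time-domain symbol given by an inverse DFT."""
    n = len(freq_symbols)
    time_signal = [
        sum(freq_symbols[k] * cmath.exp(2j * math.pi * k * t / n)
            for k in range(n)) / n
        for t in range(n)
    ]
    powers = [abs(s) ** 2 for s in time_signal]
    return 10 * math.log10(max(powers) / (sum(powers) / n))

# Worst case: all subcarriers aligned in phase add coherently.
worst = papr_db([1] * 8)
# Chirped phases spread the energy evenly in time, lowering the PAPR.
shaped = papr_db([cmath.exp(-1j * math.pi * k * k / 8) for k in range(8)])
```

With 8 aligned subcarriers the PAPR is 10·log10(8) ≈ 9.03 dB, while the chirped allocation is nearly flat in time; subcarrier shaping schemes such as the FE Nyquist method aim at the same effect.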

Wireless sensor networks (WSNs) consist of sensor nodes. Deployed in open areas and characterized by constrained resources, WSNs suffer from several attacks, intrusions, and security vulnerabilities. The intrusion detection system (IDS) is one of the essential security mechanisms against attacks in WSNs. In this paper we present a comparative evaluation of the best-performing detection techniques in IDSs for WSNs; the approaches are analyzed and compared technically, followed by a brief discussion. Attacks in WSNs are also presented and classified according to several criteria. To implement and measure the performance of the detection techniques, we prepared our dataset, based on KDD'99, in five steps: after normalizing the dataset, we identified the normal class and four types of attacks, and used the most relevant attributes for the classification process. We propose applying CfsSubsetEval with the BestFirst approach as an attribute-selection algorithm for removing redundant attributes. The experimental results show that random forest methods provide a high detection rate and reduce the false alarm rate. Finally, a set of principles is concluded which future research on implementing IDSs in WSNs should satisfy. To help researchers in the selection of an IDS for WSNs, several recommendations are provided along with future directions for this research.
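The two headline metrics, detection rate and false alarm rate, follow directly from the confusion counts; a minimal sketch on made-up labels (not the KDD'99 data):

```python
def ids_metrics(y_true, y_pred, attack=1):
    """Detection rate (recall on attacks) and false alarm rate."""
    tp = sum(t == attack and p == attack for t, p in zip(y_true, y_pred))
    fn = sum(t == attack and p != attack for t, p in zip(y_true, y_pred))
    fp = sum(t != attack and p == attack for t, p in zip(y_true, y_pred))
    tn = sum(t != attack and p != attack for t, p in zip(y_true, y_pred))
    # Detection rate: attacks caught; false alarm rate: normals flagged.
    return tp / (tp + fn), fp / (fp + tn)

# Ten connections: 1 = attack, 0 = normal.
truth = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
preds = [1, 1, 1, 0, 0, 1, 0, 0, 0, 0]
detection_rate, false_alarm_rate = ids_metrics(truth, preds)
```

A good WSN IDS pushes the first number toward 1 and the second toward 0; reporting both is essential, since either alone can be gamed.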

Fault detection and diagnosis plays a pivotal role in health-monitoring techniques for liquid-propellant rocket engines. This paper first gives a brief summary of the fault detection and diagnosis techniques used in liquid-propellant rocket engines. Then, the fault detection and diagnosis algorithms studied and developed for the Long March Main Engine System (LMME) are introduced. For fault detection, an analytical model-based detection algorithm, a time-series-analysis algorithm and a startup-transient detection algorithm based on nonlinear identification, all developed and evaluated using ground-test data from the LMME, are given. For fault diagnosis, neural-network approaches, methods based on nonlinear static models, and knowledge-based intelligent approaches are presented. Keywords: fault detection; fault diagnosis; health monitoring; neural networks; fuzzy logic; expert system; Long March main engines.

Fingerprint evidence offers great value to criminal investigations since it is an internationally recognized and established means of human identification. With recent advances in modern technology, scientists have started analyzing not only the ridge patterns of fingerprints but also substances found within them. The aim of this work was to determine whether Fourier transform infrared (FTIR) spectromicroscopy could be used to detect contamination in a fingerprint which was dusted with powder (a technique already recognized as an effective and reliable method for developing latent fingerprints) and subsequently lifted off with adhesive tape. Explosive materials (pentaerythritol tetranitrate, C-4, TNT) and noncontrolled substances (sugar, aspirin) were used to prepare contaminated fingerprints on various substrates. Freshly deposited fingermarks, dusted with powders and lifted off with adhesive tape (provided by the Singapore Police Force), were analyzed using a Bruker Hyperion 2000 microscope at the ISMI beamline (Singapore Synchrotron Light Source) with an attenuated total reflection objective. FTIR spectroscopy is a nondestructive technique which requires almost no sample preparation. Further, the fingerprint under analysis remains in pristine condition, allowing subsequent analysis if necessary. All analyzed substances were successfully distinguished by their FTIR spectra in powdered and lifted fingerprints. This method has the potential to significantly impact forensic science by greatly enhancing the information that can be obtained from the study of fingerprints.

Information on weed distribution within the field is necessary to implement spatially variable herbicide application. This paper deals with the development of near-ground image capture and processing techniques in order to detect broad-leaved weeds in cereal crops under actual field conditions. T...

The research described in this thesis concerns the development and application in food microbiology of molecular identification and detection techniques based on 16S rRNA sequences. The technologies developed were applied to study the microbial ecology of two groups of bacteria, namely

The development of techniques for health monitoring of the rotating components in gas turbine engines is of major interest to NASA's Aviation Safety Program. As part of this ongoing effort, several experiments utilizing a novel optical Moiré-based concept, along with external blade tip clearance and shaft displacement instrumentation, were conducted on a simulated turbine engine disk as a means of demonstrating a potential optical crack detection technique. A Moiré pattern results from the overlap of two repetitive patterns with slightly different periods. With this technique, it is possible to detect very small differences in spacing and hence radial growth in a rotating disk due to a flaw such as a crack. The experiment involved etching a circular reference pattern on a subscale engine disk that had a 50.8 mm (2 in.) long notch machined into it to simulate a crack. The disk was operated at speeds up to 12 000 rpm and the shift of the Moiré pattern with respect to the reference pattern was monitored as a means of detecting the radial growth of the disk due to the defect. In addition, blade displacement data were acquired using external blade tip clearance and shaft displacement sensors to confirm the data obtained from the optical technique. The results of the crack detection experiments and the associated analysis are presented in this paper.
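The sensitivity of the Moiré approach comes from the beat between the two overlapped patterns: gratings with slightly different periods p1 and p2 produce fringes with a much coarser beat period p1·p2/|p2 − p1|, so a tiny change in spacing appears as a large fringe shift. A back-of-the-envelope sketch (the numbers are illustrative, not from the experiment):

```python
def moire_beat_period(p1, p2):
    """Beat (fringe) period of two overlapped gratings with periods p1 and p2."""
    return p1 * p2 / abs(p2 - p1)

# a 1% difference in grating period yields fringes ~100x coarser than the
# gratings themselves, which is what makes small radial growth visible
print(moire_beat_period(1.00, 1.01))  # ~101 (same units as the inputs)
```

This magnification is why sub-period radial growth of the disk, far below the resolution of direct imaging, still produces a measurable pattern shift.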

This paper investigates the use of scalable machine learning techniques to address the anomaly detection problem in a large Wi-Fi network, in an effort to achieve a highly scalable preemptive monitoring tool for wireless networks...

In this paper, we present a novel technique for automatic and efficient intrusion detection based on learning program behaviors. Program behavior is captured in terms of issued system calls augmented with point-of-system-call information, and is modeled according to an efficient deterministic

We report the detection of a faint near-Earth asteroid (NEA) using our synthetic tracking technique and the CHIMERA instrument on the Palomar 200-inch telescope. With an apparent magnitude of 23 (H = 29, assuming detection at 20 lunar distances), the asteroid was moving at 6.32° per day and was detected at a signal-to-noise ratio (S/N) of 15 using 30 s of data taken at a 16.7 Hz frame rate. The detection was confirmed by a second observation 77 minutes later at the same S/N. Because of its high proper motion, the NEA moved 7 arcsec over the 30 s of observation. Synthetic tracking avoided the image degradation due to trailing loss that affects conventional techniques relying on 30 s exposures; the trailing loss would have degraded the surface brightness of the NEA image on the CCD down to an approximate magnitude of 25, making the object undetectable. This detection was the result of a 12 hr blind search conducted on the Palomar 200-inch telescope over two nights, scanning twice over six 5.3° × 0.046° fields. Detecting only one asteroid is consistent with Harris's estimates for the distribution of the asteroid population, which predict the detection of 1.2 NEAs in the H-magnitude range 28-31 for the two nights. The experimental design, data analysis methods, and algorithms are presented. We also demonstrate milliarcsecond-level astrometry using observations of two known bright asteroids on the same system with synthetic tracking. We conclude by discussing strategies for scheduling observations to detect and characterize small and fast-moving NEAs using the new technique.
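The gain from synthetic tracking can be illustrated with the standard shift-and-add scaling: stacking N short frames along the asteroid's assumed velocity adds signal linearly while noise adds in quadrature, so the combined S/N grows as √N. A rough sketch (the per-frame S/N below is an assumed value, not reported in the abstract):

```python
import math

def stacked_snr(per_frame_snr, n_frames):
    # coherent shift-and-add: signal adds linearly, noise in quadrature,
    # so combined S/N scales as sqrt(n_frames)
    return per_frame_snr * math.sqrt(n_frames)

n_frames = round(30 * 16.7)          # ~501 frames in 30 s at 16.7 Hz
print(stacked_snr(0.67, n_frames))   # a per-frame S/N below 1 stacks to ~15
```

This is why a 16.7 Hz frame rate can reach S/N = 15 on an object that would be invisible in each individual frame, while a single 30 s exposure loses the same flux to trailing.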

The objective of the research in this dissertation is to develop a software system to automatically detect and characterize solar flares, filaments and coronal mass ejections (CMEs), the core of so-called solar activity. These tools will assist us in predicting space weather caused by violent solar activity. Image processing and pattern recognition techniques are applied in this system. For automatic flare detection, advanced pattern recognition techniques such as the Multi-Layer Perceptron (MLP), Radial Basis Function (RBF), and Support Vector Machine (SVM) are used. By tracking the entire process of flares, the motion properties of two-ribbon flares are derived automatically. For solar filament detection, the Stabilized Inverse Diffusion Equation (SIDE) is used to enhance and sharpen filaments; a new method for automatic threshold selection is proposed to extract filaments from the background; and an SVM classifier with nine input features is used to differentiate between sunspots and filaments. Once a filament is identified, morphological thinning, pruning, and adaptive edge linking methods are applied to determine filament properties. Furthermore, a filament matching method is proposed to detect filament disappearance. The automatic detection and characterization of flares and filaments have been successfully applied to Hα full-disk images that are continuously obtained at Big Bear Solar Observatory (BBSO). For automatically detecting and classifying CMEs, image enhancement, segmentation, and pattern recognition techniques are applied to Large Angle Spectrometric Coronagraph (LASCO) C2 and C3 images. The processed LASCO and BBSO images are saved to a file archive, and the physical properties of detected solar features, such as intensity and speed, are recorded in our database. Researchers are able to access the solar feature database and analyze the solar data efficiently and effectively. The detection and characterization system greatly improves

Sampling techniques to detect airborne Salmonella species (spp.) in two pilot-scale broiler houses were compared. Broilers were inoculated at seven days of age with a marked strain of Salmonella enteritidis. The rearing cycle lasted 42 days during the summer. Airborne Salmonella spp. were sampled weekly using impaction, gravitational settling, and impingement techniques. Additionally, Salmonella spp. were sampled on feeders, drinkers, and walls, and in the litter. Environmental conditions (temperature, relative humidity, and airborne particulate matter (PM) concentration) were monitored during the rearing cycle. The presence of Salmonella spp. was determined by culture-dependent and molecular methods. No cultivable Salmonella spp. were recovered from the poultry houses' surfaces, the litter, or the air before inoculation. After inoculation, cultivable Salmonella spp. were recovered from the surfaces and the litter. Airborne cultivable Salmonella spp. were detected using impaction and gravitational settling one or two weeks after the detection of Salmonella spp. in the litter. No cultivable Salmonella spp. were recovered using impingement based on culture-dependent techniques. At low airborne concentrations, the use of impingement for the quantification or detection of cultivable airborne Salmonella spp. is not recommended; in these cases, a combination of culture-dependent and culture-independent methods is recommended. These data are valuable for improving current measures to control the transmission of pathogens in livestock environments and for optimising the sampling and detection of airborne Salmonella spp. under practical conditions.

A method of detecting multiple barcodes simultaneously using a time-division multiplexing technique is proposed in this paper to minimize the effective time needed for handling multiple barcode tags and to lessen the overall workload. Available barcode detection systems can handle multiple types of barcode, but only a single barcode at a time. This is inefficient and can create long queues and chaos in places like mega shopping malls or large warehouses where huge numbers of barcodes must be scanned daily. Our proposed system is expected to improve the real-time identification of goods or products on production lines and in automated warehouses or mega shopping malls in a much more convenient and efficient way. To identify multiple barcodes simultaneously, a particular arrangement and orientation of laser scanner and reflector are used with a special curve-shaped base on which the barcodes are placed. An effective and novel algorithm is developed to extract information from multiple barcodes; it introduces starting and ending patterns in the barcodes, with a bit stuffing technique, for the convenience of multiple detection. A CRC technique is also used to ensure trustworthy detection. The overall system enhances the existing single-barcode detection approach considerably, while remaining easy to implement and use.
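The bit stuffing and CRC steps mentioned above can be sketched with their textbook forms: stuffing inserts a 0 after every run of five 1s so the start/end patterns stay unique inside the payload, and the CRC is the remainder of a polynomial division over GF(2). The run length and polynomial below are illustrative assumptions, not taken from the paper:

```python
def bit_stuff(bits, run=5):
    """Insert a 0 after every run of `run` consecutive 1s."""
    out, count = [], 0
    for b in bits:
        out.append(b)
        count = count + 1 if b == 1 else 0
        if count == run:
            out.append(0)
            count = 0
    return out

def bit_unstuff(bits, run=5):
    """Undo bit_stuff: drop the 0 that follows each run of `run` 1s."""
    out, count, i = [], 0, 0
    while i < len(bits):
        out.append(bits[i])
        count = count + 1 if bits[i] == 1 else 0
        if count == run:
            i += 1          # skip the stuffed 0
            count = 0
        i += 1
    return out

def crc_remainder(bits, poly):
    """CRC check bits: remainder of long division over GF(2)."""
    work = bits + [0] * (len(poly) - 1)   # append space for the check field
    for i in range(len(bits)):
        if work[i]:
            for j, p in enumerate(poly):
                work[i + j] ^= p
    return work[-(len(poly) - 1):]

frame = [1, 1, 1, 1, 1, 1, 0, 1]
assert bit_unstuff(bit_stuff(frame)) == frame   # round trip is lossless
# classic example: message 11010011101100 with poly x^3 + x + 1 (1011)
print(crc_remainder([1, 1, 0, 1, 0, 0, 1, 1, 1, 0, 1, 1, 0, 0], [1, 0, 1, 1]))  # → [1, 0, 0]
```

The receiver appends the check bits to the message and recomputes the remainder; a nonzero result flags a misread barcode.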

We present a new cluster detection algorithm designed for finding high-redshift clusters using optical/infrared imaging data. The algorithm has two main characteristics. First, it utilises each galaxy's full redshift probability function, instead of an estimate of the photometric redshift based on the peak of the probability function and an associated Gaussian error. Second, it identifies cluster candidates by cross-checking the results of two substantially different selection techniques (the name 2TecX representing the cross-check of the two techniques). These are adaptations of the Voronoi Tessellations and Friends-Of-Friends methods. Monte Carlo simulations of mock catalogues show that cross-checking the cluster candidates found by the two techniques significantly reduces the detection of spurious sources. Furthermore, we examine the selection effects and the relative strengths and weaknesses of each method. The simulations also allow us to fine-tune the algorithm's parameters, and define completeness an...

Databases driven by interactive web applications are at risk of SQL injection attacks (SQLIA): these applications accept user inputs and use them to form SQL statements. During the SQL injection process, the attacker inputs malicious SQL query segments which result in different database requests. SQLIA can be used to bypass authentication control and to extract and/or modify valuable information. To counter such threats, different techniques have been proposed by researchers, but most of the implemented approaches that use an anomaly detection model have very high false alert rates. In this paper we analyze existing detection techniques that use the Chi-square test, and we evaluate these techniques against SQLIA.

Detecting abnormal user activity on social network websites could prevent cyber-crime. Previous research focused on data mining, while this research is based on the user behavior process. In this study, the first step is defining a normal user behavioral pattern and the second step is detecting abnormal behavior. These two steps are applied to a case study that includes real and synthetic data sets to obtain more tangible results. The technique chosen to define the pattern is process mining, which relies on an affordable, complete and noise-free event log. The proposed model discovers normal behavior using a genetic process mining technique, and abnormal activities are detected by the fitness function, which is based on Petri net rules. Although applying genetic mining is a time-consuming process, it can overcome the risks of noisy data and produces a comprehensive normal model in Petri net representation form.

Skin cancer is the most common cancer in the Western world. In order to accurately detect the disease, especially malignant melanoma, the most fatal form of skin cancer, at an early stage when the prognosis is excellent, there is an urgent need to develop noninvasive early detection methods. We believe that polarization speckle patterns, defined as the spatial distribution of the depolarization ratio of traditional speckle patterns, can be an important tool for skin cancer detection. To demonstrate our technique, we conducted a large in vivo clinical study of 214 skin lesions, and show that statistical moments of the polarization speckle pattern can differentiate different types of skin lesions, including three common types of skin cancer (malignant melanoma, squamous cell carcinoma, and basal cell carcinoma) and two benign lesions (melanocytic nevus and seborrheic keratosis). In particular, the fourth-order moment achieves better or similar sensitivity and specificity compared with many well-known and accepted optical techniques used to differentiate melanoma and seborrheic keratosis.
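The fourth-order moment used as the discriminating statistic is, in standardized form, a kurtosis: the fourth central moment of the depolarization-ratio distribution normalized by the squared variance. A minimal sketch with synthetic numbers (not clinical data):

```python
def standardized_moment(xs, k):
    """k-th central moment normalized by sigma^k (k = 4 gives kurtosis)."""
    n = len(xs)
    mu = sum(xs) / n
    var = sum((x - mu) ** 2 for x in xs) / n
    return sum((x - mu) ** k for x in xs) / n / var ** (k / 2)

# a heavy-tailed pattern scores higher than a flat two-level pattern
print(standardized_moment([0, 0, 0, 0, 10], 4))   # → 3.25
print(standardized_moment([1, -1, 1, -1], 4))     # → 1.0
```

Higher values indicate rare, extreme depolarization values in the speckle pattern, which is the kind of texture difference the study exploits between lesion types.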

This paper presents a novel input-parasitic compensation (IPC) technique for a nanopore-based complementary metal-oxide-semiconductor (CMOS) DNA detection sensor. A resistive-feedback transimpedance amplifier is typically adopted as the headstage of a DNA detection sensor to amplify the minute ionic currents generated from a nanopore and convert them to a readable voltage range for digitization. However, parasitic capacitances arising from the headstage input and the nanopore often cause headstage saturation during nanopore sensing, resulting in significant DNA data loss. To compensate for this unwanted saturation, we propose an area-efficient and automated IPC technique, customized for a low-noise DNA detection sensor fabricated using a 0.35-μm CMOS process; we demonstrated this prototype in a benchtop test using an α-hemolysin (α-HL) protein nanopore.

A field facility has been developed to allow controlled studies of near-surface CO2 transport and detection technologies. The key component of the facility is a shallow horizontal well, slotted over 70 m of its length and divided into seven zones via packers, with mass flow control in each individual zone. The scale and fluxes were designed to address large-scale CO2 storage projects and the desired retention rates for those projects; these design parameters will be discussed. A wide variety of detection techniques were deployed by collaborators from Los Alamos National Lab, Lawrence Berkeley National Lab, the National Energy Technology Lab, Pacific Northwest National Lab, Lawrence Livermore National Lab and West Virginia University. Techniques included eddy covariance, soil gas measurements, hyperspectral imaging for plant stress detection, differential absorption LIDAR (both free-space atmospheric and below-surface soil gas), tracer studies, water sampling, stable isotope studies, and soil flux chambers. An overview of these results will be presented.

Pedicle screw fixation has achieved significant popularity amongst spinal surgeons for both single- and multi-level spinal fusion. Misplacement and pedicle cortical violation occur in over 20% of screw placements and can result in complications such as dysesthesia, paraparesis or paraplegia. There have been many advances in techniques for navigating through the pedicle; however, these techniques are not without drawbacks. A new electrical conductivity-measuring device, previously evaluated on the porcine model for detecting pedicle violation, was evaluated at nine European hospitals in conjunction with the methods currently used in each centre. This new device is based on two original principles: the device is integrated in the drilling or screwing tool, and the technology allows real-time detection of perforation through two independent parameters, impedance variation and evoked muscle contractions. Data were collected in two stages. Initially, the surgeon was given the device and a comparison was made between the device's ability to detect a breach and the surgeon's ability to detect one using traditional methods of pedicle preparation. In the second module of the study, the surgeon was limited to using the electrical conductivity detection device as the sole guide to detect pedicle breaches, and a comparison was made between the detection ability of the device and the other detection possibilities. Post-operative fine-cut CT scanning was used to detect pedicle breaches. Overall, the 11 trial surgeons performed a total of 521 pedicle drillings on 97 patients. Initially there were 147 drillings with 23 breaches detected; the detection rate for these breaches was 22/23 for the device compared to 10/23 for the surgeon. Over both parts of the study, 64 breaches (12.3%) were confirmed on post-operative CT imaging. The electrical conductivity detection device detected 63 of the 64 breaches (98.4%). There was one false negative and four false

Breath analysis, a promising new field of medicine and medical instrumentation, potentially offers noninvasive, real-time, and point-of-care (POC) disease diagnostics and metabolic status monitoring. Numerous breath biomarkers have been detected and quantified so far using the GC-MS technique. Recent advances in laser spectroscopic techniques and laser sources have driven breath analysis to new heights, moving from laboratory research to commercial reality. Laser spectroscopic detection techniques not only offer the high sensitivity and high selectivity of MS-based techniques, but also the advantageous features of near-real-time response, low instrument cost, and POC operation. Of the approximately 35 established breath biomarkers, such as acetone, ammonia, carbon dioxide, ethane, methane, and nitric oxide, 14 species in exhaled human breath have been analyzed by high-sensitivity laser spectroscopic techniques, namely tunable diode laser absorption spectroscopy (TDLAS), cavity ringdown spectroscopy (CRDS), integrated cavity output spectroscopy (ICOS), cavity enhanced absorption spectroscopy (CEAS), cavity leak-out spectroscopy (CALOS), photoacoustic spectroscopy (PAS), quartz-enhanced photoacoustic spectroscopy (QEPAS), and optical frequency comb cavity-enhanced absorption spectroscopy (OFC-CEAS). Spectral fingerprints of the measured biomarkers span from the UV to the mid-IR spectral regions, and the detection limits achieved by the laser techniques range from parts-per-million to parts-per-billion levels. Sensors using laser spectroscopic techniques for a few breath biomarkers, e.g., carbon dioxide and nitric oxide, are commercially available. This review presents an update on the latest developments in laser-based breath analysis.
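The parts-per-billion limits quoted for the cavity-based techniques (CRDS, ICOS, CEAS) stem from the enormous effective path length a high-finesse cavity provides: light bounces between mirrors of reflectivity R roughly 1/(1 − R) times, multiplying the Beer-Lambert absorption path. A back-of-the-envelope sketch (the cavity length and reflectivity are illustrative assumptions, not values from the review):

```python
def effective_path_m(cavity_length_m, mirror_reflectivity):
    # mean number of passes in a high-finesse cavity scales as ~1/(1 - R),
    # so the effective absorption path is roughly L / (1 - R)
    return cavity_length_m / (1 - mirror_reflectivity)

# a 0.5 m cavity with R = 0.9999 mirrors behaves like ~5 km of path,
# which is what pushes absorption sensitivity into the ppb regime
print(effective_path_m(0.5, 0.9999))  # ~5000 m
```

The same small fractional absorption per pass, accumulated over thousands of passes, becomes a readily measurable total attenuation or ringdown-time change.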

Kepler planet candidates require both spectroscopic and imaging follow-up observations to rule out false positives and detect blended stars. Traditionally, spectroscopy and high-resolution imaging have probed different host-star companion parameter spaces, the former detecting tight binaries and the latter detecting wider bound companions as well as chance background stars. In this paper, we examine a sample of 11 Kepler host stars with companions detected by two techniques: near-infrared adaptive optics and/or optical speckle interferometry imaging, and a new spectroscopic deblending method. We compare the companion effective temperatures (Teff) and flux ratios (FB/FA, where A is the primary and B is the companion) derived from each technique and find no cases where both companion parameters agree within 1σ errors. In 3/11 cases the companion Teff values agree within 1σ errors, and in 2/11 cases the companion FB/FA values agree within 1σ errors. Examining each Kepler system individually, considering multiple avenues (isochrone mapping, contrast curves, probability of being bound), we suggest two cases in which the techniques most likely agree in their companion detections (detect the same companion star). Overall, our results support the advantage that the spectroscopic deblending technique has for finding very close-in companions (θ ≲ 0.02″-0.05″) that are not easily detectable with imaging. However, we also specifically show how high-contrast AO and speckle imaging observations detect companions at larger separations (θ ≥ 0.02″-0.05″) that are missed by the spectroscopic technique, provide additional information for characterizing the companion and its potential contamination (e.g., position angle, separation, magnitude differences), and cover a wider range of primary star effective temperatures. The investigation presented here illustrates the utility of combining the two techniques to reveal higher
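The 1σ agreement test used to compare the two techniques can be written down explicitly; one common convention is that two measurements agree when their difference is within the quadrature sum of the individual errors. A sketch under that convention (the values are illustrative, not from the paper's tables):

```python
def agree_1sigma(a, sigma_a, b, sigma_b):
    """True if measurements a +/- sigma_a and b +/- sigma_b agree within
    their combined (quadrature) 1-sigma uncertainty."""
    return abs(a - b) <= (sigma_a ** 2 + sigma_b ** 2) ** 0.5

# two hypothetical Teff estimates for the same companion, in kelvin
print(agree_1sigma(4600, 150, 4800, 120))  # → False (200 K vs ~192 K combined)
print(agree_1sigma(4600, 150, 4750, 120))  # → True
```

Applied to each of the 11 systems for both Teff and FB/FA, a tally of such pass/fail results yields agreement fractions like the 3/11 and 2/11 reported above.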

With the explosive growth of network technologies, insider attacks have become a major concern for business operations that rely largely on computer networks. To better detect insider attacks that marginally manipulate network traffic over time, and to recover the system from attacks, in this paper we implement a temporal detection scheme using the sequential hypothesis testing technique. Two hypothetical states are considered: the null hypothesis that the collected information comes from benign historical traffic, and the alternative hypothesis that the network is under attack. The objective of the detection scheme is to recognize the change within the shortest time by comparing the two defined hypotheses. In addition, once an attack is detected, a server-migration-based system recovery scheme can be triggered to restore the system to its state prior to the attack. To understand the mitigation of insider attacks, a multi-functional web display of the detection analysis was developed for real-time analytics. Experiments using real-world traffic traces evaluate the effectiveness of the Detection System and Recovery (DeSyAR) scheme. The evaluation data validate that the detection scheme based on sequential hypothesis testing and the server-migration-based recovery scheme perform well in effectively detecting insider attacks and recovering the system under attack.
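Wald's sequential probability ratio test, the classical form of the sequential hypothesis testing used above, can be sketched in a few lines: accumulate the log-likelihood ratio of each new observation and stop as soon as it crosses either threshold, which is what gives the shortest-time detection property. The error rates and the observation stream below are illustrative assumptions, not parameters from the paper:

```python
import math

def sprt(log_lrs, alpha=0.01, beta=0.01):
    """Wald's sequential probability ratio test on a stream of per-observation
    log-likelihood ratios (H1 = attack vs H0 = benign traffic)."""
    upper = math.log((1 - beta) / alpha)    # crossing up: accept H1 (attack)
    lower = math.log(beta / (1 - alpha))    # crossing down: accept H0 (benign)
    s = 0.0
    for n, llr in enumerate(log_lrs, 1):
        s += llr
        if s >= upper:
            return "attack", n
        if s <= lower:
            return "benign", n
    return "undecided", len(log_lrs)

# observations that consistently favor H1 trigger detection after a few samples
print(sprt([1.2] * 10))  # → ('attack', 4)
```

The thresholds depend only on the target false-alarm rate α and miss rate β, so the test trades detection delay directly against those error rates.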

Osteoporosis (OP), a bone disease, is very serious, particularly for elderly persons, and may lead to immobility and death. Early detection of the disease is the first consideration, giving patients more options to live a healthy life. Biomarkers, or bone markers, present a promising challenge in clinical proteomics for early disease detection. In this paper, optical techniques such as Fourier transform infrared (FTIR) spectroscopy and UV/visible spectroscopy are employed to find bone markers, with emphasis given to noninvasive modalities for early detection of osteoporosis. Blood plasma samples procured from two groups, patients and healthy persons, were tested. Both optical techniques revealed obvious differences between the spectra of the two groups, for example, an increase in intensity for OP persons. New peaks were found at 1646, 1540, 1456 and 1077 cm-1 in the FTIR spectra. Except at 1588 cm-1, we observed a decrease in spectral intensity for OP persons. In the UV/visible spectroscopy results, new peaks appeared in the OP patients' spectra at wavelengths of 279 nm and 414 nm. These differences in the spectra of the two types of samples allow rapid and cost-effective discrimination of potential patients with the optical techniques, which were verified by bone densitometry in hospitals. The technique is quick, reliable and effective.

Flea infestation is diagnosed by detecting either adult parasites or flea faeces in the fur. The latter is generally tested with the wet blotting paper technique (WBPT). However, microscopical examination (MT) of the coat-brushing material is sometimes suggested as an alternative. This study aimed to compare the efficiency of the two techniques. In dogs, the entire body was hand-brushed; cats were combed. One half of the collected material was mounted in liquid paraffin on a glass slide and examined microscopically at low magnification. The second half was placed on blotting paper and sterile water was added. After drying, reddish aureoles were counted. 255 animals (158 dogs and 97 cats) were included. 119 (47%) and 94 (37%) samples were revealed to be positive with WBPT and MT, respectively. 13 cases (5%) were positive with MT only and 38 cases (15%) were positive with WBPT only. 81 cases (32%) were positive and 123 (48%) were negative with both techniques. More positive cases were detected by WBPT than MT (P<0.001). Amongst the 51 samples found positive with a single technique, infestation was considered low in 43 cases, and WBPT detected significantly more positive samples (31) than MT (12) (P<0.01).

For the first time, to the best of our knowledge, a direct-detection Brillouin optical time-domain reflectometry (BOTDR) system is demonstrated for fast distributed dynamic strain sensing, incorporating a double-edge technique, a time-division multiplexing technique and an upconversion technique. To guarantee the robust stability of the system, the double-edge technique is implemented using a single-channel Fabry-Perot interferometer (FPI) and a fiber-coupled upconversion single-photon detector, incorporating a time-division multiplexing method. The upconversion single-photon detector upconverts the backscattered photons from 1548.1 nm to 863 nm, and they are subsequently detected by a silicon avalanche photodiode (Si-APD). In the experiment, dynamic strain disturbance of up to 1.9 mε over 1.5 km of a polarization-maintaining fiber is detected at a sampling rate of 30 Hz. An accuracy of ±30 με and a spatial resolution of 0.6 m are realized.

A wormhole attack is a challenging security threat to wireless sensor networks which disrupts most routing protocols, as it can be triggered in different modes. In this paper, WRHT, a wormhole-resistant hybrid technique, is proposed, which can detect the presence of a wormhole attack more reliably than earlier techniques. WRHT is based on the concepts of the watchdog and Delphi schemes and ensures that a wormhole will not be left untreated in the sensor network. WRHT makes use of a dual wormhole detection mechanism, calculating the time delay probability and packet loss probability of the established path in order to find the wormhole presence probability. The nodes in the path are given different rankings, and subsequently colors, according to their behavior. The most striking feature of WRHT is its capacity to defend against almost all categories of wormhole attack without depending on additional hardware such as a global positioning system, timing information or synchronized clocks, or on traditional cryptographic schemes demanding high computational resources. The experimental results clearly indicate that the proposed technique offers significant improvement over existing wormhole attack detection techniques.
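
The dual-probability idea can be sketched as follows. The combination rule below (noisy-OR, which assumes the two indicators are independent) is an illustration only; the paper's exact formula for the wormhole presence probability may differ:

```python
# Hedged sketch of WRHT's dual mechanism: fuse a path's time-delay
# probability and packet-loss probability into one wormhole presence
# probability (noisy-OR combination, an assumption of this sketch).

def wormhole_presence_probability(p_time_delay, p_packet_loss):
    return 1 - (1 - p_time_delay) * (1 - p_packet_loss)

p = wormhole_presence_probability(0.6, 0.5)
print(p)  # 0.8 -> either indicator alone raises the presence probability
```

Under this rule the fused probability is never lower than either individual indicator, so a path flagged strongly by only one mechanism is still ranked as suspicious.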

Cervical cancer is the third most common form of cancer affecting women, especially in third-world countries. The predominant reason for such an alarming death rate is primarily a lack of awareness and proper health care. As prevention is better than cure, a better strategy has to be put in place to screen a large number of women so that early diagnosis can help save their lives. One such strategy is to implement an automated system. For an automated system to function properly, a proper set of features has to be extracted so that cancer cells can be detected efficiently. In this paper we compare the performance of detecting a cancer cell using a single feature versus a combined feature set, to see which better suits an automated system in terms of detection rate. To this end each cell is segmented using the multiscale morphological watershed segmentation technique and a series of features is extracted. This process is performed on 967 images and the extracted data are subjected to data mining techniques to determine which feature is best for which stage of cancer. The results clearly show a higher percentage of success for the combined feature set, with a 100% accurate detection rate.
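
The single-feature vs combined-feature-set comparison can be illustrated with a toy threshold classifier. The feature names, values and thresholds below are hypothetical, not the paper's 967-image data:

```python
# Toy comparison: classify cells with one feature vs a feature set.
# A single feature can leave positives and negatives inseparable;
# combining features can restore separability (hypothetical data).

def accuracy(cells, rule):
    return sum(rule(c) == c["label"] for c in cells) / len(cells)

cells = [
    {"area": 120, "intensity": 0.9, "label": 1},  # cancerous
    {"area": 90,  "intensity": 0.8, "label": 1},  # cancerous
    {"area": 110, "intensity": 0.1, "label": 0},  # normal
    {"area": 50,  "intensity": 0.2, "label": 0},  # normal
]

single = lambda c: int(c["area"] > 100)                          # one feature
combined = lambda c: int(c["area"] > 80 and c["intensity"] > 0.5)  # feature set

print(accuracy(cells, single))    # 0.5
print(accuracy(cells, combined))  # 1.0
```

The combined rule reaches 100% on this toy set, mirroring the qualitative outcome the abstract reports for the combination feature set.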

Wireless Sensor Network based smart homes have the potential to meet the growing challenges of independent living for elderly people. However, wellness detection of elderly people in smart homes is still a challenging research domain. Many researchers have proposed techniques; however, the majority of these do not provide a comprehensive solution, because complex activities cannot be determined easily and comprehensive wellness is difficult to diagnose. In this study's critical review, it was observed that strong associations exist among the vital wellness determination parameters. In this paper, after analyzing existing techniques, an association-rules-based model is proposed for the recognition of simple and complex (overlapped) activities and for a comprehensive wellness detection mechanism. It considers vital wellness detection parameters (temporal association of sub-activity location and sub-activity, time gaps between two adjacent activities, and temporal association of inter- and intra-activities). Activity recognition and wellness detection are performed on the basis of extracted temporal association rules and an expert knowledge base. A learning component is an important module of the proposed model, accommodating changing trends in the frequent-pattern behavior of an elderly person and recommending that a caregiver/expert adjust the expert knowledge base according to the abnormalities found.
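
A temporal association rule of the kind the model relies on, such as "sub-activity location → sub-activity", can be scored by its confidence over an observation log. The log entries below are hypothetical; the paper's rule-mining procedure is more elaborate:

```python
# Minimal sketch of scoring an association rule location -> activity
# by confidence = count(location, activity) / count(location),
# over a toy observation log (hypothetical data).

from collections import Counter

log = [
    ("kitchen", "cooking"), ("kitchen", "cooking"),
    ("kitchen", "washing"), ("bedroom", "sleeping"),
]

pair_counts = Counter(log)
loc_counts = Counter(loc for loc, _ in log)

def rule_confidence(location, activity):
    return pair_counts[(location, activity)] / loc_counts[location]

print(round(rule_confidence("kitchen", "cooking"), 3))  # 0.667
```

Rules whose confidence exceeds a chosen threshold would be kept in the knowledge base; a drop in confidence for a previously strong rule is the kind of abnormality the learning component would flag.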

(Abbreviated) Kepler planet candidates require both spectroscopic and imaging follow-up observations to rule out false positives and detect blended stars. [...] In this paper, we examine a sample of 11 Kepler host stars with companions detected by two techniques: near-infrared adaptive optics and/or optical speckle interferometry imaging, and a new spectroscopic deblending method. We compare the companion Teff and flux ratios (F_B/F_A, where A is the primary and B is the companion) derived from each technique, and find no cases where both companion parameters agree within 1-sigma errors. In 3/11 cases the companion Teff values agree within 1-sigma errors, and in 2/11 cases the companion F_B/F_A values agree within 1-sigma errors. Examining each Kepler system individually along multiple avenues (isochrone mapping, contrast curves, probability of being bound), we suggest two cases in which the techniques most likely agree in their companion detections (i.e., detect the same companion star). Overall, our results...

Cold-formed steel is uniform in quality, suitable for mass production, and light in weight. It is widely used for both structural and nonstructural members in buildings. When it is used in a bending structural member, damage such as local buckling is considered more important than in general steel members in terms of failure mode. However, preceding studies on damage detection did not consider the failure characteristics of cold-formed beam members. Hence, this paper proposes a damage detection technique that considers the local-buckling failure mode of a cold-formed beam member. The differences between the dynamic characteristics from vibration-based measurements and those from a finite element model are set as error functions. The error functions are minimized by the optimization technique NSGA-II. In the damage detection, the location and severity of local damage are treated as variables. The proposed technique was validated through a simulated damage detection for an example cold-formed steel beam structure.

Conflict detection and resolution in collaborative design is a key issue in maintaining multi-disciplinary design consistency. This paper proposes a new method for conflict detection and resolution based on the constraint satisfaction technique. The representation of design constraints, the interval arithmetic of the constraint satisfaction problem (CSP), and the conflict resolution strategy based on constraint relaxation and adjustment are introduced. A constraint-satisfaction-based conflict detection and resolution tool, CSCDR, is then developed. It can help collaborative designers detect and resolve conflicts in time in the early stage of the design process, so that unnecessary design iteration and repeated negotiation are avoided and design efficiency is much improved. A design case illustrates the effectiveness of CSCDR.
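
The interval-arithmetic side of the approach can be sketched simply: two disciplines each constrain a shared design variable to an interval, and an empty intersection signals a conflict to be resolved by relaxation. The variable and the numbers below are hypothetical:

```python
# Sketch of interval-based conflict detection: a conflict exists when
# the interval constraints on a shared variable have no intersection
# (illustrative values; CSCDR's actual representation is richer).

def intersect(a, b):
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    return (lo, hi) if lo <= hi else None

structural = (10.0, 14.0)   # allowed shaft diameter in mm (hypothetical)
thermal    = (15.0, 20.0)

print(intersect(structural, thermal))       # None -> conflict detected
print(intersect(structural, (12.0, 18.0)))  # (12.0, 14.0) after relaxing thermal
```

Constraint relaxation then amounts to widening one party's interval until the intersection becomes non-empty, which is the resolution strategy the abstract describes.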

In this paper, the problem of automatic detection of tuberculosis lesions on 3D lung CT images is considered as a benchmark for testing algorithms based on the modern concept of Deep Learning. For training and testing of the algorithms, a domestic dataset of 338 3D CT scans of tuberculosis patients with manually labelled lesions was used. Algorithms based on deep convolutional networks were implemented and applied in three different ways: slice-wise lesion detection in 2D images using semantic segmentation, slice-wise lesion detection in 2D images using a sliding-window technique, and straightforward detection of lesions via semantic segmentation in whole 3D CT scans. The algorithms demonstrate superior performance compared to algorithms based on conventional image analysis methods.

In this study the development of a methodology to detect illicit drugs and plastic explosives is described, with the objective of application in public security. To this end, non-destructive assay with neutrons was used; the technique applied was real-time neutron radiography together with computerized tomography. The system is endowed with automatic responses based upon the application of an artificial intelligence technique. In tests using real samples, the system proved capable of identifying 97% of the inspected materials.

This paper describes three different techniques to detect proximity to voltage collapse (saddle-node bifurcation) in AC/DC systems: direct (point of collapse) methods, continuation methods, and transient energy function methods. These techniques are compared with one another, and also against the voltage sensitivity factor method, using a 173-bus AC/DC test system. Various operational conditions are considered for this test system, so that transfer capabilities between areas and the effects of HVDC links can be analysed from the point of view of voltage collapse.

Cyclic nucleotides cAMP and cGMP are ubiquitous second messengers which regulate myriad functions in virtually all eukaryotic cells. Their intracellular effects are often mediated via discrete subcellular signaling microdomains. In this review, we discuss state-of-the-art techniques to measure cAMP and cGMP in biological samples, with a particular focus on live-cell imaging approaches, which allow their detection with high temporal and spatial resolution in living cells and tissues. Finally, we describe how these techniques can be applied to the analysis of second messenger dynamics in subcellular signaling microdomains.

Data mining is a valuable tool in meteorological applications. Properly selected data mining techniques enable researchers to process and analyze massive amounts of data collected by satellites and other instruments. Large spatial-temporal datasets can be analyzed using different linear and nonlinear methods. The Self-Organizing Map (SOM) is a promising tool for clustering and visualizing high-dimensional data and for mapping spatial-temporal datasets describing nonlinear phenomena. We present results of the application of the SOM technique to regions of interest within the European re-analysis dataset. The possibility of detecting climate change signals through the visualization capability of SOM tools is examined.
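
The core of the SOM technique is a competitive update: find the best-matching unit (BMU) for an input, then pull the BMU and its neighbours toward that input. A minimal one-dimensional sketch (toy weights and learning rate, not the paper's configuration):

```python
# One training step of a 1-D self-organizing map: locate the BMU and
# move all units within the neighbourhood radius toward the input.

def som_step(weights, x, lr=0.5, radius=1):
    bmu = min(range(len(weights)), key=lambda i: abs(weights[i] - x))
    for i in range(len(weights)):
        if abs(i - bmu) <= radius:
            weights[i] += lr * (x - weights[i])
    return weights

w = som_step([0.0, 0.5, 1.0], 0.9)
print(w)  # the two units nearest the input have moved toward 0.9
```

Repeating this step over a dataset, while shrinking the learning rate and radius, yields the topology-preserving map used for clustering and visualizing the re-analysis data.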

...impacts of various origin, such as from loose covers, can be generated, complicating the assessment of the impact nature. In this work, detection of pitch issues is performed by analysing vibration impacts from main bearing accelerometers and applying environmental noise and speech recognition techniques... It is found that the frequency band of maximum crest factor presents the best classification performance employing K-means clustering, an unsupervised clustering technique. The highest correct classification rate...

The present work focuses on a new variant of double-pulse laser-induced breakdown spectroscopy (DP-LIBS), called Townsend effect plasma spectroscopy (TEPS), for standoff applications. In the TEPS technique, the atomic and molecular emission lines are enhanced by a factor of 25 to 300 over LIBS, depending upon the emission lines observed. As a result, it is possible to extend the range of laser-induced plasma techniques beyond LIBS and DP-LIBS for the detection of CBRNE materials at distances of several meters.

We report the detection of a faint near-Earth asteroid (NEA), made using our synthetic tracking technique and the CHIMERA instrument on the Palomar 200-inch telescope. This asteroid, with an apparent magnitude of 23, was moving at 5.97 degrees per day and was detected at a signal-to-noise ratio (SNR) of 15 using 30 s of data taken at a 16.7 Hz frame rate. The detection was confirmed by a second observation one hour later at the same SNR. The asteroid moved 7 arcseconds across the sky over the 30 s of integration time because of its high proper motion. Synthetic tracking with 16.7 Hz frames avoided the trailing loss suffered by conventional techniques relying on 30-second exposures, which would degrade the surface brightness of the image on the CCD to an approximate magnitude of 25. This detection resulted from a 12-hour blind search conducted on the Palomar 200-inch telescope over two nights, September 11 and 12, 2013, scanning twice over six 5.0 deg x 0.043 deg fields. The fact that we detected only one NEA, ...
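
The essence of synthetic tracking is shift-and-add: short-exposure frames are shifted along a trial velocity before summing, so a fast mover adds coherently instead of trailing across pixels. A one-dimensional toy sketch (hypothetical pixel values):

```python
# Shift-and-add sketch: sum frames after shifting each by the trial
# velocity times its frame index. The correct velocity concentrates the
# moving source into one pixel; zero velocity leaves it trailed.

def shift_and_add(frames, velocity):
    n = len(frames[0])
    stacked = [0] * n
    for t, frame in enumerate(frames):
        shift = round(velocity * t)
        for i in range(n):
            j = i + shift
            if 0 <= j < n:
                stacked[i] += frame[j]
    return stacked

# A faint source moving one pixel per frame over three frames
frames = [
    [0, 5, 0, 0],
    [0, 0, 5, 0],
    [0, 0, 0, 5],
]
print(shift_and_add(frames, 1))  # [0, 15, 0, 0]: coherent sum, full SNR
print(shift_and_add(frames, 0))  # [0, 5, 5, 5]: trailed, SNR spread out
```

A blind search evaluates a grid of trial velocities and keeps the hypothesis that maximizes the stacked SNR, which is how the 5.97 deg/day mover was recovered without trailing loss.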

A major UK initiative, entitled 'Mapping the Underworld' aims to develop and prove the efficacy of a multi-sensor device for accurate remote buried utility service detection, location and, where possible, identification. One of the technologies to be incorporated in the device is low-frequency vibro-acoustics; the application of this technology for detecting buried infrastructure, in particular pipes, is currently being investigated. Here, a shear wave ground vibration technique for detecting buried pipes is described. For this technique, shear waves are generated at the ground surface, and the resulting ground surface vibrations measured. Time-extended signals are employed to generate the illuminating wave. Generalized cross-correlation functions between the measured ground velocities and a reference measurement adjacent to the excitation are calculated and summed using a stacking method to generate a cross-sectional image of the ground. To mitigate the effects of other potential sources of vibration in the vicinity, the excitation signal can be used as an additional reference when calculating the cross-correlation functions. Measurements have been made at two live test sites to detect a range of buried pipes. Successful detection of the pipes was achieved, with the use of the additional reference signal proving beneficial in the noisier of the two environments.

Objective: To compare the analytical sensitivity and specificity of a newly described DNA amplification technique, LAMP, and a nested PCR assay targeting the RE and B1 genes for the detection of Toxoplasma gondii (T. gondii) DNA. Methods: The analytical sensitivity of LAMP and nested PCR was obtained against 10-fold serial dilutions of T. gondii DNA ranging from 1 ng to 0.01 fg. DNA samples of other parasites and human chromosomal DNA were used to determine the specificity of the molecular assays. Results: After testing LAMP and nested PCR in duplicate, the detection limits of the RE-LAMP, B1-LAMP, RE-nested PCR and B1-nested PCR assays were 1 fg, 100 fg, 1 pg and 10 pg of T. gondii DNA, respectively. All the LAMP assays and nested PCRs were 100% specific. The RE-LAMP assay revealed the highest sensitivity for the detection of T. gondii DNA. Conclusions: The obtained results demonstrate that the LAMP technique has a greater sensitivity for detection of T. gondii. Furthermore, these findings indicate that primers based on the RE are more suitable than those based on the B1 gene. However, the B1-LAMP assay has potential as a diagnostic tool for detection of T. gondii.

At present, researchers in countries with developed meat industries are studying and applying nondestructive testing techniques to detect meat quality. In our country, these techniques are seldom used, apart from some based on electromagnetic properties. This paper introduces some modern nondestructive testing techniques for quality detection of meat, such as ultrasonic waves, the electronic nose, electromagnetic methods, near infrared, Raman spectroscopy and computer vision technology. These techniques can meet the requirements of high-speed, high-accuracy online detection of meat. Further research and possible applications are also discussed.

The following paper is a summary of a number of techniques initiated under the AgRISTARS (Agriculture and Resources Inventory Surveys Through Aerospace Remote Sensing) project for the detection of soil degradation caused by water erosion and the identification of soil conservation practices for resource inventories. Discussed are methods to utilize a geographic information system to determine potential soil erosion through a USLE (Universal Soil Loss Equation) model; the application of the Kauth-Thomas transform to detect present erosional status; and the identification of conservation practices through visual interpretation and a variety of enhancement procedures applied to digital remotely sensed data.

A copy-move forged image results from a specific type of image tampering carried out by copying a part of an image and pasting it onto one or more parts of the same image, generally to maliciously hide unwanted objects/regions or to clone an object. Detecting such forgeries therefore mainly consists in devising ways of exposing identical or relatively similar areas in images. This survey attempts to cover existing partition-based copy-move forgery detection techniques.

The worldwide need to replace 3He for neutron detection has triggered research and development on new technologies and methods. A promising one is based on commercial solid state silicon detectors coupled with thin neutron converter layers containing 6Li. After proving the feasibility of this technique, we characterized the behavior of such a detector with different converter layer thicknesses. In this paper we also disentangle other contributions to the overall spectrum shape observed with this kind of detector, proving that its detection efficiency can be made reasonably high and that the gamma/neutron discrimination capability is comparable to that of 3He tubes.

A field facility has been developed to allow controlled studies of near-surface CO2 transport and detection technologies. The key component of the facility is a shallow, slotted horizontal well divided into six zones. The scale and fluxes were designed to address large-scale CO2 storage projects and the desired retention rates for those projects. A wide variety of detection techniques were deployed by collaborators from six national labs, two universities, EPRI, and the USGS. Additionally, modeling of CO2 transport and concentrations in the saturated soil and in the vadose zone was conducted. An overview of these results will be presented.

The photo-stimulated luminescence (PSL) technique was applied to detect irradiated black pepper (Piper nigrum), cinnamon (Cinnamomum verum) and turmeric (Curcuma longa) after dark storage for 1 day, 3 months and 6 months. Using screening and calibrated PSL, all samples were correctly discriminated between non-irradiated spices and spices irradiated with doses of 1, 5 and 10 kGy. The PSL photon counts (PCs) of irradiated spices increased with increasing dose, with turmeric showing the highest sensitivity index to irradiation compared to black pepper and cinnamon. The differences in response are possibly attributed to the varying quantity and quality of silicate minerals present in each spice sample. PSL signals of all irradiated samples reduced after 3 and 6 months of storage. The results of this study provide a useful database on the applicability of the PSL technique for the detection of Malaysian irradiated spices.
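
Screening PSL classifies a sample by comparing its photon count against two thresholds. The 700/5000 count values below follow the commonly used EN 13751 screening convention and are an assumption here, since the abstract does not state the settings used:

```python
# Screening PSL decision sketch: counts below the lower threshold are
# negative, above the upper threshold positive, and in between the
# sample goes to calibrated PSL (thresholds per EN 13751, assumed).

def psl_screen(photon_count, t1=700, t2=5000):
    if photon_count < t1:
        return "negative (not irradiated)"
    if photon_count > t2:
        return "positive (irradiated)"
    return "intermediate (calibrated PSL needed)"

print(psl_screen(300))    # negative (not irradiated)
print(psl_screen(12000))  # positive (irradiated)
```

The dose-dependent rise in photon counts reported above is what pushes irradiated samples past the upper threshold, while signal fading over storage moves them toward the intermediate region.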

Wireless sensor networks have become a hot research area due to their wide range of applications in the military and civilian domains, but as they use wireless media for communication they are easily prone to security attacks. There are a number of attacks on wireless sensor networks, such as black hole, sink hole, Sybil and selective forwarding attacks. In this paper we concentrate on selective forwarding attacks, in which malicious nodes behave like normal nodes and selectively drop packets; the selection of dropped packets may be random. Identifying such attacks is very difficult and sometimes impossible. We list some detection techniques that have been proposed by different researchers in recent years, and we also present a tabular, qualitative analysis of these detection techniques.

In this study, we provide an extensive comparison of different clutter suppression techniques proposed to enhance ground penetrating radar (GPR) data. Unlike previous studies, we directly measure and present the effect of these preprocessing algorithms on detection performance. A basic linear prediction algorithm is selected as the detection scheme and applied to real GPR data after each of the available clutter suppression techniques. All methods are tested on an extensive data set of different surrogate mines and other objects commonly encountered under the ground. Among the algorithms, singular value decomposition based clutter suppression stands out with its superior performance and low computational cost, making it practical for real-time applications.
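
SVD-based clutter suppression exploits the fact that clutter (e.g. the ground bounce) is nearly identical across A-scans, so it dominates the first singular component of the B-scan matrix; subtracting that rank-1 component leaves the target response. The sketch below uses power iteration in place of a full SVD to stay dependency-free, and toy data rather than real GPR traces:

```python
# Remove the leading rank-1 (clutter) component of a B-scan matrix.
# Power iteration on B^T B finds the top right singular vector v;
# u = B v then carries the singular value, so u v^T is the rank-1 term.

def rank1_remove(B, iters=50):
    rows, cols = len(B), len(B[0])
    v = [1.0] * cols
    for _ in range(iters):
        u = [sum(B[i][j] * v[j] for j in range(cols)) for i in range(rows)]
        v = [sum(B[i][j] * u[i] for i in range(rows)) for j in range(cols)]
        norm = sum(x * x for x in v) ** 0.5
        v = [x / norm for x in v]
    u = [sum(B[i][j] * v[j] for j in range(cols)) for i in range(rows)]
    return [[B[i][j] - u[i] * v[j] for j in range(cols)] for i in range(rows)]

# Three A-scans sharing the same clutter; the middle one holds a target blip
clutter = [10.0, 10.0, 10.0]
bscan = [clutter[:], [10.0, 12.0, 10.0], clutter[:]]
cleaned = rank1_remove(bscan)
# The residual is largest at the target position, scan 1 / sample 1,
# while the dominant uniform clutter has been suppressed.
```

On real data the same subtraction is applied columnwise across hundreds of A-scans, and more than one leading component may be removed.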

The recent paper by Simpson et al. [Remote Sens. Environ. 72 (2000) 191] on failures to detect volcanic ash using the 'reverse' absorption technique provides a timely reminder of the danger that volcanic ash presents to aviation and the urgent need for some form of effective remote detection. The paper unfortunately suffers from a fundamental flaw in its methodology and numerous errors of fact and interpretation. For the moment, the 'reverse' absorption technique provides the best means of discriminating volcanic ash clouds from meteorological clouds. The purpose of our comment is not to defend any particular algorithm; rather, we point out some problems with Simpson et al.'s analysis and re-state the conditions under which the 'reverse' absorption algorithm is likely to succeed.

Defects that occur during the manufacturing process or operation of a wind turbine blade have a great influence on its life and safety. Typically, defects such as delamination, pores, wrinkles and matrix cracks are found in a blade. In this study, the detectability of pores, a type of defect that frequently occurs during manufacturing, was examined from the full-field strain distribution determined with the image correlation technique. Pore defects were artificially introduced into four-ply laminated GFRP composites with 0°/±45° fiber directions. The artificial pores were introduced in consideration of their size and location: three different-sized pores with diameters of 1, 2 and 3 mm were located on the top and bottom surfaces or embedded. By applying static loads of 0-200 MPa, the strain distributions over the specimens with pore defects were determined using the image correlation technique. It was found that pores exceeding 2 mm in diameter can be detected.

Wavelet theory is an adequate tool for analyzing single-epoch GPS deformation signals. A wavelet analysis technique for gross error detection and recovery is advanced. Criteria for choosing the wavelet function and deciding the number of Mallat decomposition levels are discussed. An effective deformation signal extraction method is proposed: wavelet noise reduction incorporating gross error recovery, which combines the results of wavelet multi-resolution gross error detection. Recognition of the time position of gross errors, and their repair, are realized. In the experiment, a compactly supported orthogonal wavelet with a short support block was more efficient than a longer one in discerning gross errors, yielding a finer analysis, and the shape of the discerned gross error was simpler than with the longer wavelet. Meanwhile, the time scale is easier to identify.
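
The detection idea can be illustrated with a single-level Haar decomposition: a gross error in the deformation series produces an outlying detail coefficient, which localizes the error in time. The series below is synthetic, and real processing uses a multi-level Mallat decomposition with a longer wavelet:

```python
# One-level Haar detail coefficients as a sketch of wavelet-based gross
# error detection: the coefficient with the largest magnitude marks the
# pair of epochs containing the gross error (synthetic data).

def haar_details(signal):
    return [(signal[2 * i] - signal[2 * i + 1]) / 2
            for i in range(len(signal) // 2)]

series = [5.0, 5.1, 5.0, 9.0, 5.1, 5.0]   # gross error at epoch 3
d = haar_details(series)
spike = max(range(len(d)), key=lambda i: abs(d[i]))
print(spike)  # 1 -> the error lies within samples 2-3
```

Once localized, the flagged epoch can be repaired (e.g. replaced by the smooth approximation), which is the gross-error recovery step that feeds into the wavelet noise reduction.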

A new modification technique of a glassy carbon electrode (GCE) with multi-walled carbon nanotubes (MWNTs) was developed. First, MWNTs were electrodeposited on the GCE at 1.70 V for 2 h. Second, using the layer-by-layer (LBL) self-assembly technique, a functional membrane of {PDDA/MWNTs}n was fabricated by alternate immersion in 1% PDDA solution and a 1 mg L-1 MWNTs dispersion. As a result, the modified membrane with five {PDDA/MWNTs} bilayers has good sensitivity, stability, anti-fouling ability and catalytic activity for thiocholine (TCh) detection: the oxidation potential on the modified GCE was decreased by almost 50% while the peak current was increased by almost 100% compared with the bare GCE. Meanwhile, it showed a low detection limit of less than 7.500 × 10-7 mol L-1 TCh.

Sulfonamides (SAs) have been the most widely used antimicrobial drugs for more than 70 years, and their residues in foodstuffs and environmental samples pose serious health hazards. For this reason, sensitive and specific methods for the quantification of these compounds in numerous matrices have been developed. This review intends to provide an updated overview of the recent trends over the past five years in sample preparation techniques and methods for detecting SAs. Examples of the sample preparation techniques, including liquid-liquid and solid-phase extraction, dispersive liquid-liquid microextraction and QuEChERS, are given. Different methods of detecting the SAs present in food and feed and in environmental, pharmaceutical and biological samples are discussed.

The problem of detecting Sunyaev-Zel'dovich (SZ) clusters in multifrequency CMB observations is investigated using a number of filtering techniques. A multifilter approach is introduced, which optimizes the detection of SZ clusters on microwave maps. An alternative method is also investigated, in which maps at different frequencies are combined in an optimal manner so that existing filtering techniques can be applied to the single combined map. The SZ profiles are approximated by the circularly symmetric template $\tau(x) = [1 + (x/r_c)^2]^{-\lambda}$, with $\lambda \simeq \tfrac{1}{2}$ and $x \equiv |\vec{x}|$, where the core radius $r_c$ and the overall amplitude of the effect are not fixed a priori, but are determined from the data. The background emission is modelled by a homogeneous and isotropic random field, characterized by a cross-power spectrum $P_{\

This technique enhances the detection capability of the autonomous Real-Nose system from MIT to detect odorants and their concentrations in noisy and transient environments. The low-cost, portable system with low power consumption operates at high speed and is suited to unmanned and remotely operated long-life applications. A deterministic mathematical model was developed to detect odorants and calculate their concentrations in noisy environments. Real data from MIT's NanoNose were examined, from which a signal conditioning technique was proposed to enable robust odorant detection for the Real-Nose system. Its sensitivity can reach sub-part-per-billion (sub-ppb) levels. A Space Invariant Independent Component Analysis (SPICA) algorithm was developed to deal with non-linear mixing (an over-complete case), and it is used as a preprocessing step to recover the original odorant sources for detection. This approach, combined with the Cascade Error Projection (CEP) neural network algorithm, was used to perform odorant identification. Signal conditioning is used to identify potential processing windows to enable robust detection for autonomous systems. So far, the software has been developed and evaluated with data sets provided by the MIT team. However, continuous data streams are made available in which even the occurrence of a new odorant is unannounced and must be noticed by the system autonomously before its unambiguous detection. The challenge for the software is to separate the potential valid signal of the odorant from the noisy transition region when the odorant is first introduced.

Field expeditions that simulate the operations of robotic planetary exploration missions at analog sites on Earth can help establish best practices and are therefore a positive contribution to the planetary exploration community. There are many sites in Iceland that possess heritage as planetary exploration analog locations and whose environmental extremes make them suitable for simulating scientific sampling and robotic operations. We conducted a planetary exploration analog mission at two recent lava fields in Iceland, Fimmvörðuháls (2010) and Eldfell (1973), using a specially developed field laboratory. We tested the utility of in-field site sampling down-selection and tiered analysis operational capabilities with three life detection and characterization techniques: fluorescence microscopy (FM), adenosine triphosphate (ATP) bioluminescence assay, and quantitative polymerase chain reaction (qPCR) assay. The study made use of multiple cycles of sample collection at multiple distance scales and field laboratory analysis using the synchronous life-detection techniques to heuristically develop the continuing sampling and analysis strategy during the expedition. Here we report the operational lessons learned and provide brief summaries of scientific data. The full scientific data report will follow separately. We found that rapid in-field analysis to determine subsequent sampling decisions is operationally feasible, and that the chosen life detection and characterization techniques are suitable for a terrestrial life-detection field mission. In-field analysis enables the rapid obtainment of scientific data and thus facilitates the collection of the most scientifically relevant samples within a single field expedition, without the need for sample relocation to external laboratories. The operational lessons learned in this study could be applied to future terrestrial field expeditions employing other analytical techniques and to future robotic planetary exploration

Detection of Stress and Pregnancy in Large Whales (Shannon Atkinson, Juneau, Alaska) … nested under the overall goal of developing techniques that can be used for the detection of stress and pregnancy in large whales. The first objective is to develop and conduct analytical and preliminary biological validations of pregnancy and stress hormones for three large whales.

To assess whether a recently developed indirect immunofluorescent stain using monoclonal antibodies was more sensitive in detecting Pneumocystis carinii than the combination of Giemsa and methenamine silver nitrate stains that has routinely been used in the laboratory, 88 lavage fluid specimens and 34 induced sputum specimens were examined. All specimens were stained by five techniques: immunofluorescence using a combination of three monoclonal antibodies (from the National Institutes of Health, USA), immunofluorescence using a single monoclonal antibody (from Dakopatts), Giemsa, methenamine silver nitrate…

Quality control is an important issue in the ceramic tile industry, as is maintaining the rate of production over time. The price of ceramic tiles also depends on purity of texture, accuracy of color, shape, and similar factors. With these criteria in mind, an automated defect detection and classification technique is proposed in this report that can ensure better quality of tiles in the manufacturing process as well as pr…

We report here several experiences with interphase cytogenetics, using the fluorescence in situ hybridization (FISH) technique, for the detection of chromosome aberrations. FISH with alpha-satellite probes specific for chromosomes 18, X, and Y was performed on interphase nuclei from the peripheral blood of patients with Edwards' syndrome, Klinefelter's syndrome, and Turner's syndrome, with healthy male and female controls, respectively. The distributions of fluorescent signals in 100 interphase nuclei were well…

Reliable and accurate health-monitoring techniques can prevent catastrophic failures of structures. Conventional damage detection methods are based on visual or localized experimental methods, very often require prior information about the vicinity of the damage or defect, demand that the structure be readily accessible for inspection, and are labor intensive. In comparison, health-monitoring techniques based on the structural dynamic response offer unique information on the failure of structures. However, systematic relations between the experimental data and the defect are not available, and the number of vibration modes needed for accurate identification of defects is frequently much higher than the number of modes that can be readily identified in an experiment. These limitations motivated us to develop an experimental-data-based detection method with systematic relationships between the experimentally identified information and the analytical or mathematical model representing the defective structure. The developed technique uses changes in vibrational curvature modes and natural frequencies. To avoid misinterpretation of the identified information, we also need to understand the effects of defects on the structural dynamic response before developing health-monitoring techniques. In this thesis we focus on two types of defects in composite structures, namely delamination and edge-notch-like defects. The effects of nonlinearity due to the presence of a defect and due to axial stretching are studied for beams with delamination. Once defects are detected in a structure, the next concern is determining their effects on the strength of the structure and its residual stiffness under dynamic loading. In this thesis, the energy release rate due to dynamic loading in a delaminated structure is studied, which provides a foundation toward determining the residual strength of the structure.
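As a concrete illustration of the curvature-based idea, modal curvature is commonly approximated from measured mode-shape displacements by central differences, and a damage index is formed from the change in curvature between intact and damaged states. The following sketch is not the thesis's implementation; the beam mode, grid, and defect model are illustrative assumptions:

```python
import numpy as np

def curvature(mode_shape, dx):
    """Approximate modal curvature by central differences."""
    w = np.asarray(mode_shape, dtype=float)
    kappa = np.zeros_like(w)
    kappa[1:-1] = (w[2:] - 2.0 * w[1:-1] + w[:-2]) / dx**2
    return kappa

def damage_index(mode_intact, mode_damaged, dx):
    """Absolute change in curvature between intact and damaged states."""
    return np.abs(curvature(mode_damaged, dx) - curvature(mode_intact, dx))

# Hypothetical first bending mode of a beam, with a local stiffness loss
# simulated as a small localized perturbation near x = 0.5.
x = np.linspace(0.0, 1.0, 101)
dx = x[1] - x[0]
intact = np.sin(np.pi * x)
damaged = intact + 0.01 * np.exp(-((x - 0.5) / 0.05) ** 2)
idx = damage_index(intact, damaged, dx)
print(x[np.argmax(idx)])  # the index peaks at the defect location
```

The index localizes the defect because a local stiffness change perturbs curvature far more strongly than it perturbs the displacement mode shape itself.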

Ground penetrating radar (GPR) is a commonly employed sensing modality for landmine detection. It has been successfully deployed in vehicular systems and is also being integrated into handheld systems. Handheld mine detection systems are typically deployed in situations where either the terrain or the mission renders a vehicular-based system less effective. Handheld systems are often more compact and maneuverable, but the quality of the sensor data may also depend more on the operator's experience with, and technique in using, the system. In particular, the sensor height with respect to the air-ground interface may be more variable than with a vehicular-based system, and this variation has the potential to adversely affect mine detection performance with the GPR sensing modality. In this work, the effects of operator technique on handheld sensor data quality are investigated, and ground alignment is explored as a potential approach to reducing variability in sensor data quality due to operator technique. Results are presented for data measured with a standard GPR/EMI handheld sensor at a standardized test site.

The current state of biosensor-based techniques for amplification-free microRNA (miRNA) detection is critically reviewed. A comparison with non-sensor and amplification-based molecular techniques (MTs), such as polymerase-based methods, is made in terms of transduction mechanism, associated protocol, and sensitivity. Challenges associated with miRNA hybridization thermodynamics, which affect assay selectivity and amplification bias, are briefly discussed. Electrochemical, electromechanical, and optical classes of miRNA biosensors are reviewed in terms of transduction mechanism, limit of detection (LOD), time-to-results (TTR), multiplexing potential, and measurement robustness. Current trends suggest that biosensor-based techniques (BTs) for miRNA assays will complement MTs thanks to amplification-free detection, LODs in the femtomolar (fM) to attomolar (aM) range, short TTR, multiplexing capability, and minimal sample-preparation requirements. Areas of future importance in miRNA BT development are presented, including a focus on achieving high measurement confidence and multiplexing capability.

Owing to powerful image-editing tools, images are open to many manipulations; their authenticity is therefore becoming questionable, especially when images carry influential power, for example in a court of law, news reports, and insurance claims. Image forensic techniques determine the integrity of images by applying various mechanisms developed in the literature. In this paper, images are analyzed for a particular type of forgery in which a region of an image is copied and pasted onto the same image to create a duplication or to conceal existing objects. To detect this copy-move forgery, images are first divided into overlapping square blocks, and DCT components are adopted as the block representations. Owing to the high-dimensional nature of the feature space, Gaussian RBF kernel PCA is applied to obtain a reduced-dimensional feature-vector representation, which also improves efficiency during feature matching. Extensive experiments are performed to evaluate the proposed method against the state of the art. The experimental results reveal that the proposed technique precisely determines copy-move forgery even when the images are contaminated with blurring, noise, and compression, and it can effectively detect multiple copy-move forgeries. Hence, the proposed technique provides a computationally efficient and reliable way of detecting copy-move forgery that increases the credibility of images in evidence-centered applications.
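A minimal sketch of the block-matching core of such a detector, in Python with NumPy. It uses plain low-frequency DCT coefficients instead of the paper's Gaussian RBF kernel PCA reduction, and the image, block size, and tolerances are illustrative assumptions:

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II basis matrix."""
    k = np.arange(n)[:, None]
    i = np.arange(n)[None, :]
    m = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    m[0, :] = np.sqrt(1.0 / n)
    return m

def block_features(img, b=8, keep=9):
    """Low-frequency 2-D DCT features for every overlapping b x b block."""
    D = dct_matrix(b)
    feats, locs = [], []
    for r in range(img.shape[0] - b + 1):
        for c in range(img.shape[1] - b + 1):
            coef = D @ img[r:r + b, c:c + b] @ D.T   # 2-D DCT-II
            feats.append(coef[:3, :3].ravel()[:keep])
            locs.append((r, c))
    return np.array(feats), locs

def find_duplicates(feats, locs, tol=1e-6, min_dist=8):
    """Sort features lexicographically and compare neighbours; report
    block pairs that match but lie far enough apart to be a copy-move."""
    order = np.lexsort(feats.T[::-1])
    pairs = []
    for a, bnd in zip(order[:-1], order[1:]):
        if np.allclose(feats[a], feats[bnd], atol=tol):
            (r1, c1), (r2, c2) = locs[a], locs[bnd]
            if abs(r1 - r2) + abs(c1 - c2) >= min_dist:
                pairs.append((locs[a], locs[bnd]))
    return pairs

# Hypothetical 32x32 image with a copied 8x8 patch.
rng = np.random.default_rng(0)
img = rng.random((32, 32))
img[20:28, 20:28] = img[0:8, 0:8]          # simulate the copy-move
pairs = find_duplicates(*block_features(img))
print(len(pairs) > 0)  # True: the copied pair is flagged
```

Sorting the feature vectors lexicographically brings near-identical blocks next to each other, so matching costs a single linear scan instead of an all-pairs comparison.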

Web spam is a technique through which irrelevant pages obtain a higher rank than relevant pages in a search engine's results. Spam pages are generally insufficient and inappropriate results for the user. Many researchers are working in this area to detect spam pages, yet no universally efficient technique has been developed so far that detects all spam pages. This paper is an effort in that direction: we propose a combined approach of content- and link-based techniques to identify spam pages. The content-based approach uses term density and a Part of Speech (POS) ratio test, while in the link-based approach we explore collaborative detection using personalized page ranking to classify a Web page as spam or non-spam. For experimental purposes, the WEBSPAM-UK2006 dataset has been used, and the results have been compared with some existing approaches. A good and promising F-measure of 75.2% demonstrates the applicability and efficiency of our approach.
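The abstract does not spell out its exact term-density formula; one plausible formulation, sketched here for illustration only, scores a page by the share of its words taken up by the single most frequent term, which keyword-stuffed spam pages tend to push high:

```python
from collections import Counter

def term_density(text):
    """Fraction of the page's words accounted for by its single most
    frequent term -- one simple content-based spam signal."""
    words = text.lower().split()
    if not words:
        return 0.0
    _, top_count = Counter(words).most_common(1)[0]
    return top_count / len(words)

# Illustrative pages (hypothetical, not from WEBSPAM-UK2006).
spam = "cheap pills cheap pills buy cheap pills cheap"
ham = "we compare content and link based spam detection techniques"
print(term_density(spam) > term_density(ham))  # True
```

A real detector would combine this score with the POS-ratio test and the link-based trust signals before classifying the page.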

In this paper we analyze the performance of time-reversal (TR) techniques in conjunction with various Ground Penetrating Radar (GPR) pre-processing methods aimed at improving the detection of subsurface targets. TR techniques were first developed for ultrasound applications and, by exploiting the invariance of the wave equation under time reversal, can yield features such as super-resolution and statistical stability. The TR method was examined here using both synthetic and actual GPR field data under four different pre-processing strategies on the raw data, namely: mean background removal, eigenvalue background removal, a sliding-window space-frequency technique, and a noise-robust spatial differentiator along the scan direction. Depending on the acquisition mode, it was possible to determine with good precision the position and depth of the studied targets and, in some cases, to differentiate the targets from nearby clutter such as localized geological anomalies. The proposed methodology has the potential...

In this paper, a technique based on image pyramids and the Bayes rule for reducing noise effects in unsupervised change detection is proposed. A Gaussian pyramid is used to process each of the two multitemporal images, constructing two image pyramids. Difference pyramid images are obtained by point-by-point subtraction between corresponding levels of the two pyramids. By resizing all difference pyramid images to the size of the original multitemporal images and taking their pixel-wise product, a map similar to the difference image is obtained (the difference image itself is generated by point-by-point subtraction between the two multitemporal images directly). Finally, the Bayes rule is used to distinguish the changed pixels. Both synthetic and real data sets are used to evaluate the performance of the proposed technique. Experimental results show that the map from the proposed technique is more robust to noise than the difference image.
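A compact sketch of the pyramid-product idea. It assumes 2x2 block averaging as a stand-in for proper Gaussian smoothing and nearest-neighbour resizing back to the original grid (both simplifications), and omits the final Bayes thresholding step:

```python
import numpy as np

def downsample(img):
    """One pyramid reduction step: 2x2 block average (a stand-in for
    Gaussian smoothing followed by decimation)."""
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    x = img[:h, :w]
    return (x[0::2, 0::2] + x[1::2, 0::2] + x[0::2, 1::2] + x[1::2, 1::2]) / 4.0

def change_map(img1, img2, levels=3):
    """Point-wise difference at each pyramid level, resized back to the
    original shape and combined by a pixel-wise product."""
    out = np.ones_like(img1, dtype=float)
    a, b = img1.astype(float), img2.astype(float)
    for _ in range(levels):
        d = np.abs(a - b)
        # nearest-neighbour resize of the level-difference to full size
        rows = (np.arange(img1.shape[0]) * d.shape[0] // img1.shape[0]).clip(0, d.shape[0] - 1)
        cols = (np.arange(img1.shape[1]) * d.shape[1] // img1.shape[1]).clip(0, d.shape[1] - 1)
        out *= d[np.ix_(rows, cols)]
        a, b = downsample(a), downsample(b)
    return out

# Hypothetical multitemporal pair: one genuinely changed 8x8 region
# plus independent speckle-like noise everywhere.
rng = np.random.default_rng(1)
t1 = rng.random((32, 32))
t2 = t1 + 0.05 * rng.standard_normal((32, 32))
t2[8:16, 8:16] += 1.0                      # genuine change
m = change_map(t1, t2)
print(m[8:16, 8:16].mean() > 10 * m[:8, :8].mean())  # True
```

The product suppresses noise because isolated noise pixels average away at coarser levels, while a genuine change survives every level of the pyramid.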

The purpose of the study was to describe a simple infrared photography technique to aid in the diagnosis and documentation of pupillary abnormalities. An unmodified 12-megapixel "point and shoot" digital camera was used to obtain binocular still photos and videos under different light conditions with near-infrared illuminating frames. The near-infrared light of 850 nm allows the capture of clear pupil images in both dim and bright light conditions. It also allows easy visualization of the pupil despite pigmented irides by augmenting the contrast between the iris and the pupil. The photos and videos obtained illustrated a variety of pupillary abnormalities using the aforementioned technique. This infrared-augmented photography technique supplements medical education, and aids in the more rapid detection, diagnosis, and documentation of a wide spectrum of pupillary abnormalities. Its portability and ease of use with minimal training complements the education of trainees and facilitates the establishment of difficult diagnoses.

BACKGROUND: Follow-up of patients with an initial negative prostate biopsy, but in whom a suspicion of prostate cancer persists, is difficult. In addition, debate exists as to the optimal technique for repeat prostate biopsy. AIMS: To assess the cancer detection rate on repeat prostate biopsy. METHODS: We reviewed patients who underwent prostate biopsy in our department in 2005 and who had ≥1 previous biopsy within the preceding 5 years. The cancer detection rate on repeat biopsy and the influence of the number of biopsy cores were recorded. RESULTS: The cancer detection rate on repeat biopsy was 15.4%, with approximately 60% detected on the first repeat biopsy but approximately 10% not confirmed until the fourth repeat biopsy. Gleason score was similar regardless of the time of diagnosis (6.1-6.5). The mean interval between first biopsy and cancer diagnosis (range 18-55 months) depended on the number of repeat procedures. There was an association between the number of biopsy cores and cancer detection. CONCLUSIONS: This study supports the practice of increasing the number of cores taken at initial and first repeat biopsy to maximise prostate cancer detection and reduce the overall number of biopsies needed.

To satisfy the requirements of high speed, real-time operation, and mass data storage for RX anomaly detection in hyperspectral image data, this paper proposes a multi-DSP parallel processing system for hyperspectral imagery based on the CPCI Express standard bus architecture. The hardware topology combines the tight coupling of four DSPs sharing a data bus and memory unit with the interconnection of link ports. On this hardware platform, by assigning a parallel processing task to each DSP in light of the spectral RX anomaly detection algorithm and the 3D structure of hyperspectral data, a four-DSP parallel processing technique is proposed that computes the mean vector and covariance matrix of the whole image by spatially partitioning it. Experimental results show that, with equivalent detection performance, the four-DSP parallel implementation of the RX anomaly detection algorithm runs four times faster than a single-DSP implementation, overcoming the internal-memory constraints that limit single-DSP processing of very large images while meeting the real-time requirements of spectral data processing.
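For reference, the global RX detector being parallelized reduces to the Mahalanobis distance of each pixel spectrum from the scene mean under the scene covariance. A single-threaded NumPy sketch (the cube dimensions and the implanted anomaly are illustrative assumptions):

```python
import numpy as np

def rx_scores(cube):
    """Global RX anomaly detector: Mahalanobis distance of every pixel
    spectrum from the scene mean under the scene covariance."""
    h, w, bands = cube.shape
    x = cube.reshape(-1, bands).astype(float)
    mu = x.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(x, rowvar=False))
    d = x - mu
    scores = np.einsum('ij,jk,ik->i', d, cov_inv, d)
    return scores.reshape(h, w)

# Hypothetical 16x16 scene with 8 bands and one implanted anomaly.
rng = np.random.default_rng(2)
cube = rng.normal(0.0, 1.0, size=(16, 16, 8))
cube[5, 7] += 6.0                           # anomalous spectrum
scores = rx_scores(cube)
peak = tuple(int(v) for v in np.unravel_index(np.argmax(scores), scores.shape))
print(peak)  # (5, 7)
```

Spatial partitioning parallelizes exactly the mean/covariance accumulation above: each DSP computes partial sums over its own image tile, and the partial statistics are merged before the per-pixel distance pass.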

Ultrasound-based elasticity imaging techniques have been developed during the past decades. Some of these techniques are based on internal radiation-force stimulation, in which a transient or dynamic radiation force is produced using single- or dual-frequency sonication. In addition, sonication and data acquisition can be implemented using combined or separate transducers. In this simulation study of lesion detection using localized harmonic motion imaging (LHMI), we used a combined phased array designed for simultaneous thermal ablation and lesion detection. In the sonication mode, a focused single-frequency amplitude-modulated sonication is used to induce harmonic motion; in the tracking mode, some of the array elements are used for pulse-echo tracking of the induced displacements. The results showed that the size of the lesion affected the induced displacement around the sonication point. The displacement tracking simulations demonstrated that these changes in the displacement distributions can be detected using only a few of the array elements in the tracking mode, but the exact size of the lesion cannot be detected accurately. The simulations also showed that two lesions with a radius of 2.5 mm can be distinguished if the distance between them is at least 2.5 mm.

Landmine clearance is an ongoing problem that currently affects millions of people around the world. This study evaluates the effectiveness of ground penetrating radar (GPR) in demining and unexploded ordnance detection using 2.3-GHz and 1-GHz high-frequency antennas. An automated detection tool based on machine learning techniques is also presented, with the aim of automatically detecting underground explosive artifacts. A GPR survey was conducted on a designed scenario that included the items most commonly buried in historic battlefields, such as mines, projectiles, and mortar grenades. The buried targets were identified at both frequencies, although the higher vertical resolution provided by the 2.3-GHz antenna allowed better recognition of the reflection patterns. The targets were also detected automatically using machine learning techniques: neural network and logistic regression algorithms were shown to be able to discriminate between potential targets and clutter. The neural network had the most success, with accuracies of 89% and 92% for the 1-GHz and 2.3-GHz antennas, respectively.

This paper addresses the fault diagnosis problem of uncertain systems in the context of the Bond Graph modelling technique. The main objective is to enhance the fault detection step based on Interval-valued Analytical Redundancy Relations (I-ARRs) in order to overcome problems related to false alarms, missed alarms, and robustness. These I-ARRs are a set of fault indicators that generate interval bounds called thresholds; a fault is detected once the nominal residuals (the point-valued part of the I-ARRs) exceed the thresholds. However, the existing fault detection method is limited to parametric faults and presents various limitations with regard to the estimation of measurement-signal derivatives, to which I-ARRs are sensitive. The novelties and scientific interest of the proposed methodology are: (1) improving the accuracy of measurement-derivative estimation by using a dedicated sliding-mode differentiator proposed in this work; and (2) suitably integrating the Fourier-Motzkin Elimination (FME) technique within I-ARR-based diagnosis so that measurement faults can be detected successfully. The latter provides interval bounds on the derivatives, which are included in the thresholds. The proposed methodology is studied under various scenarios (parametric and measurement faults) via simulations of a mechatronic torsion-bar system.

Objectives: To describe the microbial profiles of peri-implant diseases and the main detection methods. Material and Methods: A literature search was performed in MEDLINE via the PubMed database to identify studies on the microbial composition of peri-implant surfaces in humans published in the last 5 years. Studies had to have a clear implant-status definition for health, peri-implant mucositis and/or peri-implantitis, and to specifically study the microbial composition of the peri-implant sulcus. Results: A total of 194 studies were screened and 47 included. Peri-implant sites are reported to be different microbial ecosystems compared with periodontal sites. However, differences between periodontal and peri-implant health and disease are not consistent across all studies, possibly due to the bias introduced by the microbial detection technique. New, non-species-oriented methods are being used to find 'unexpected' microbiota not previously described in these settings. Conclusions: The microbial profile of peri-implant diseases usually includes classic periodontopathogens. However, correlation between studies is difficult, particularly because of the use of different detection methods. New metagenomic techniques should be promoted in future studies to avoid detection bias.

In this study, the performance of stochastic optimization techniques in the finite element model updating approach was investigated for damage detection in a quarter-scale, two-span reinforced concrete bridge system tested experimentally at the University of Nevada, Reno. The damage sequence in the structure was induced by a series of progressively increasing excitations in the transverse direction of the specimen. Intermediate non-destructive white-noise excitations and response measurements were used for system identification and damage detection. It is shown that, when evaluated together with the strain gauge measurements and visual inspection results, the applied finite element model updating algorithm could accurately detect, localize, and quantify the damage in the tested bridge columns of this complex nonlinear system throughout the different phases of the experiment.

Lentic water bodies such as reservoirs undergo eutrophication processes originating mainly from human activities (e.g. agriculture and animal farming). This influx of nutrients into aquatic ecosystems can promote blooms of potentially toxic cyanobacteria. The purpose of this work is to detect the presence of cyanobacterial strains in water samples using molecular techniques, to help with the preventive management of reservoirs dedicated to water purification. We used two molecular techniques to detect genes involved in the synthesis of hepatotoxic microcystins by potentially toxic cyanobacterial strains, and to evaluate the molecular diversity of cyanobacteria in water samples from two high-mountain reservoirs used to supply drinking water to the metropolitan area of Medellín, Colombia. Between 2010 and 2011, 12 water samples were collected, and DNA extraction together with PCR and DGGE analyses were carried out. We amplified 22 sequences of 250-300 bp from the genes mcyA and mcyE, and these sequences were related to several strains and cyanobacterial genera in the NCBI GenBank databases. Moreover, amplifications of the 16S small ribosomal RNA subunit (16S rRNA) of 400-800 bp were also performed for use with the DGGE technique. The DGGE amplification products were run on polyacrylamide gels with subsequent denaturing electrophoresis, and the scanned images of the gel bands were analysed with the GelCompar II software. For the Riogrande II and La Fe reservoirs we found 35 and 30 different DGGE bands, respectively, as a measure of molecular diversity in these artificial ecosystems. We demonstrate the utility of two molecular techniques for the detection of genes associated with toxicity and for assessing the molecular diversity of cyanobacteria in reservoirs destined for drinking water in urban centers. We strongly recommend continuing with periodic molecular studies in these ecosystems combined with limnological and

A travelling thermal-wave technique employing optical excitation and pyroelectric detection of thermal waves propagating along a material film or coating on a substrate is described. The method enables direct measurement of thermal diffusivity. The technique involves measuring the phase lag undergone by an optically excited thermal wave as it propagates along the coating. The setup has been automated for convenient and fast data acquisition and analysis. The technique has been applied to measuring the thermal diffusivity of a commercial paint sample coated on glass and copper substrates. The thermal diffusivity of the coating is found to be independent of the thermal conductivity of the substrate. Thermal diffusivity increases exponentially with coating thickness, reaching a constant value above a characteristic thickness. Measurements have been carried out on a few other samples with wide variations in thermal diffusivity, and the results have been compared with available reports or with results obtained by other techniques. The analyses show that the technique allows measurement of the thermal diffusivity of coatings and films with uncertainties better than ±2.5%.
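The phase-lag measurement maps to diffusivity through the thermal-wave dispersion relation: for a wave modulated at frequency f, the phase lag grows linearly with distance as phi(x) = x * sqrt(pi*f/alpha), so alpha follows from the slope of a phase-versus-distance fit. A sketch with synthetic data (the numbers are illustrative, not the paper's measurements):

```python
import math

def diffusivity_from_phase(distances_m, phases_rad, freq_hz):
    """Least-squares slope of phase lag vs. distance gives
    alpha = pi * f / slope**2 for a 1-D travelling thermal wave."""
    n = len(distances_m)
    sx = sum(distances_m)
    sy = sum(phases_rad)
    sxx = sum(x * x for x in distances_m)
    sxy = sum(x * y for x, y in zip(distances_m, phases_rad))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return math.pi * freq_hz / slope**2

# Synthetic check: alpha = 1.0e-7 m^2/s (order of magnitude typical of
# a polymer coating), modulation frequency f = 1 Hz.
alpha_true = 1.0e-7
f = 1.0
xs = [i * 1.0e-4 for i in range(1, 6)]           # 0.1 mm steps
phis = [x * math.sqrt(math.pi * f / alpha_true) for x in xs]
print(abs(diffusivity_from_phase(xs, phis, f) - alpha_true) < 1e-12)  # True
```

Fitting the slope over several detector positions, rather than using a single phase reading, is what allows the quoted sub-±2.5% uncertainties despite noise in individual phase measurements.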

A mobile ad hoc network (MANET) is an autonomous, self-configuring, infrastructure-less wireless network. MANETs are vulnerable to many routing security threats owing to the unreliability of the nodes, which are heavily involved in the routing process. In this paper, a new technique is proposed to enhance the security of one of the most popular MANET routing protocols, Ad hoc On-demand Distance Vector (AODV), with minimum routing overhead and a high packet delivery ratio. The proposed technique detects and removes black-hole, gray-hole, and cooperative black-hole AODV attacks using a mobile backbone network constructed from randomly moving regular MANET nodes based on their trust value, location, and power. The backbone network monitors regular nodes, as well as its own members, to periodically estimate monitoring trust values that represent the reliability of each node in the network. A drop in the monitoring trust value of any node is used as a clue to its malicious behavior. The backbone network also tries to bait malicious nodes into replying to a route request for a fake destination address. The proposed technique uses AODV control packets to exchange its control information, which greatly reduces the overhead. Simulation results show that the proposed technique is more secure than AODV and other recently introduced techniques.

Early detection of breast cancer using image processing suffers from limited accuracy in many automated medical tools. To improve accuracy, research continues on the different phases: segmentation, feature extraction, detection, and classification. This paper presents a hybrid, automated image-processing framework for breast cancer detection consisting of four main steps: image preprocessing, image segmentation, feature extraction, and classification. For preprocessing, both Laplacian and averaging filters are used for smoothing and noise reduction, applied to 256 x 256 grayscale images; a dedicated preprocessing algorithm is designed with the goal of improving accuracy. The output of the preprocessing phase feeds the segmentation phase, which uses a proposed modified region-growing technique that overcomes the limitations of orientation as well as intensity. For feature extraction, we propose a combination of feature types: texture features, gradient features, and 2D-DWT features with higher-order statistics (HOS); such a hybrid feature set helps improve detection accuracy. For the final phase, we propose an efficient feed-forward neural network (FFNN). A comparative study between the existing 2D-DWT feature extraction and the proposed HOS-2D-DWT-based feature extraction methods is presented.
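For orientation, a basic intensity-based region growing is sketched below; the paper's modified version, which also addresses orientation limitations, is not described in enough detail to reproduce, so this shows only the standard technique on illustrative data:

```python
import numpy as np
from collections import deque

def region_grow(img, seed, tol=0.1):
    """Basic region growing: absorb 4-connected neighbours whose
    intensity stays within `tol` of the running region mean."""
    h, w = img.shape
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    total, count = float(img[seed]), 1
    q = deque([seed])
    while q:
        r, c = q.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < h and 0 <= nc < w and not mask[nr, nc]:
                if abs(img[nr, nc] - total / count) <= tol:
                    mask[nr, nc] = True
                    total += float(img[nr, nc])
                    count += 1
                    q.append((nr, nc))
    return mask

# Hypothetical 16x16 patch: a bright 6x6 "lesion" on a dark background.
img = np.full((16, 16), 0.2)
img[4:10, 4:10] = 0.8
mask = region_grow(img, (6, 6), tol=0.1)
print(int(mask.sum()))  # 36: exactly the 6x6 bright square
```

Comparing against the running region mean, rather than the fixed seed intensity, is one simple way to make growth tolerant of gradual intensity drift across the region.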

Digital watermarking has been proposed as a means of copyright protection for multimedia data. Many existing watermarking schemes have focused on robust means of marking an image invisibly without really addressing the ends those schemes serve. This paper first discusses some scenarios in which many current watermarking schemes fail to resolve the rightful ownership of an image. The key problems are then identified, and some crucial requirements for valid invisible-watermark detection are discussed. In particular, we show that, for the particular application of resolving rightful ownership using invisible watermarks, it may be crucial to require that the original image not be directly involved in the watermark detection process. A general framework for validly detecting invisible watermarks is then proposed. Requirements on the claimed signatures/watermarks used for detection are discussed to prevent the existence of any counterfeit scheme. The optimal detection strategy within the framework is derived. We show the effectiveness of this technique with some visual-model-based watermark encoding schemes.

The emergence of wireless sensor networks (WSNs) has motivated a paradigm shift in patient monitoring and disease control. Epilepsy management is one of the areas that could especially benefit from the use of WSNs. With miniaturized wireless electroencephalogram (EEG) sensors, it is possible to perform ambulatory EEG recording and real-time seizure detection outside clinical settings. One major consideration in using such a wireless EEG-based system is the stringent battery-energy constraint on the sensor side, so solutions that reduce power consumption there are highly desired. The conventional approach incurs high power consumption, as it transmits the entire EEG signal wirelessly to an external data server where seizure detection is carried out. This paper examines the use of data reduction techniques to cut the amount of data that has to be transmitted and, thereby, the required power consumption on the sensor side. Two data reduction approaches are examined: compressive-sensing-based EEG compression and low-complexity feature extraction. Their performance is evaluated in terms of seizure detection effectiveness and power consumption. Experimental results show that by performing low-complexity feature extraction at the sensor and transmitting only the features pertinent to seizure detection, a considerable overall power saving is achieved: the battery life of the system is increased 14-fold, while the same seizure detection rate as the conventional approach (95%) is maintained.

…or an improvement in contrast over conventional SAFT-reconstructed images. This report documents our efforts on four fronts: 1) a comparative study between traditional SAFT and FBD SAFT for concrete specimens with and without Alkali-Silica Reaction (ASR) damage; 2) improvement of our Model-Based Iterative Reconstruction (MBIR) for thick reinforced concrete [5]; 3) development of a universal framework for sharing, reconstruction, and visualization of ultrasound NDE datasets; and 4) application of machine learning techniques for automated detection of ASR inside concrete. Our comparative study between FBD and traditional SAFT reconstruction images shows a clear difference between images of ASR and non-ASR specimens; in particular, the left first harmonic shows increased contrast and sensitivity to ASR damage. For MBIR, we show the superiority of model-based techniques over delay-and-sum techniques such as SAFT; improvements include the elimination of artifacts caused by direct-arrival signals and increased contrast and signal-to-noise ratio. For the universal framework, we document a data-storage format based on HDF5 and propose a modular graphical user interface (GUI) for easy customization of data conversion, reconstruction, and visualization routines. Finally, two techniques for automated ASR detection are presented: the first analyzes the frequency content using a Hilbert Transform Indicator (HTI), and the second employs artificial neural network (ANN) techniques to train on ultrasound data and classify it into ASR-damaged and non-ASR-damaged classes. The ANN technique shows great potential, with classification accuracy above 95%. These approaches are extensible to the detection of other defects and damage in thick, reinforced concrete.

Breast cancer is the most common form of cancer in women. An intelligent computer-aided diagnosis system can help radiologists detect and diagnose microcalcification patterns earlier and faster than typical screening programs. In this paper, we present a system based on a Gabor filter enhancement technique and feature extraction using texture-based segmentation and a SOM (Self-Organizing Map), a form of Artificial Neural Network (ANN), used to analyse the extracted texture features. The SOM determines which texture features have the ability to classify benign, malignant and normal cases. Watershed segmentation is used to separate the cancerous region from the non-cancerous region. We investigated and analysed a number of feature extraction techniques and found that a combination of ten features (Correlation, Cluster Prominence, Energy, Entropy, Homogeneity, Difference Variance, Difference Entropy, Information Measure, and Normalized) is calculated. These features give the distribution of tonality information and were found to be the best combination to distinguish a benign microcalcification pattern from one that is malignant or normal. The system was developed on a Windows platform. It is an easy-to-use intelligent system that gives the user options to diagnose, detect, enlarge, zoom, and measure distances of areas in digital mammograms. Further, using a linear filtering technique, the texture features are used as masks convolved with the segmented image. The tumour is detected using the above method, and watershed segmentation yields a fair segmentation. The artificial neural network with unsupervised learning is combined with the texture-based approach. The accuracy and positive predictive value of each algorithm were used as the evaluation indicators. 121 records were acquired from breast cancer patients in the MIAS database. The results revealed that the accuracies of texture

Adrenocortical autoantibodies (ACA), present in 60-80% of patients with idiopathic Addison's disease, are conventionally detected by indirect immunofluorescence (IIF) on frozen sections of adrenal glands. The large-scale use of IIF is limited in part by the need for a fluorescence microscope and by the fact that histological sections cannot be stored for long periods of time. To circumvent these restrictions we developed a novel peroxidase-labelled protein A (PLPA) technique for the detection of ACA in patients with Addison's disease and compared the results with those obtained with the classical IIF assay. We studied serum samples from 90 healthy control subjects and 22 patients with Addison's disease, who had been clinically classified into two groups: idiopathic (N = 13) and granulomatous (N = 9). ACA-PLPA were detected in 10/22 (45%) patients: 9/13 (69%) with the idiopathic form and 1/9 (11%) with the granulomatous form, whereas ACA-IIF were detected in 11/22 (50%) patients: 10/13 (77%) with the idiopathic form and 1/9 (11%) with the granulomatous form. Twelve of the 13 idiopathic Addisonians (92%) were positive for either ACA-PLPA or ACA-IIF, but only 7 were positive by both methods. In contrast, none of the 90 healthy subjects was found to be positive for ACA. Thus, our study shows that the PLPA-based technique is useful and has technical advantages over the IIF method (it does not require a fluorescence microscope and it permits section storage for long periods of time). However, since it is only 60% concordant with the ACA-IIF method, it should be considered complementary to, rather than an alternative to, IIF for the detection of ACA in human sera.

There appears to be a limited but growing body of research on the sequential analysis/treatment of multiple types of evidence. The development of an integrated forensic approach is necessary to maximise evidence recovery and to ensure that a particular treatment is not detrimental to other types of evidence. This study aims to assess the effect of latent and blood mark enhancement techniques (e.g. fluorescence, ninhydrin, acid violet 17, black iron-oxide powder suspension) on the subsequent detection of saliva. Saliva detection was performed by means of a presumptive test (Phadebas®) in addition to analysis by a rapid stain identification (RSID) kit test and confirmatory DNA testing. Additional variables included a saliva depletion series and a number of different substrates with varying porosities, as well as different ageing periods. Examination and photography under white light and fluorescence were carried out prior to and after chemical enhancement. All enhancement techniques (except Bluestar® Forensic Magnum luminol) employed in this study resulted in an improved visualisation of the saliva stains, although the inherent fluorescence of saliva was sometimes blocked after chemical treatment. The use of protein stains was, in general, detrimental to the detection of saliva. Positive results were less pronounced after the use of black iron-oxide powder suspension, cyanoacrylate fuming followed by BY40, and ninhydrin when compared to the respective positive controls. The application of Bluestar® Forensic Magnum luminol and black magnetic powder proved to be the least detrimental, with no significant difference between the test results and the positive controls. The use of non-destructive fluorescence examination provided good visualisation; however, only the first few marks in the depletion series were observed. Of the samples selected for DNA analysis, only depletion 1 samples contained sufficient DNA for further processing using standard methodology. The 28-day

Cow's milk proteins (CMPs) are the best-characterized food allergens. The aim of this study was to investigate cow's milk allergens, together with other minor protein components, in the colostrum of mothers of term and preterm newborns using proteomic techniques, which are more sensitive than techniques used in the past. Sixty-two term and 11 preterm colostrum samples were collected and subjected to a treatment able to increase the concentration of the most dilute proteins while simultaneously reducing the concentration of the most abundant ones (ProteoMiner treatment), and were subsequently carried through the steps of the proteomic workflow. The most relevant finding in this study was the detection of intact bovine alpha-S1-casein in human colostrum; bovine alpha-S1-casein could therefore be considered the cow's milk allergen that is readily secreted in human milk and could be a cause of sensitization to cow's milk in exclusively breastfed, predisposed infants. Another interesting result was the detection, at very low concentrations, of proteins not previously described in human milk (galectin-7, different isoforms of the 14-3-3 protein and the serum amyloid P-component), probably involved in the regulation of normal cell growth, in pro-apoptotic function and in the regulation of tissue homeostasis. Further investigations are needed to understand whether these families of proteins have specific biological activity in human milk.

The aim of this study is to detect moving objects in shark fish videos by removing the image background; the core of the approach is foreground detection. Many existing techniques ignore the fact that background images consist of different objects whose conditions may change over time. In this study, a motion-picture identification procedure is proposed for real-time video frames by comparing three key classes of motion detection methods: background removal (subtraction), temporal differencing, and optical flow. A structured hierarchical background procedure is proposed based on segmenting the background image into objects. The background image is divided into several regions by a Support Vector Machine (SVM), and a structured hierarchical model is then built from a region model and a pixel model. In the region model, the image object is extracted from the histogram of a specific region, in a manner similar to a Gaussian mixture model. In the pixel model, histograms of gradients of the pixels in each region are used, based on the co-occurrence of object variations. A silhouette detection procedure is also proposed and used. The experimental results are cross-validated against a video database to illustrate the method's efficiency, from static to dynamic scenes, by comparison with established motion detection methods, chiefly temporal differencing and optical flow. Based on the outputs, a motion detection procedure for real-time video frames can be created that is cost effective, shows a good rate of accuracy, is simple and of low complexity, and is well adapted to several
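Two of the three compared method classes can be sketched in a few lines. The frame contents and the difference threshold below are illustrative assumptions; real systems operate on full-resolution video and adaptive background models.

```python
# Minimal sketch of background subtraction (against a fixed background frame)
# and temporal differencing (between consecutive frames). Frames are toy
# 8x8 grayscale grids; threshold is a hypothetical choice.

def diff_mask(frame_a, frame_b, threshold=10):
    """Binary motion mask: 1 where pixels differ by more than threshold."""
    return [[1 if abs(a - b) > threshold else 0
             for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(frame_a, frame_b)]

def moving_pixels(mask):
    """Total number of pixels flagged as moving."""
    return sum(sum(row) for row in mask)

background = [[0] * 8 for _ in range(8)]        # static empty scene
frame1 = [row[:] for row in background]         # previous frame (still empty)
frame2 = [row[:] for row in background]
for x in (3, 4):                                # a small bright object enters
    frame2[2][x] = 200

bg_sub = diff_mask(background, frame2)          # background subtraction mask
temporal = diff_mask(frame1, frame2)            # temporal differencing mask
```

Here both masks agree because the background is perfectly static; the methods diverge once the background itself drifts, which is what the hierarchical region/pixel model above is designed to absorb.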

The western blotting technique for transfer and detection of proteins, named following the discovery of Southern and northern blotting for DNA and RNA blotting, respectively, has traditionally relied on the use of X-ray films to capture chemiluminescence. Recent advancements use super-cooled charge-coupled device (CCD) cameras to capture both chemiluminescence and fluorescence images, which exhibit a greater dynamic range compared to traditional X-ray film. Chemiluminescence detected by a CCD camera records photons and displays an image based on the amount of light generated as a result of a dynamic chemical reaction. Fluorescent detection with a CCD camera, on the other hand, is measured in a static state. Despite this advantage, researchers continue to widely use chemiluminescent detection methods due to the generally poor performance of fluorophores in the visible spectrum. Infrared imaging systems offer a solution to the dynamic reactions of chemiluminescence and the poor performance of fluorophores detected in the visible spectrum, by imaging fluorophores in the infrared spectrum. Infrared imaging is static, has a wide linear range, high sensitivity, and reduced autofluorescence and light scatter. A distinct advantage of infrared imaging is the ability to detect two target proteins simultaneously on the same blot, which increases the accuracy of quantification and comparison while minimizing the need for stripping and reprobing. Here, we compare the methodology for chemiluminescent (UVP BioChemi) and infrared (UVP Odyssey) detection of salivary total and phosphorylated fetuin-A, a multifunctional protein associated with cardio-metabolic risk, and discuss the advantages and disadvantages of these methodologies.

Development of a miniaturized biosensor system that can be used for rapid detection and counting of microorganisms in food or water samples is described. The developed microsystem employs a highly sensitive impedimetric array of biosensors to monitor the growth of bacterial colonies that are dispersed across an agar growth medium. To use the system, a sample containing the bacteria is cultured above the agar layer. Using a multiplexing network, the electrical properties of the medium at different locations are continuously measured, recorded, and compared against a baseline signal. Variations of signals from different biosensors are used to reveal the presence of bacteria in the sample, as well as the locations of bacterial colonies across the biochip. This technique forms the basis for a label-free bacterial detection for rapid analysis of food samples, reducing the detection time by at least a factor of four compared to the current required incubation times of 24 to 72 hours for plate count techniques. The developed microsystem has the potential for miniaturization to a stage where it could be deployed for rapid analysis of food samples at commercial scale at laboratories, food processing facilities, and retailers.

This paper deals with improved ECG signal analysis using wavelet transform techniques and subsequent modified feature extraction for arrhythmia detection based on a neuro-fuzzy technique. The improvement is based on a suitable choice of features for evaluating and predicting life-threatening ventricular arrhythmia. Analyzing electrocardiographic (ECG) signals includes not only inspection of the P, QRS and T waves, but also the causal relations between them and the temporal sequences they build within long observation periods. The wavelet transform is used for effective feature extraction, and an Adaptive Neuro-Fuzzy Inference System (ANFIS) is used as the classifier model. In the first step, QRS complexes are detected. Then, each QRS is delineated by detecting and identifying the peaks of the individual waves, as well as the complex onset and end. Finally, the P and T wave peaks, onsets and ends are determined. We evaluated the algorithm on several manually annotated databases developed for validation purposes, such as the MIT-BIH Arrhythmia and CSE databases. Features based on the ECG waveform shape and heartbeat intervals are used as inputs to the classifiers. The performance of the ANFIS model is evaluated in terms of training performance and classification accuracy, and the results confirm that the proposed ANFIS model has potential for classifying ECG signals. Cross-validation is used to measure classifier performance. A testing classification accuracy of 95.13% is achieved, which is a significant improvement.
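The first step described above, QRS complex detection, can be sketched as follows. This is a hedged stand-in using a squared-derivative threshold rather than the paper's wavelet/ANFIS pipeline; the signal, threshold, and peak rule are illustrative assumptions.

```python
# Crude R-peak candidate detector: the squared first difference emphasizes
# the steep slopes of QRS complexes, and local maxima above a threshold are
# taken as beat locations. Not the paper's method; a simplified illustration.

def detect_qrs(ecg, threshold):
    """Return sample indices where the squared first difference has a
    local maximum exceeding the threshold."""
    energy = [(ecg[i] - ecg[i - 1]) ** 2 for i in range(1, len(ecg))]
    peaks = []
    for i in range(1, len(energy) - 1):
        # '>=' on the left, '>' on the right breaks ties on flat plateaus.
        if energy[i] > threshold and energy[i] >= energy[i - 1] \
                and energy[i] > energy[i + 1]:
            peaks.append(i)
    return peaks

# Synthetic trace: flat baseline with sharp spikes standing in for R waves.
ecg = [0.0] * 100
for r in (20, 50, 80):
    ecg[r] = 1.0
beats = detect_qrs(ecg, threshold=0.5)
```

Once beat locations are known, the delineation of wave onsets and ends proceeds around each detected index, as the abstract outlines.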

The detection of landmines by using available technologies is a time consuming, expensive and extremely dangerous job, so that there is a need for a technological breakthrough in this field. Atomic and nuclear physics based sensors might offer new possibilities in de-mining. Among the available nuclear techniques, the neutron backscattering technique (NBT), based on the detection of the produced thermal neutrons, is thought to be the most promising for field applications. We discuss here two limitations of NBT, being related to the soil moisture. First, the critical value of the soil moisture, reached when the density of the hydrogen atoms in the landmine is equal to that in the background soil, defines a condition for which the detection is not possible. Critical values are small for some of the landmine types, thus suggesting the application of the method to arid countries, where the soil moisture is lower than 10%. Furthermore, small-scale variations of the soil moisture content, experimentally determined for different soil types, are found to be capable of generating false positive readings. To avoid this problem, the integration of the NBT with a second sensor, as the metal detector, is proposed.

Introduction: Diagnosing the necessity of cavity preparation and restoration in demineralized proximal areas is always a challenge in restorative treatment planning. The purpose of this study was to assess the performance of the laser fluorescence (LF) technique in the detection of proximal cavities. Materials & Methods: In this clinical trial, 44 proximal surfaces in 38 dental students were evaluated. The selected patients had radiolucent proximal lesions restricted to the inner half of enamel or the outer third of dentine in bitewing radiographs (BW). A DIAGNOdent pen (LF pen) device was used to determine the presence or absence of caries cavities in suspected proximal surfaces. Orthodontic elastic separators were then placed in the contact areas to provide enough space for direct visual and tactile examination. The sensitivity, specificity and accuracy of the laser fluorescence technique were calculated against this reference standard. The ROC curve was drawn and the best cut-off for determining the presence or absence of proximal cavities was identified. Results: Using the DIAGNOdent pen, the optimal cut-off for detecting proximal cavities was 18. The sensitivity, specificity and accuracy of the DIAGNOdent pen for diagnosing proximal caries cavities were 100 per cent, 97.3 per cent and 97.7 per cent, respectively. Conclusion: Due to the high diagnostic accuracy of the DIAGNOdent pen in detecting proximal caries cavities, it can be used as a valuable supplement in restorative treatment planning.

The addition of benzoyl peroxide (BPO) to wheat flour has been prohibited by the relevant government departments since May 1, 2011, so it is of great importance to detect the BPO additive content in wheat flour quickly and accurately. Part of the BPO added to wheat flour is reduced to benzoic acid, which makes it complex to determine the original BPO additive amount. The objective of the present research is to investigate the potential of NIR diffuse reflectance spectroscopy for measuring the original amount of BPO added to wheat flour. A total of 133 wheat flour samples were prepared by adding different amounts of BPO to pure wheat flour. Spectral data were obtained with an NIR spectrometer and then denoised by wavelet transform. Ninety-seven samples were taken as the calibration set and the other 36 samples as the prediction set. Partial least squares regression (PLSR) was applied to establish the calibration model between the original BPO contents and the spectral data. The determination coefficient of the model for the calibration set is 0.8901, with a root mean squared error of calibration (RMSEC) of 40.85 mg/kg. The determination coefficient for the prediction set is 0.8865, with a root mean squared error of prediction (RMSEP) of 44.69 mg/kg. The results indicate that it is feasible to detect the BPO content added to wheat flour by the NIR diffuse reflectance spectroscopy technique, and that this technique has the potential to measure other additives in food.
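The calibration/prediction workflow and the RMSEC/RMSEP figures of merit can be sketched as below. For brevity this uses ordinary least squares on a single synthetic absorbance value in place of PLSR on full NIR spectra, and the data are idealized; only the structure of the workflow mirrors the abstract.

```python
# Sketch of the calibration-set / prediction-set workflow with RMSEC and
# RMSEP. OLS on one synthetic variable stands in for PLSR on spectra
# (an assumption made for brevity; real NIR data are multivariate and noisy).

def fit_line(x, y):
    """Least-squares slope and intercept for y ~ a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
            sum((a - mx) ** 2 for a in x)
    return slope, my - slope * mx

def rmse(y_true, y_pred):
    """Root mean squared error between reference and predicted values."""
    return (sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)) ** 0.5

# Synthetic data: absorbance proportional to BPO content (mg/kg).
contents = [50.0 * i for i in range(1, 21)]            # 50 .. 1000 mg/kg
absorb = [0.002 * c + 0.1 for c in contents]           # ideal linear response

cal_x, cal_y = absorb[:14], contents[:14]              # calibration set
pred_x, pred_y = absorb[14:], contents[14:]            # prediction set

slope, intercept = fit_line(cal_x, cal_y)
rmsec = rmse(cal_y, [slope * a + intercept for a in cal_x])
rmsep = rmse(pred_y, [slope * a + intercept for a in pred_x])
```

With noise-free synthetic data both errors are essentially zero; on real spectra the gap between RMSEC and RMSEP (40.85 vs. 44.69 mg/kg in the abstract) indicates how well the calibration generalizes.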

With recent technology, the popularity of and demand for image processing is increasing due to its immense number of applications in various fields. Most of these are related to biometric science, such as face recognition, fingerprint recognition, iris scanning, and speech recognition. Among them, face detection is a very powerful tool for video surveillance, human-computer interfaces, face recognition, and image database management. There are a number of works on this subject. Face recognition is a rapidly evolving technology that has been widely used in forensics, such as criminal identification, secured access, and prison security. In this paper we survey research and technical papers in this field and list different techniques, such as linear discriminant analysis, Viola-Jones classification with AdaBoost learning, and curvature analysis, and discuss their advantages and disadvantages. We also describe some of the detection and recognition algorithms and mention some application domains along with different challenges in this field. We propose a classification of detection techniques and discuss the recognition methods as well.

Geosmin and 2-methylisoborneol are secondary metabolites expressed by a variety of organisms that are responsible for off-flavors in public water supplies, aquaculture, and a host of other important products. Hence, there is continuing research into the causes for their expression and methods to mitigate it, which require sensitive and accurate detection methods. In recent years, several new techniques for collecting and concentrating volatile and semi-volatile compounds have been automated and commercialized, making them available for use in most laboratories. In this study, we compared solid-phase microextraction (SPME) and membrane-assisted solvent extraction (MASE) for the detection of 2-methylisoborneol and geosmin in aqueous samples. SPME is the most sensitive of these techniques with a limit of detection of 25 parts-per-trillion for 2-methylisoborneol and 10 parts-per-trillion for geosmin but with a large relative standard deviation. MASE is less sensitive, but provides a greater level of precision, as well as the ability for multiple injections from the same sample.

Different FEC techniques, such as convolutional codes, RS codes and turbo codes, are used to improve the performance of communication systems. In this paper, we study the performance of the MAP, Log-MAP, Max-Log-MAP and APP decoding algorithms for turbo codes, in terms of the a priori information, a posteriori information, extrinsic information and channel reliability. We also analyse how important an accurate estimate of the channel reliability factor is to the good performance of the iterative turbo decoder. The simulations use a parallel concatenation of two recursive systematic convolutional codes with a block interleaver at the transmitter, an AWGN channel, and iterative decoding with the different algorithms at the receiver side. A comparison of these decoding techniques in terms of BER performance is discussed in the results section.

Digital image correlation (DIC) is a technique developed for monitoring the surface deformation/displacement of an object under loading conditions. This method is further refined to make it capable of handling discontinuities on the surface of the sample. A damage zone refers to a surface area that fractures and opens in the course of loading. In this study, an algorithm is presented to automatically detect multiple damage zones in the deformed image. The algorithm identifies the pixels located inside these zones and eliminates them from the FEM-DIC process. The proposed algorithm is successfully applied to several damaged samples to estimate the displacement fields of an object under loading conditions. This study shows that the resulting displacement fields represent the damage conditions reasonably well compared to the regular FEM-DIC technique that does not consider the damage zones.

This paper presents a feasible method for the rapid detection of chromosome 18 in the interphase nuclei of uncultured amniocytes by using our modified primed in situ labeling (PRINS) technique. A total of 262 independent, uncultured amniotic fluid samples were analysed in a blind fashion before the karyotype was available. In addition, 62 samples were examined by fluorescence in situ hybridization (FISH) for comparison. In more than 95% of the samples, PRINS reactions with the primer 18cen were successfully induced. Two samples were properly identified and correctly scored as trisomy 18. The PRINS reaction could be performed automatically in less than one hour with a programmable thermocycler. Our studies showed that the PRINS technique is simple, rapid and cost-effective. It is as sensitive and specific as FISH, can enhance the accuracy of standard cytogenetic analysis, and allows identification of chromosome 18 aneuploidies in uncultured amniocytes in significantly less time.

This paper is based on research on human brain tumours and uses MRI to capture the images. In the proposed work, the brain tumour area is calculated to define the stage, or level of seriousness, of the tumour. Image processing techniques are used for the tumour area calculation, and neural network algorithms for the tumour position calculation. As a further advancement, classification of the tumour based on a few parameters is also planned. The proposed work is divided into the following modules: Module 1: image pre-processing; Module 2: feature extraction and segmentation using the K-means and fuzzy C-means algorithms; Module 3: tumour area calculation and stage detection; Module 4: classification and position calculation of the tumour using a neural network.

The use of the gas-filled magnet technique for the detection of intermediate-mass (A ≈ 20-40) recoil nuclei produced in (p,α) reactions in inverse kinematics has been investigated. Through a series of calibration measurements with ²⁷Al, ²⁸,²⁹Si and ³³S beams, the optimum parameterization for calculating the average charge-state distribution in a gas-filled magnet has been determined. By measuring the magnetic rigidity, the time-of-flight and the differential energy loss of the particles at the focal plane of a gas-filled Enge split-pole spectrograph, it was possible to separate and identify the (p,α) reaction products from elastically scattered particles at very small scattering angles. This technique was then tested by measuring the p(³³S,³⁰P)α and p(³⁷K,³⁴Cl)α reactions.

Having motivation to learn is a requirement for a successful learning process, and it needs to be maintained properly. This study aims to measure learning motivation, especially in the process of electronic learning (e-learning). A data mining approach was chosen as the research method. For the testing process, a comparative study of the accuracy of different testing techniques was conducted, involving cross-validation and percentage split. The best accuracy was generated by the J48 algorithm with a percentage-split technique, reaching 92.19%. This study provides an overview of how to detect the presence of learning motivation in the context of e-learning. It is expected to be a useful contribution to education and to alert teachers to the students for whom they have to provide motivation.
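The two evaluation schemes compared above, percentage split and cross-validation, can be sketched as follows. A toy nearest-centroid classifier stands in for J48 (a Weka decision-tree implementation), and the one-dimensional "motivation" feature is invented for illustration.

```python
# Sketch of percentage split vs. k-fold cross-validation for estimating
# classifier accuracy. The nearest-centroid classifier and the toy data
# are hypothetical stand-ins for J48 and real e-learning features.

def nearest_centroid_fit(xs, ys):
    """Per-class mean of a 1-D feature."""
    cents = {}
    for label in set(ys):
        pts = [x for x, y in zip(xs, ys) if y == label]
        cents[label] = sum(pts) / len(pts)
    return cents

def predict(cents, x):
    """Assign x to the class with the nearest centroid."""
    return min(cents, key=lambda label: abs(cents[label] - x))

def accuracy(xs, ys, train_idx, test_idx):
    """Train on train_idx, report the fraction correct on test_idx."""
    cents = nearest_centroid_fit([xs[i] for i in train_idx],
                                 [ys[i] for i in train_idx])
    hits = sum(predict(cents, xs[i]) == ys[i] for i in test_idx)
    return hits / len(test_idx)

# Toy feature: low values -> class 0, high values -> class 1.
xs = [0.1, 0.2, 0.3, 0.4, 0.6, 0.7, 0.8, 0.9]
ys = [0, 0, 0, 0, 1, 1, 1, 1]

# Percentage split: train on the first 75%, test on the remaining 25%.
split = int(0.75 * len(xs))
acc_split = accuracy(xs, ys, list(range(split)), list(range(split, len(xs))))

# 4-fold cross-validation: average accuracy over the held-out folds.
folds = [list(range(i, len(xs), 4)) for i in range(4)]
acc_cv = sum(
    accuracy(xs, ys, [i for i in range(len(xs)) if i not in f], f)
    for f in folds
) / 4
```

Cross-validation reuses every sample for testing exactly once, so it usually gives a more stable estimate than a single percentage split, at the cost of training the model k times.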

The aim of this study was to investigate the occurrence of Toxoplasma gondii in sheep slaughtered in the state of Alagoas, Brazil, by means of different diagnostic techniques. Serum samples and tissues from 100 slaughtered sheep were used. To detect antibodies, the indirect immunofluorescence antibody test (IFAT) was used, and tissues from seropositive animals (cut-off ≥1:64) were submitted to the Polymerase Chain Reaction (PCR) and immunohistochemistry (IHC). To assess the concordance between the direct techniques, the kappa test was used. In the IFAT, 14% (14/100) of the ovine samples were seropositive. In the PCR, 21.43% (3/14) of the animals were positive, and in IHC, 7.14% (1/14) stained positively for T. gondii in cerebral tissue. Histopathologically, the predominant finding was the presence of a mononuclear cell infiltrate in the heart and perivascular cuffing in the cerebrum and cerebellum. The concordance between the direct diagnostic techniques was moderate (k = 0.44). It is thus important to use different direct techniques in diagnosing toxoplasmosis in naturally infected sheep.

A robotic system to automate the detection, location, and quantification of gear noise using acoustic intensity measurement techniques has been successfully developed. Major system components fabricated under this grant include an instrumentation robot arm, a robot digital control unit and system software. A commercial, desktop computer, spectrum analyzer and two microphone probe complete the equipment required for the Robotic Acoustic Intensity Measurement System (RAIMS). Large-scale acoustic studies of gear noise in helicopter transmissions cannot be performed accurately and reliably using presently available instrumentation and techniques. Operator safety is a major concern in certain gear noise studies due to the operating environment. The man-hours needed to document a noise field in situ is another shortcoming of present techniques. RAIMS was designed to reduce the labor and hazard in collecting data and to improve the accuracy and repeatability of characterizing the acoustic field by automating the measurement process. Using RAIMS a system operator can remotely control the instrumentation robot to scan surface areas and volumes generating acoustic intensity information using the two microphone technique. Acoustic intensity studies requiring hours of scan time can be performed automatically without operator assistance. During a scan sequence, the acoustic intensity probe is positioned by the robot and acoustic intensity data is collected, processed, and stored.

Dynamic optimization relies on runtime profile information to improve the performance of program execution. Traditional profiling techniques incur significant overhead and are not suitable for dynamic optimization. In this paper, a new profiling technique is proposed that incorporates the strengths of both software and hardware to achieve near-zero overhead profiling. The compiler passes profiling requests as a few bits of information in branch instructions to the hardware, and the processor executes profiling operations asynchronously in available free slots or on dedicated hardware. The compiler instrumentation of this technique is implemented using an Itanium research compiler. The result shows that accurate block profiling incurs very little overhead to the user program in terms of program scheduling cycles. For example, the average overhead is 0.6% for the SPECint95 benchmarks. The hardware support required for the new profiling is practical. The technique is extended to collect edge profiles for continuous phase transition detection. It is believed that this hardware-software collaborative scheme will enable many profile-driven dynamic optimizations for EPIC processors such as the Itanium processors.

For many real data sets, a long-term observation consists of different processes that coexist or occur one after the other. Those processes often exhibit different statistical properties, so before further analysis the observed data should be segmented. This problem arises in many applications, and new segmentation techniques have therefore appeared in the literature in recent years. In this paper we propose a new method of time series segmentation, i.e. the extraction from the analysed vector of observations of homogeneous parts with similar behaviour. The method is based on the absolute deviation about the median of the signal and extends previously proposed techniques that are also based on simple statistics. We introduce a structural break point detection method based on the Adaptive Regression Splines technique, one form of regression analysis. Moreover, we also propose a statistical test for hypotheses about behaviour related to different regimes. We first apply the methodology to simulated signals with different distributions in order to show the effectiveness of the new technique. Next, in the application part, we analyse a real data set representing the vibration signal from a heavy-duty crusher used in a mineral processing plant.
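The statistic underlying the segmentation idea can be sketched as below: compute the absolute deviation about the median in sliding windows and place a break point where the statistic jumps most. The window size and the single-break criterion are illustrative simplifications of the paper's spline-based detection.

```python
# Sketch of regime segmentation via the absolute deviation about the median.
# Window length and break criterion are hypothetical choices; the paper fits
# Adaptive Regression Splines to the statistic instead.

def median(v):
    s = sorted(v)
    n = len(s)
    return s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])

def window_mad(signal, win):
    """Mean absolute deviation about the window median, one value per window."""
    out = []
    for i in range(0, len(signal) - win + 1, win):
        w = signal[i:i + win]
        m = median(w)
        out.append(sum(abs(x - m) for x in w) / win)
    return out

def break_point(signal, win):
    """Sample index of the largest jump between adjacent window statistics."""
    mads = window_mad(signal, win)
    jumps = [abs(mads[i + 1] - mads[i]) for i in range(len(mads) - 1)]
    return (jumps.index(max(jumps)) + 1) * win

# Two regimes: low-amplitude oscillation followed by a high-amplitude one,
# mimicking a change in vibration intensity.
signal = [(-1) ** i * 1.0 for i in range(100)] + \
         [(-1) ** i * 10.0 for i in range(100)]
bp = break_point(signal, win=20)
```

Because the statistic is based on the median, it stays robust when individual windows contain outliers, which is the motivation for using it on noisy vibration data.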

Photometric measurements are prone to systematic errors, presenting a challenge to low-amplitude variability detection. In search of a general-purpose variability detection technique able to recover a broad range of variability types, including currently unknown ones, we test 18 statistical characteristics quantifying scatter and/or correlation between brightness measurements. We compare their performance in identifying variable objects in seven time-series datasets obtained with telescopes ranging in size from a telephoto lens to 1-m class and probing variability on timescales from minutes to decades. The test datasets together include lightcurves of 127539 objects, among them 1251 variable stars of various types, and represent a range of observing conditions often found in ground-based variability surveys. The real data are complemented by simulations. We propose a combination of two indices that together recover a broad range of variability types from photometric data characterized by a wide variety of sampli...
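The two families of indices the study compares can be illustrated with generic stand-ins: a scatter-based index (here the standard deviation) and a correlation-based index (here the lag-1 autocorrelation). These are not the specific pair the paper combines; they only show why correlation indices separate smooth variability from uncorrelated noise of similar scatter.

```python
# Generic scatter index (standard deviation) and correlation index
# (lag-1 autocorrelation) for a lightcurve given as a list of magnitudes.
# Both are illustrative stand-ins, not the paper's selected pair.

import math

def std_index(mags):
    """Sample standard deviation of the measurements."""
    n = len(mags)
    mean = sum(mags) / n
    return math.sqrt(sum((m - mean) ** 2 for m in mags) / (n - 1))

def lag1_autocorr(mags):
    """Correlation between consecutive measurements (near 1 for smooth
    variability, near 0 or negative for uncorrelated noise)."""
    n = len(mags)
    mean = sum(mags) / n
    var = sum((m - mean) ** 2 for m in mags)
    cov = sum((mags[i] - mean) * (mags[i + 1] - mean) for i in range(n - 1))
    return cov / var

# A smooth (correlated) variable lightcurve vs. alternating noise of
# comparable scatter: the scatter index alone cannot tell them apart.
variable = [math.sin(0.2 * i) for i in range(200)]
noise = [(-1) ** i * 0.7 for i in range(200)]

scatter_var, corr_var = std_index(variable), lag1_autocorr(variable)
scatter_noise, corr_noise = std_index(noise), lag1_autocorr(noise)
```

Both series have similar scatter, but only the genuinely variable one shows strong positive correlation between consecutive points, which is why combining the two index types recovers more variability classes than either alone.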

In order to overcome the inconvenience of manual bubble counting, a bubble counter based on a photoelectric technique, aimed at automatically detecting and measuring minute gas leakage in cryogenic valves, is proposed. Experiments were conducted on a self-built apparatus, testing the performance of different gas inlet strategies (bottom gas-inlet and side gas-inlet) and the influence of gas pipe length (0, 1, 2, 4, 6, 8, 10 m) and leakage rate (around 10, 20, 30, 40 bubbles/min) on the first-bubble time and bubble rate. A buffer of 110 cm3 is inserted between the leakage source and the gas pipe to simulate the downstream cavity adjacent to the valve clack. Based on an analysis of the experimental data, empirical parameters have also been summarized to guide leakage detection and measurement in engineering applications. A practical system has already been successfully applied in a cryogenic testing apparatus for cryogenic valves.
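The counting core of such a photoelectric bubble counter can be sketched as follows: each bubble crossing the light path produces a dip in the photodetector signal, and bubbles are counted as downward threshold crossings. The signal shape, voltage levels, and threshold are illustrative assumptions.

```python
# Sketch: count bubbles as falling edges of a photodetector signal through a
# threshold. Trace values and the 0.5 V threshold are hypothetical.

def count_bubbles(signal, threshold):
    """Count falling edges through the threshold (one per bubble)."""
    count = 0
    below = signal[0] < threshold
    for value in signal[1:]:
        now_below = value < threshold
        if now_below and not below:   # falling edge: a bubble enters the beam
            count += 1
        below = now_below
    return count

# Baseline near 1.0 V with three dips where bubbles occlude the beam.
trace = [1.0] * 50
for start in (10, 25, 40):
    for k in range(3):
        trace[start + k] = 0.2

bubbles = count_bubbles(trace, threshold=0.5)
```

Dividing the count by the observation time then yields the bubble rate, from which the leakage rate categories in the experiments above would be derived.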

In this paper an attempt has been made to map and analyse land-use and land-cover change detection in the Lalgudi block of Tiruchirappalli district using remote sensing and GIS techniques. The total extent of the study area is 272.2 sq. km, located in the central part of Tamil Nadu. Land-use and land-cover change detection maps were generated for the years 1990, 2000 and 2010 and classified into agricultural land, built-up land, fallow land, natural vegetation, river sand, water bodies and scrub land, based on the NRSA classification. Each land-use and land-cover class changed positively or negatively over the three decades, especially agricultural land, sandy area, natural vegetation and fallow land, which changed by about 19.62%, 6.56%, 13.16% and 14.91%, respectively.

Carbon Fiber Reinforced Polymer (CFRP) structures can be easily bonded via adhesive assembly procedures, but their cleanliness is of fundamental importance to ensure the strength of the adhesive bond. In practice, surface contamination by several aeronautic fluids can result in weak or kissing bonds. The goal of our research work is to investigate solid-state chemical sensors and artificial olfaction (AO) techniques for the detection of CFRP surface contamination by aeronautic fluids. This will allow the implementation of an instrumental NDT procedure for CFRP surface cleanliness assessment prior to bonding. Herein, results of our first experimental setup, based on the use of an array of polymer sensors for the detection of aeronautic fluid contamination, are presented.

Proposed optical techniques of extrasolar planet detection are discussed and compared. These include terrestrial, orbital, and moon-based systems. Terrestrial systems include ground-level searches for random eclipses of primaries and 'light' echoes of stellar flares from companion planets as well as balloon-mounted telescopes operating in the stratosphere used in conjunction with orbital occulters. Space telescopes considered are multimirror systems simulating huge mirror diameters and single-mirror telescopes, such as the 3-meter Large Space Telescope, used in conjunction with occulters. Although very modest systems are capable of detecting extrasolar planets, the amount of information we can gather regarding these worlds is a function of system complexity and program duration.

The accurate detection of small deviations in given density matrices is important for quantum information processing; this is a difficult task because of the intrinsic fluctuations in density matrices reconstructed from a limited number of experiments. We previously proposed a method for decoherence error detection using a machine-learning technique [S. Hara, T. Ono, R. Okamoto, T. Washio, and S. Takeuchi, Phys. Rev. A 89, 022104 (2014), 10.1103/PhysRevA.89.022104]. However, the previous method is not valid when the errors are just changes in phase. Here, we propose a method that is valid for arbitrary errors in density matrices. The performance of the proposed method is verified using both numerical simulation data and real experimental data.

In power transformers, locating the partial discharge (PD) source is as important as identifying it. Acoustic Emission (AE) sensing offers a good solution for both PD detection and PD source location identification. In this paper the principle of the AE technique is discussed, along with in-situ findings from online acoustic emission signals captured from partial discharges on a number of Generator Transformers (GT). Of the two cases discussed, the first deals with Acoustic Emission Partial Discharge (AEPD) tests on two identical transformers, and the second deals with AEPD measurements of a transformer carried out on different occasions (years). These transformers are from a hydropower station and a thermal power station in India. Tests conducted on identical transformers provide an opportunity to compare AE signal amplitudes from the two transformers. These case studies also help in assessing the efficacy of integrating Dissolved Gas Analysis (DGA) data with AEPD test results in detecting and locating the PD source.

The main signature for anti-neutrino detection in reactor and geo-neutrino experiments based on scintillators is provided by the space-time coincidence of the positron and neutron produced in the Inverse Beta Decay reaction. Such a signature strongly suppresses backgrounds and allows for measurements performed underground with a relatively high signal-to-background ratio. In an aboveground environment, however, the twofold coincidence technique is not sufficient to efficiently reject the high background rate induced by cosmogenic events. Enhancing the positron-neutron twofold coincidence efficiency has the potential to pave the way for future aboveground detectors for reactor monitoring. We propose a new detection scheme based on a threefold coincidence, between the positron ionization, the ortho-positronium (o-Ps) decay, and the neutron capture, in a sandwich detector with alternated layers of plastic scintillator and aerogel powder. We present the results of a set of dedicated measurements on the achievable light y...

Quality control is an important issue in the ceramic tile industry. Maintaining the rate of production over time is also a major issue in ceramic tile manufacturing, and the price of ceramic tiles depends on purity of texture, accuracy of colour, shape, etc. Considering these criteria, an automated defect detection and classification technique is proposed in this report that can ensure better tile quality in the manufacturing process as well as a better production rate. The proposed method plays an important role in the ceramic tile industry in detecting defects and controlling the quality of ceramic tiles. This automated classification method helps to acquire knowledge about the pattern of a defect within a very short period of time and to decide on the recovery process, so that defective tiles are not mixed with fresh tiles.

Aim: Peripheral arterial disease (PAD) and abdominal aortic aneurysm (AAA) are frequent problems in the geriatric population. In DSA, CTA or MRA techniques, contrast agents, which can be nephrotoxic for elderly patients, have to be used for diagnosis. Magnetic resonance imaging (MRI) is the most powerful non-ionising radiological diagnostic tool and has the highest soft-tissue contrast resolution. The aim of our study was to investigate the effectiveness of MRI in detecting AAA and PAD in comparison with DSA. Material and Method: After obtaining ethics committee approval and informed consent, we performed the balanced turbo field echo (B-TFE) MRI technique without contrast agent on a 1.5 Tesla MRI device before the DSA examination. The luminal diameters of the renal arteries, infrarenal abdominal aorta, and iliac and femoral arteries were measured using the Philips DICOM Viewer R2.2 application. The intraclass correlation coefficient and reliability were used to check whether the techniques could be used interchangeably, and the t-test was used to measure the differences between them. Results: There was a high correlation between B-TFE and DSA in detecting pathologies of larger arteries such as the aorta. For small arterial pathologies, the correlation between B-TFE and DSA was relatively lower. Discussion: For the diagnosis of AAA and PAD, DSA is the gold-standard technique, but it is invasive and exposes patients to radiation. In the follow-up of geriatric patients with larger arterial pathologies, B-TFE can be used instead of contrast-enhanced MRA and invasive DSA.

The large amount of pesticide residues in the environment is a threat to global health through inhibition of acetylcholinesterase (AChE). Biosensors based on AChE inhibition have thus been developed for the detection of pesticides. In line with the rapid development of nanotechnology, nanomaterials have attracted great attention and have been intensively studied in biological analysis owing to their unique chemical, physical and size properties. The aim of this review is to provide insight into nanomaterial-based optical techniques for the determination of AChE and pesticides, including colorimetric and fluorescent assays and surface plasmon resonance.

Many factors influence the measurement uncertainty when using computed tomography for dimensional metrology applications. One of the most critical steps is the surface extraction phase: an incorrect determination of the surface may significantly increase the measurement uncertainty. This paper presents an edge detection method for surface extraction based on a 3D Canny algorithm with sub-voxel resolution. The advantages of this method are shown in comparison with the most commonly used technique nowadays, i.e. local threshold definition. Both methods are applied to reference standards...
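A 1D analogue of sub-voxel edge localization can illustrate why gradient-based extraction beats a plain threshold: the gradient peak can be refined between samples. The profile, names and parabolic-refinement choice below are illustrative, not the paper's actual 3D Canny implementation.

```python
import math

def subvoxel_edge(profile):
    """Locate an edge with sub-sample precision in a 1D gray-value profile:
    take the central-difference gradient, find its peak, and refine the
    peak position by fitting a parabola through the three samples around
    it. A 3D Canny method works on gradient-magnitude volumes in the same
    spirit, whereas a fixed threshold can only report an integer voxel."""
    g = [profile[i + 1] - profile[i - 1] for i in range(1, len(profile) - 1)]
    k = max(range(1, len(g) - 1), key=lambda i: abs(g[i]))
    # parabolic interpolation of the gradient peak
    d = 0.5 * (g[k - 1] - g[k + 1]) / (g[k - 1] - 2 * g[k] + g[k + 1])
    return (k + 1) + d   # +1 re-aligns the gradient index with the profile

# smooth step edge with true position 5.3 (logistic profile)
profile = [1.0 / (1.0 + math.exp(-(i - 5.3))) for i in range(12)]
print(subvoxel_edge(profile))  # ~5.28, vs. 5 or 6 for an integer threshold
```

The fractional offset is exactly what lowers the surface-extraction contribution to the measurement uncertainty: the edge is no longer quantized to the voxel grid.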

The relative proportions of hardener and resin (the stoichiometric ratio, SR) in a curing epoxy thermoset strongly influence the engineering properties of the material. We investigate how NMR, dielectric and ultrasonic techniques can be used to track cure and estimate SR in the material. We show that all three methods are sensitive to SR and can give clear and quantitative indications of excess resin. Detection of excess hardener in the fully cured material is more difficult but can be achieved by combined measures of elastic modulus, from the ultrasonic velocity, and loss angle, from ultrasonic attenuation.

A polymerase chain reaction (PCR) technique was used for detection of the Leptospira interrogans rrs gene in kidney tissue from 115 rats, 50 zebu cattle and 13 pigs in an attempt to identify a possible animal reservoir of leptospirosis in Madagascar. In addition, serological testing of 105 individuals in close contact with animals was carried out. The PCR analysis was negative for all the samples tested and only one person was found seropositive, at a low titer. The findings suggest that leptospirosis, if prevalent in Madagascar, is likely rare.

Four hundred twenty pneumonic lungs from lambs were examined for Mycoplasma ovipneumoniae and Pasteurella haemolytica by an immunoperoxidase technique using an extravidin-biotin-peroxidase complex method in formalin-fixed, paraffin-embedded sections. Histologic examination of tissue sections revealed strong positive reactions in 60.9% and 68.3% of the lungs against M. ovipneumoniae and P. haemolytica, respectively. M. ovipneumoniae and P. haemolytica antigens were observed at the surface and/or within the epithelial cells, macrophages, leucocytes, and bronchiolar exudate. The location of M. ovipneumoniae in the cytoplasm of the epithelial cells and P. haemolytica in the neutrophils was detected immunohistochemically.

Progress in laser material processing may require real-time monitoring and process control for consistent quality and productivity. We report a method of in-situ monitoring of laser metal cutting and drilling using cladding power monitoring of an optical fibre beam delivery system—a technique which detects the light reflected or scattered from the workpiece. The light signal carries information about the quality of the process. Experiments involving drilling and cutting of two samples, a thin aluminum foil and a 2-mm thick stainless steel plate, confirmed the effectiveness of this method.

The confused flour beetle, Tribolium confusum Jacquelin du Val (Coleoptera: Tenebrionidae) is a stored-product pest that contaminates a wide range of food products, from flour and cereals to spices. The insect reduces food quality and is responsible for large economic losses every year. Although several methods for detection of stored-product pests are common and widely used, they are time-consuming and expensive. Therefore, establishing molecular methods of detection of stored-product pests could provide a useful alternative method. We have undertaken attempts to establish methods of detection of T. confusum based on molecular biology techniques of standard and real-time polymerase chain reaction (PCR). Total DNA of T. confusum and red flour beetle, Tribolium castaneum (Herbst) (Coleoptera: Tenebrionidae), used as a negative control, was isolated from insects and used as a template in standard and real-time PCR reactions. Specific primers have been designed on the basis of sequences of internal transcribed spacer (ITS) fragment of rDNA and subunit I of mitochondrial cytochrome oxidase of T. confusum available in the GenBank database. Standard PCR reactions with primers specific to the ITS fragment proved to be reliable and sensitive. Real-time PCR reactions with primers specific for mitochondrial DNA are considered to serve as a supplemental detection method for quantitative assessment of the infestation level.

In order to achieve high-resolution deep-tissue imaging, multi-photon fluorescence microscopy and photoacoustic tomography have been proposed over the past two decades. However, combining the advantages of these two imaging systems to achieve optical spatial resolution with ultrasonic penetration depth remains challenging. In this paper, we investigate the detection of two-photon photoacoustic ultrasound and first demonstrate background-free two-photon photoacoustic imaging in a phantom sample. To generate the background-free two-photon photoacoustic signals, we used a high-repetition-rate femtosecond laser to induce narrowband excitation. Combined with a loss-modulation technique, we successfully created a beating in the light intensity, which not only provides pure sinusoidal modulation but also ensures spectrum sensitivity and frequency selectivity. Using lock-in detection, a power-dependency experiment validates our methodology of frequency-selecting the source of the nonlinearity. This ensures our capability of measuring the background-free two-photon photoacoustic waves by detecting the 2nd-order beating signal directly. Furthermore, by mixing nanoparticles and fluorescent dyes as contrast agents, the two-photon photoacoustic signal was found to be enhanced and detectable. Finally, we demonstrate subsurface two-photon photoacoustic bio-imaging based on an optical scanning mechanism inside phantom samples.

The purpose of this study was to evaluate the detectability of various radiographic techniques for mandibular condylar lesions. Erosive lesion, osteophyte and flattening were formed on the artificial mandibular condyle, and panoramic, transcranial, transorbital radiography, lateral and frontal tomography were taken. The results were as follows; 1. The detectability for erosive lesions was superior in the order of frontal tomography (96%), lateral tomography (78%), transorbital (59%), transcranial (56%) and panoramic (48%) radiography. 2. The location of erosive lesion that showed the highest detectability was the medial third in panoramic, the lateral third in transcranial, the central portion of anteroposterior direction in transorbital, the central portion of mediolateral direction and the posterior third in lateral tomography. Frontal tomography disclosed all erosive lesions except one anterolateral lesion. 3. The detectability of osteophyte was 100% in lateral tomography, 78% in transcranial and 56% in panoramic radiography. 4. For flattening, lateral tomography showed the flattened condyle, but both panoramic and transcranial views showed only decreased bone density without the change of condylar shape.

A vision sensor was introduced and tested for early detection of citrus Huanglongbing (HLB). This disease is caused by the bacterium Candidatus Liberibacter asiaticus (CLas) and is transmitted by the Asian citrus psyllid. HLB is a devastating disease that has exerted a significant impact on citrus yield and quality in Florida. Unfortunately, no cure has been reported for HLB. Starch accumulates in HLB-infected leaf chloroplasts, which causes the mottled blotchy green pattern. Starch rotates the polarization plane of light. A polarized imaging technique was used to detect the polarization rotation caused by the hyper-accumulation of starch as a pre-symptomatic indication of HLB in young seedlings. Citrus seedlings were grown in a room with controlled conditions and exposed to intensive feeding by CLas-positive psyllids for eight weeks. A quantitative polymerase chain reaction was employed to confirm the HLB status of samples. Two datasets were acquired, the first one month after the exposure to psyllids and the second two months later. The results showed that, with relatively unsophisticated imaging equipment, four levels of HLB infection could be detected with accuracies of 72%–81%. As expected, increasing the time interval between psyllid exposure and imaging increased the development of symptoms and, accordingly, improved the detection accuracy.

Automatic oil spill detection and tracking from Synthetic Aperture Radar (SAR) images is a difficult task, due in large part to the inhomogeneous properties of the sea surface, the high level of speckle inherent in SAR data, the complexity and the highly non-Gaussian nature of amplitude information, and the low temporal sampling that is often achieved with SAR systems. This research presents a promising new oil spill detection and tracking method that is based on time series of SAR images. Through the combination of a number of advanced image processing techniques, the developed approach is able to mitigate some of these previously mentioned limitations of SAR-based oil-spill detection and enables fully automatic spill detection and tracking across a wide range of spatial scales. The method combines an initial automatic texture analysis with a consecutive change detection approach based on multi-scale image decomposition. The first step of the approach, a texture transformation of the original SAR images, is performed in order to normalize the ocean background and enhance the contrast between oil-covered and oil-free ocean surfaces. The Lipschitz regularity (LR), a local texture parameter, is used here due to its proven ability to normalize the reflectivity properties of ocean water and maximize the visibility of oil in water. To calculate LR, the images are decomposed using a two-dimensional continuous wavelet transform (2D-CWT) and transformed into Holder space to measure LR. After texture transformation, the now normalized images are inserted into our multi-temporal change detection algorithm. The multi-temporal change detection approach is a two-step procedure including (1) data enhancement and filtering and (2) multi-scale automatic change detection. The performance of the developed approach is demonstrated by an application to oil spill areas in the Gulf of Mexico. In this example, areas affected by oil spills were identified from a series of ALOS PALSAR images.
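The normalize-then-compare logic of the two-step pipeline can be sketched in miniature. The toy below robustly standardizes each image and thresholds the temporal difference; it stands in for, but does not implement, the Lipschitz-regularity texture transform and multi-scale wavelet decomposition, and the images and numbers are invented.

```python
import statistics

def normalize(img):
    """Robustly standardize an image (median/MAD) so both acquisitions share
    a common background level; a crude stand-in for the texture transform."""
    flat = [v for row in img for v in row]
    med = statistics.median(flat)
    mad = statistics.median([abs(v - med) for v in flat])
    s = mad / 0.6745 if mad else 1.0
    return [[(v - med) / s for v in row] for row in img]

def change_mask(img_a, img_b, k=2.0):
    """Flag pixels whose normalized temporal difference is an outlier."""
    a, b = normalize(img_a), normalize(img_b)
    diff = [[b[i][j] - a[i][j] for j in range(len(a[0]))] for i in range(len(a))]
    mags = [abs(v) for row in diff for v in row]
    thresh = k * statistics.median(mags) / 0.6745  # robust scale of the diffs
    return [[abs(v) > thresh for v in row] for row in diff]

# toy SAR pair: textured sea in image A; a dark 3x3 'slick' appears in image B
sea_a = [[10 + ((2 * i + 3 * j) % 5) * 0.1 for j in range(8)] for i in range(8)]
sea_b = [[v for v in row] for row in sea_a]
for i in range(3, 6):
    for j in range(3, 6):
        sea_b[i][j] -= 5.0       # oil damps radar backscatter -> darker pixels
mask = change_mask(sea_a, sea_b)
print(sum(v for row in mask for v in row))  # -> 9 (the slick pixels)
```

In this noise-free toy the background cancels exactly, so the robust threshold is degenerate; on real speckled imagery the median of the difference magnitudes is nonzero and sets a meaningful detection threshold, which is why the paper's speckle filtering and multi-scale steps matter.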

In the present study ⌀ 5''× 3'' and ⌀ 2''× 2'' EJ-313 liquid fluorocarbon as well as ⌀ 2'' × 3'' BaF2 scintillators were exposed to neutrons from a 252Cf neutron source and a Sodern Genie 16GT deuterium-tritium (D+T) neutron generator. The scintillators' responses to β- particles with a maximum endpoint energy of 10.4 MeV from the n+19F reactions were studied. The response of a ⌀ 5'' × 3'' BC-408 plastic scintillator was also studied as a reference. The β- particles are the products of the interaction of fast neutrons with 19F, which is a component of the EJ-313 and BaF2 scintillators. The method of fast neutron detection via fluorine activation is known as Threshold Activation Detection (TAD) and was proposed for photofission prompt neutron detection from fissionable and Special Nuclear Materials (SNM) in the field of Homeland Security and Border Monitoring. Measurements of the number of counts between 6.0 and 10.5 MeV with a 252Cf source showed that the relative neutron detection efficiency ratio, defined as ε(BaF2)/ε(EJ-313, 5''), is 32.0% ± 2.3% and 44.6% ± 3.4% for front-on and side-on orientation of the BaF2, respectively. Moreover, the ⌀ 5'' EJ-313 and the side-on oriented BaF2 were also exposed to neutrons from the D+T neutron generator, and the relative efficiency ε(BaF2)/ε(EJ-313, 5'') was estimated to be 39.3%. Measurements of prompt photofission neutrons with the BaF2 detector, using data acquisition after irradiation (out-of-beam) of nuclear material and between the beam pulses (beam-off), were also conducted on the 9 MeV LINAC of the SAPHIR facility.

This paper proposes a hybrid subcarrier multiplexing/optical spectrum code division multiplexing (SCM/OSCDM) system for the purpose of combining the advantages of both techniques. Optical spectrum code division multiple access (OSCDMA) is one of the multiplexing techniques that is becoming popular because of its flexibility in the allocation of channels, ability to operate asynchronously, enhanced privacy and increased capacity in networks of a bursty nature. On the other hand, the subcarrier multiplexing (SCM) technique is able to enhance the channel data rate of OSCDMA systems. In this paper, a newly developed detection technique for OSCDM, called spectral direct decoding (SDD), is compared mathematically with the AND subtraction detection technique. The system utilizes a new unified code construction named the KS (Khazani-Syed) code. The bit-error-rate (BER) results show that SDD offers significantly improved performance at a BER of 10^-9.

Continued interest in the development of miniaturized and portable analytical platforms necessitates the exploration of sensitive methods for the detection of trace analytes. Nanomaterials, on account of their unique physical and chemical properties, are not only able to overcome many limitations of traditional detection reagents but also enable the exploration of many new signal transduction technologies. This dissertation presents a series of investigations of alternative detection techniques for biomolecules, involving the use of semiconductor nanocrystals and magnetic beads as labels. Initial research focused on the development of quantum dot-encapsulating liposomes as a novel fluorescent label for immunoassays. This hybrid nanomaterial was anticipated to overcome the drawbacks presented by traditional fluorophores as well as to provide significant signal amplification. Quantum dot-encapsulating liposomes were synthesized by the method of thin-film hydration and characterized. The utility of these composite nanostructures for bioanalysis was demonstrated; however, the long-term instability of the liposomes hampered quantitative development. A second approach for assay development exploited the ability of gold nanoparticles to quench the optical signals obtained from quantum dots. The goal of this study was to demonstrate the feasibility of using aptamer-linked nanostructures in FRET-based quenching for the detection of proteins. Thrombin was used as the model analyte in this study, and the experimental parameters of the assay were optimized. The assay simply required mixing the sample with the reagents and could be completed in less than an hour. The limit of detection for thrombin by this method was 5 nM. This homogeneous assay can easily be adapted for the detection of a wide variety of biochemicals. The novel technique of ferromagnetic resonance generated in magnetic bead labels was explored for signal transduction. This inductive detection technique lends

Aim: The aim of the present study was to compare different methods, viz., Sheather's sugar flotation (SSF), Ziehl-Neelsen (ZN), Kinyoun's acid-fast (KAF) and safranin-methylene blue (SMB) staining, and negative staining techniques such as nigrosin, light green and malachite green staining, for the detection of Cryptosporidium spp. oocysts in bovines. Materials and Methods: A total of 455 fecal samples from bovines were collected from private and government farms and from clinical cases presented to the Department of Medicine, Veterinary College, Bengaluru. They were subjected to SSF, ZN, KAF, SMB and the negative staining methods. Results: Of the 455 fecal samples screened, 5.71% were found positive for Cryptosporidium spp. oocysts. The species were identified as Cryptosporidium parvum in calves and Cryptosporidium andersoni in adults, based on morphological characterization and micrometry of the oocysts. Conclusions: Of all the techniques, fecal flotation with Sheather's solution was found to be the most specific and sensitive method for the detection of Cryptosporidium spp. oocysts. Among the conventional staining methods, SMB gave better differentiation between oocysts and yeast. Among the three negative staining methods, malachite green was found to be more sensitive than the other methods.

A new procedure for the detection of condensed phases in the atmosphere from infrared remote sensing observations, using singular value decomposition (SVD) of a set of altitude-dependent infrared extinction spectra, is presented. Orthogonal rotation of derived pairs of singular vectors is used to transform several leading vectors in a way that makes them as similar as possible to a pair of reference vectors (measured spectra and vertical profiles). The technique is tested against spectra of mesospheric ice recorded with high spectral resolution in the occultation mode by the infrared Fourier Transform Spectrometer on the Atmospheric Chemistry Experiment satellite. This SVD-based technique has been shown to be successful in the detection of ice clouds in the upper mesosphere. It may also possess considerable potential for solving the problem of decomposition of remote sensing observations with strongly overlapping compound spectral components, which are typical of infrared observations of condensed phases (i.e. in polar stratospheric clouds). (Figure caption: first three right singular vectors (left panel) and first three left singular vectors (right panel) for a typical decomposed PMC-containing spectrum, plotted against tangent altitude and frequency in the O-H stretch region.)

We consider the problem of identifying nuclides from weak and poorly resolved spectra. A two-stage algorithm based on the principle of majority voting is proposed and tested. The idea is to model gamma-ray counts as Poisson processes: the average part is taken to be the model, and the difference between the observed gamma-ray counts and the average is treated as random noise. In the linear part, the unknown coefficients indicate whether the isotopes of interest are present or absent. Lasso-type algorithms are applied to find the non-vanishing coefficients. Since Lasso, or any prediction-error-based algorithm, is not a consistent variable selector for finite data length, an estimate of the parameter distribution based on sub-sampling techniques is added on top of Lasso. Simulation examples are provided in which traditional peak detection algorithms fail while the proposed two-stage algorithm performs well in terms of both False Negative and False Positive errors. Highlights: identification of nuclides from weak and poorly resolved spectra; a two-stage algorithm proposed and tested based on the principle of majority voting; Lasso-type algorithms applied to find non-vanishing coefficients; an estimate of the parameter distribution based on sub-sampling techniques; simulations comparing the proposed method with peak detection.
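A minimal sketch of the two-stage idea follows, with a hand-rolled ISTA Lasso solver and majority voting over random subsamples. The isotope templates, observed counts and all parameter values are invented for illustration; the paper's actual estimator and voting rule are not reproduced.

```python
import random

def _soft(z, t):
    """Soft-thresholding operator, the proximal step of the L1 penalty."""
    return (abs(z) - t) * (1 if z > 0 else -1) if abs(z) > t else 0.0

def lasso_ista(A, y, lam, steps=400):
    """Minimize 0.5*||A w - y||^2 + lam*||w||_1 by iterative
    soft-thresholding (ISTA), a simple Lasso solver adequate for a toy."""
    n, p = len(A), len(A[0])
    lr = 1.0 / sum(a * a for row in A for a in row)  # crude step-size bound
    w = [0.0] * p
    for _ in range(steps):
        r = [sum(A[i][j] * w[j] for j in range(p)) - y[i] for i in range(n)]
        g = [sum(A[i][j] * r[i] for i in range(n)) for j in range(p)]
        w = [_soft(w[j] - lr * g[j], lr * lam) for j in range(p)]
    return w

def vote_support(A, y, lam=1.0, rounds=20, keep=6, thresh=0.1, seed=42):
    """Stage two: an isotope is declared present only if its Lasso
    coefficient survives in most of the random-subsample fits."""
    rng = random.Random(seed)
    votes = [0] * len(A[0])
    for _ in range(rounds):
        rows = rng.sample(range(len(y)), keep)
        w = lasso_ista([A[i] for i in rows], [y[i] for i in rows], lam)
        for j, wj in enumerate(w):
            votes[j] += abs(wj) > thresh
    return {j for j, v in enumerate(votes) if v >= rounds * 0.6}

# toy library: mean spectra (templates) of two candidate isotopes
t1 = [5, 3, 0, 0, 1, 0, 0, 2]
t2 = [0, 0, 4, 2, 0, 1, 0, 0]
A = [[float(a), float(b)] for a, b in zip(t1, t2)]
# observed counts ~ 3*t1 plus Poisson-like fluctuations (hand-picked here)
y = [15.2, 9.1, 0.3, 0.1, 3.0, 0.2, 0.1, 6.1]
print(vote_support(A, y))  # -> {0}: isotope 0 present, isotope 1 rejected
```

The voting step is what restores selection reliability: a single Lasso fit on noisy counts can leave a small spurious coefficient, but a coefficient must survive most subsamples to be declared present.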

In this work, we study how synchrotron radiation techniques can be applied to detect the microstructure of metallic glass (MG). Unit cells are the basic structural units in crystals, whereas it has been suggested that the co-existence of various clusters may be the universal structural feature of MG. It is therefore a challenge to detect the microstructure of MG, even at the short-range scale, by directly using synchrotron radiation techniques such as X-ray diffraction and X-ray absorption methods. Here, a feasible scheme is developed in which state-of-the-art synchrotron radiation-based experiments are combined with simulations to investigate the microstructure of MG. By studying a typical MG composition (Zr70Pd30), it is found that various clusters do co-exist in its microstructure, and that icosahedral-like clusters are the dominant structural units. This is the structural origin of the precipitation of an icosahedral quasicrystalline phase prior to the glass-to-crystal transformation when heating Zr70Pd30 MG.

The performance of small and dim IR target detection is mostly affected by the signal-to-noise ratio (SNR) and signal-to-clutter ratio (SCR). For MWIR, and especially LWIR, array detectors, the background radiation and the optical system radiation mean that the SCR cannot be increased without limit by using a longer integration time, so the frame rate of the detector is mainly limited by the data readout time, especially in a large-scale infrared detector. In this paper a new MWIR array detector with a windowing technique was used in the experiment; a faster frame rate around the target can be obtained in windowing mode, so redundant information can be ignored. Background subtraction was used to remove the fixed-pattern noise and to adjust the dynamic range of the target, and a local NUC (non-uniformity correction) technique was then proposed to improve the SCR of the target; the advantages of local NUC over global NUC are analysed in detail. Finally, multi-local-window frame accumulation was adopted to enhance the target further and improve its SNR. The experiment showed that the SCR of the target improved from 1.3 to 36 with 30-frame accumulation, which makes target detection and tracking much easier with the new method.
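The background-subtraction and frame-accumulation steps can be sketched as follows. The frame model and all parameters are invented for illustration; no NUC map or windowing logic is modeled, only the idea that subtracting a fixed-pattern estimate and averaging frames suppresses, respectively, the fixed and the temporal noise around a dim target.

```python
import random
import statistics

random.seed(7)
ROWS = COLS = 16
TARGET, AMP = (8, 8), 0.5                      # dim point target, amplitude 0.5
# fixed-pattern noise: per-pixel offsets that do not change frame to frame
fixed = [[random.gauss(0, 1.0) for _ in range(COLS)] for _ in range(ROWS)]

def frame(with_target):
    """One readout: fixed-pattern offset + temporal noise (+ optional target)."""
    f = [[fixed[i][j] + random.gauss(0, 0.3) for j in range(COLS)]
         for i in range(ROWS)]
    if with_target:
        f[TARGET[0]][TARGET[1]] += AMP
    return f

# background map averaged from target-free frames (stand-in for the NUC step)
bg_frames = [frame(False) for _ in range(50)]
bg = [[statistics.mean(f[i][j] for f in bg_frames) for j in range(COLS)]
      for i in range(ROWS)]

def scr(img):
    """Target amplitude over the background scatter (signal-to-clutter ratio)."""
    clutter = [img[i][j] for i in range(ROWS) for j in range(COLS)
               if (i, j) != TARGET]
    return abs(img[TARGET[0]][TARGET[1]]) / statistics.pstdev(clutter)

raw = frame(True)
corrected = [[raw[i][j] - bg[i][j] for j in range(COLS)] for i in range(ROWS)]
# accumulating 30 background-subtracted frames averages the temporal noise down
acc = [[0.0] * COLS for _ in range(ROWS)]
for _ in range(30):
    f = frame(True)
    for i in range(ROWS):
        for j in range(COLS):
            acc[i][j] += (f[i][j] - bg[i][j]) / 30
print(scr(raw), scr(corrected), scr(acc))  # accumulation gives the largest SCR
```

Subtraction removes the fixed pattern (which averaging alone cannot touch), while N-frame accumulation shrinks the temporal noise by roughly the square root of N, mirroring the paper's reported SCR gain with 30-frame accumulation.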

Background & Aim: Enteroinvasive Escherichia coli (EIEC) is one of the important agents that invade intestinal epithelial cells, causing damage and cell death that lead to dysentery. The aim of this study was the detection of enteroinvasive Escherichia coli by the PCR technique in children's diarrhea in Yasuj. Methods: This cross-sectional study was performed on 200 stool samples taken from children with diarrhea in Yasuj. After initial identification of E. coli strains by culture and biochemical tests, EIEC genes such as ipaH were detected by the PCR technique, and the antibiotic susceptibility of isolates was evaluated using the disc diffusion (CLSI) method. Results: Out of all examined samples, 16 (8%) EIEC were isolated. Antibiotic susceptibility testing showed that the most effective antibiotic against EIEC was ciprofloxacin, and the most resisted antibiotic was ceftizoxime. Conclusion: The results showed that EIEC strains have a moderate prevalence in our study area compared with other studies. Therefore, given the importance of this strain in producing dysentery, hospital-wide surveillance using molecular techniques is proposed for other regions of the country.

The detection of microbial concentration, essential for safe and high-quality food products, is traditionally performed with the plate count technique, which is reliable but slow and not easily automated, as required for direct use in industrial machines. To this end, the method based on impedance measurements represents an attractive alternative, since it can produce results in about 10 h, instead of the 24-48 h needed by standard plate counts, and can easily be realized in automatic form. In this paper such a method has been experimentally studied in the case of ice-cream products. In particular, all the main ice-cream compositions of real interest have been considered, and no nutrient medium has been used to dilute the samples. A measurement set-up was realized using benchtop instruments for impedance measurements on samples whose bacterial concentration was independently measured by means of standard plate counts. The obtained results clearly indicate that impedance measurement represents a feasible and reliable technique for detecting total microbial concentration in ice cream, suitable for implementation as an embedded system in industrial machines.
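Impedance methods typically convert a measured detection time (when the impedance curve starts to change) into a concentration through a log-linear calibration against plate counts. A sketch with invented calibration numbers follows; the abstract reports no actual data, so every value below is hypothetical.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# hypothetical calibration: detection time Td (h) vs log10(CFU/mL) pairs
cal_td   = [4.0, 5.0, 6.0, 7.0, 8.0]
cal_logn = [6.0, 5.5, 5.0, 4.5, 4.0]   # higher load -> earlier detection
a, b = fit_line(cal_td, cal_logn)

def concentration(td):
    """Estimate CFU/mL from a detection time via the calibration line."""
    return 10 ** (a + b * td)

print(concentration(6.5))  # 10**4.75, about 5.6e4 CFU/mL
```

The inverse relationship (shorter detection time for higher initial load) is what makes the ~10 h result possible: heavily contaminated samples cross the impedance-change threshold earliest.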

Full Text Available While Moore's law scaling continues to double transistor density every technology generation, new design challenges are introduced. One of these challenges is variation, resulting in deviations in the behavior of transistors, most importantly in switching delays. These exaggerated delays widen the gap between the average and the worst-case behavior of a circuit. Conventionally, circuits are designed to accommodate the worst-case delay and are therefore becoming very limited in their performance advantages. Thus, allowing for an average-case oriented design is a promising solution, maintaining the pace of performance improvement over future generations. However, to maintain correctness, such an approach will require on-the-fly mechanisms to prevent, detect, and resolve violations. This paper explores such mechanisms, allowing the improvement of circuit performance under intensifying variations. We present speculative error detection techniques along with recovery mechanisms. We continue by discussing their ability to operate under extreme variations, including sub-threshold operation. While the main focus of this survey is on circuit approaches, for completeness, we discuss higher-level, architectural and algorithmic techniques as well.

Full Text Available Coagulase-negative staphylococci (CNS), components of the normal flora of neonates, have emerged as important opportunistic pathogens of nosocomial infections that occur in neonatal intensive care units. Some authors have reported the ability of some CNS strains, particularly Staphylococcus epidermidis, to produce a toxin similar to S. aureus delta toxin. This toxin is an exoprotein that has a detergent action on the membranes of various cell types resulting in rapid cell lysis. The objectives of the present study were to standardize the Polymerase Chain Reaction (PCR) technique for the detection of the gene responsible for the production of delta toxin (hld gene) in staphylococcal species isolated from catheters and blood cultures obtained from neonates, and to compare the results to those obtained with the phenotypic synergistic hemolysis method. Detection of delta toxin by the phenotypic and genotypic method yielded similar results for the S. aureus isolates. However, in S. epidermidis, a higher positivity was observed for PCR (97.4%) compared to the synergistic hemolysis method (86.8%). Among CNS, S. epidermidis was the most frequent isolate and was a delta toxin producer. Staphylococcus simulans and S. warneri tested positive by the phenotypic method, but their positivity was not confirmed by PCR for the hld gene detection. These results indicate that different genes might be responsible for the production of this toxin in different CNS species, requiring highly specific primers for their detection. PCR was found to be a rapid and reliable method for the detection of the hld gene in S. aureus and S. epidermidis.

Early recognition of ventricular fibrillation (VF) and electrical therapy are key for the survival of out-of-hospital cardiac arrest (OHCA) patients treated with automated external defibrillators (AED). AED algorithms for VF-detection are customarily assessed using Holter recordings from public electrocardiogram (ECG) databases, which may be different from the ECG seen during OHCA events. This study evaluates VF-detection using data from both OHCA patients and public Holter recordings. ECG-segments of 4-s and 8-s duration were analyzed. For each segment 30 features were computed and fed to state-of-the-art machine learning (ML) algorithms. ML-algorithms with built-in feature selection capabilities were used to determine the optimal feature subsets for both databases. Patient-wise bootstrap techniques were used to evaluate algorithm performance in terms of sensitivity (Se), specificity (Sp) and balanced error rate (BER). Performance was significantly better for public data, with a mean Se of 96.6%, Sp of 98.8% and BER of 2.2%, compared to a mean Se of 94.7%, Sp of 96.5% and BER of 4.4% for OHCA data. OHCA data required twice as many features as the data from public databases for an accurate detection (6 vs 3). No significant differences in performance were found for different segment lengths; the BER differences were below 0.5 points in all cases. Our results show that VF-detection is more challenging for OHCA data than for data from public databases, and that accurate VF-detection is possible with segments as short as 4 s. PMID:27441719
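The balanced error rate used above averages the class-wise error rates, so it is not skewed by the rarity of VF segments. A minimal sketch (function name is mine, not from the study):

```python
def balanced_error_rate(sensitivity, specificity):
    """BER = 1 - (Se + Sp) / 2: the mean of the miss rate and the false alarm rate."""
    return 1.0 - (sensitivity + specificity) / 2.0

# Reported mean Se/Sp values from the two data sets:
print(round(balanced_error_rate(0.966, 0.988), 3))  # public Holter data, ~0.023
print(round(balanced_error_rate(0.947, 0.965), 3))  # OHCA data, 0.044
```

The OHCA figure reproduces the reported 4.4% exactly; the public-data figure rounds to 2.3% rather than the reported 2.2%, presumably because the paper computed BER from unrounded Se/Sp.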

Among the available nuclear techniques, the neutron backscattering technique, based on the detection of the produced thermal neutrons, is thought to be the most promising for landmine detection. The results obtained from Monte Carlo simulation were used for the selection of the BF{sub 3} detector and the Am-Be neutron source shielding. In addition, soil moisture was discussed as a limitation of the neutron backscattering technique. It was experimentally found that this technique is useful for soil whose water content is lower than 14%.

Forest fires and their effects are among the most important natural disasters all over the world. Statistical data show that forest fires in Turkey had a variable structure in the last century, but with population growth the number of forest fires and the burned area have increased widely in recent years. As a consequence, erosion, landslides, desertification and mass loss occur. In addition, after forest fires, the renewal of forests and vegetation is very important for land management. Classic methods used for the detection of burned area and severity require a long and challenging process due to time and cost factors. Thanks to advanced techniques in the field of remote sensing, burned area and severity can be determined with high detail and precision. The purpose of this study is to blend MODIS (Moderate Resolution Imaging Spectroradiometer) satellite images and spatial autocorrelation techniques together, and thereby detect burned area and severity precisely. In this context, spatial autocorrelation statistics such as Moran's I and Getis-Ord local Gi indexes were used to measure and analyze burned area characteristics. Prefire and postfire satellite images were used to determine fire severity, based on spectral indexes corresponding to biomass loss and carbon emissivity intensities. Satellite images have long been used for the identification of fire damage and risk in fire management. This study used prefire and postfire satellite images and spatial autocorrelation techniques to detect and analyze forest fires in the Antalya region of Turkey, where serious fires have occurred. In this context, this approach enables the characterization of the distinctive texture of burned areas and helps forecasting more precisely. Finally, it is observed that the mapping of burned area and severity could be performed from local scale to national scale. Key Words: Spatial autocorrelation, MODIS, Fire, Burn Severity
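As an illustration of the spatial autocorrelation statistics mentioned above, here is a minimal sketch of global Moran's I on a toy grid; the data and weight matrix are invented for illustration, not taken from the study:

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I: positive when similar values cluster in space,
    negative when dissimilar values neighbor each other, near 0 when random."""
    x = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)   # spatial weight matrix, zero diagonal
    n = x.size
    z = x - x.mean()
    numerator = (w * np.outer(z, z)).sum()
    denominator = (z ** 2).sum()
    return (n / w.sum()) * (numerator / denominator)

# Four cells in a line, neighbors share an edge; burned values cluster at one end.
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
print(morans_i([1, 1, 0, 0], w))  # ~0.333: positive spatial autocorrelation
```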

Individuals predominantly exchange information with one another through informal, interpersonal channels. During disasters and other disrupted settings, information spread through informal channels regularly outpaces official information provided by public officials and the press. Social scientists have long examined this kind of informal communication in the rumoring literature, but studying rumoring in disrupted settings has posed numerous methodological challenges. Measuring features of informal communication (timing, content, location) with any degree of precision has historically been extremely challenging in small studies and infeasible at large scales. We address this challenge by using online, informal communication from a popular microblogging website for which we have precise spatial and temporal metadata. While the online environment provides a new means for observing rumoring, the abundance of data poses challenges for parsing hazard-related rumoring from countless other topics in numerous streams of communication. Rumoring about disaster events is typically temporally and spatially constrained to places where that event is salient. Accordingly, we use spatial and temporal subsampling to increase the resolution of our detection techniques. By filtering out data from known sources of error (per rumor theories), we greatly enhance the signal of disaster-related rumoring activity. We use these spatio-temporal filtering techniques to detect rumoring during a variety of disaster events, from high-casualty events in major population centers to minimally destructive events in remote areas. We consistently find three phases of response: anticipatory excitation, where warnings and alerts are issued ahead of an event; primary excitation in and around the impacted area; and secondary excitation, which frequently brings a convergence of attention from distant locales onto locations impacted by the event. Our results demonstrate the promise of spatio-temporal filtering techniques for studying rumoring in disrupted settings.

In many countries around the world, smart cities are becoming a reality. These cities contribute to improving citizens' quality of life by providing services that are normally based on data extracted from wireless sensor networks (WSN) and other elements of the Internet of Things. Additionally, public administration uses these smart city data to increase its efficiency, to reduce costs and to provide additional services. However, the information received at smart city data centers is not always accurate, because WSNs are sometimes prone to error and are exposed to physical and computer attacks. In this article, we use real data from the smart city of Barcelona to simulate WSNs and implement typical attacks. Then, we compare frequently used anomaly detection techniques to disclose these attacks. We evaluate the algorithms under different requirements on the available network status information. As a result of this study, we conclude that the one-class Support Vector Machine is the most appropriate technique. We achieve a true positive rate at least 56% higher than the rates achieved with the other compared techniques in a scenario with a maximum false positive rate of 5%, and at least 26% higher in a scenario with a false positive rate of 15%.
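The one-class SVM approach found most appropriate above can be sketched with scikit-learn; the one-dimensional synthetic readings and all parameters below are illustrative assumptions, not the Barcelona data or the authors' configuration:

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
# Hypothetical sensor readings: normal traffic around 20, attack traffic around 80.
normal_readings = rng.normal(20.0, 2.0, size=(200, 1))
attack_readings = rng.normal(80.0, 2.0, size=(20, 1))

# Train on normal data only; nu upper-bounds the fraction of training outliers.
clf = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(normal_readings)

pred = clf.predict(attack_readings)   # +1 = inlier, -1 = anomaly
print((pred == -1).mean())            # fraction of attack readings flagged
```

Because the model is trained only on normal traffic, no labeled attack data is needed, which matches the WSN setting where attacks are rare and varied.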

The feasibility of applying the Capacitively Coupled Contactless Conductivity Detection (C4D) technique to the measurement of bubble velocity in gas-liquid two-phase flow in a millimeter-scale pipe is investigated, and a new method, which combines the C4D technique and the principle of cross-correlation velocity measurement, is proposed for the measurement of bubble velocity. This research includes two parts. First, based on the principle of C4D, a new five-electrode C4D sensor is developed. Then, with two conductivity signals obtained by the C4D sensor, the velocity measurement of bubbles is implemented according to the principle of cross-correlation. The research results indicate that the C4D technique is highly effective and anticipates a broad potential in the field of two-phase flow. Experimental results show that the five-electrode C4D sensor is suitable for measuring the velocity of single bubbles with a relative error of less than 5%.
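The cross-correlation principle used here, estimating the transit time of a bubble between two sensing positions and dividing the known spacing by it, can be sketched as follows; the signals, sampling rate and electrode spacing are invented for illustration:

```python
import numpy as np

def bubble_velocity(upstream, downstream, fs, spacing):
    """Estimate bubble velocity from two conductivity signals via cross-correlation.
    fs: sampling rate (Hz); spacing: distance between sensing positions (m)."""
    u = upstream - np.mean(upstream)
    d = downstream - np.mean(downstream)
    corr = np.correlate(d, u, mode="full")
    lag = np.argmax(corr) - (len(u) - 1)   # samples by which downstream lags upstream
    return spacing * fs / lag

# Synthetic example: a bubble "pulse" arriving 50 samples later at the downstream pair.
fs, spacing = 1000.0, 0.01            # 1 kHz sampling, 10 mm spacing (hypothetical)
t = np.arange(1000)
up = np.exp(-0.5 * ((t - 400) / 10.0) ** 2)
down = np.exp(-0.5 * ((t - 450) / 10.0) ** 2)
print(bubble_velocity(up, down, fs, spacing))  # 0.01 m * 1000 Hz / 50 = 0.2 m/s
```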

Full Text Available In order to predict the mechanical behaviour of a composite during its service life, it is important to study the initiation and development of cracks and the effects induced by degradation. The onset of damage is related to the structural integrity of the component and its fatigue life. For this, among other reasons, non-destructive techniques such as acoustic emission (AE) are now widely used in composite materials characterization. This method has demonstrated excellent results in detecting and identifying initiation sites, crack propagation and fracture mechanisms of polymer matrix composite materials. At the same time, mechanical behaviour has been intimately related to the reinforcement architecture. The goal of this paper is to highlight the importance of the acoustic emission technique as a unique tool for characterising mechanical parameters in response to external stresses and degradation processes. Some results obtained from different analyses are discussed to support the significance of using AE, a technique whose use will continue to grow in the composite materials field due to its several alternatives for understanding mechanical behaviour; therefore, the objective of this manuscript is to present the benefits and advantages of AE in materials characterization.

Significant renal artery stenosis (RAS) is a potentially curable cause of renovascular hypertension and/or renal impairment. It is caused by either atherosclerosis or fibromuscular dysplasia. Correct and timely diagnosis remains a diagnostic challenge. MR angiography (MRA), as a minimally invasive method, seems suitable for RAS detection; however, its diagnostic value differs widely in the literature (sensitivity 62-100% and specificity 75-100%). The aim of our prospective study was to compare the diagnostic value of contrast-enhanced MRA utilizing parallel acquisition techniques in the detection of significant RAS with digital subtraction angiography (DSA). A total of 78 hypertensive subjects with suspected renal artery stenosis were examined on a 1.5 Tesla MR system using a body array coil. Bolus tracking was used to monitor the arrival of contrast agent to the abdominal aorta. The MRA sequence parameters were as follows: TR 3.7 ms; TE 1.2 ms; flip angle 25{sup o}; acquisition time 18 s; voxel size 1.1 mm x 1.0 mm x 1.1 mm; centric k-space sampling; parallel acquisition technique with acceleration factor of 2 (GRAPPA). Renal artery stenosis of 60% and more was considered hemodynamically significant. The results of MRA were compared to digital subtraction angiography serving as a standard of reference. Sensitivity and specificity of MRA in the detection of hemodynamically significant renal artery stenosis were 90% and 96%, respectively. Prevalence of RAS was 39% in our study population. Contrast-enhanced MRA with high spatial resolution offers sufficient sensitivity and specificity for screening of RAS.

Optical spectroscopy is an ideal method for detecting bacteria and spores in real time. Optical fluorescence spectroscopy examination of New York City aerosols is used to quantify the mass of bacteria spores present in air masses collected at 14 liters/minute onto silica fiber filters, and on silica fiber ribbons using an Environmental Beta Attenuation Monitor manufactured by MetOne Instruments configured for the PM2.5 fraction. Dipicolinic acid (DPA), a molecule found primarily in bacterial spores, is the most characteristic component of spores in trial experiments on over 200 collected aerosol samples. DPA is extracted from the spores using a heat bath and chelated with terbium. The DPA:Tb is detected by measuring its characteristic fluorescence with emission bands at 490, 545 and 585 nm for 270 nm excitation. Light scattering also measures the size distribution for a variety of bacteria - Bacillus subtilis (rod shaped), Staphylococcus aureus (spherical) and Pseudomonas aeruginosa (short rods) - establishing that optical techniques satisfactorily distinguish populations based on their variable morphology. Size and morphology are obtained by applying a variation of the Gaussian ray approximation of anomalous diffraction theory to an analysis of the transmission spectra in the range of 0.4 to 1.0 microns. In test experiments, the refractive index of the inner spore core of Bacillus subtilis decreases from 1.51 to 1.39 while the spore radius enlarges from 0.38 to 0.6 micrometers. Optical determinations are verified by oil-immersion techniques and by scanning electron microscope measurements. Characterization of spores, germinating spore materials, and bacteria is considered vital to tracing bacteria in the environment, for the development of life-detection systems for planetary exploration, monitoring pathogens in environmental systems, and for the preparation of anti-terrorism strategies.

Recently, unstable trinucleotide repeats have been shown to be the etiologic factor in seven neuropsychiatric diseases, and they may play a similar role in other genetic disorders which exhibit genetic anticipation. We have tested one polymerase chain reaction (PCR)-based and two hybridization-based methods for direct detection of unstable DNA expansion in genomic DNA. This technique employs a single primer (asymmetric) PCR using total genomic DNA as a template to efficiently screen for the presence of large trinucleotide repeat expansions. High-stringency Southern blot hybridization with a PCR-generated trinucleotide repeat probe allowed detection of the DNA fragment containing the expansion. Analysis of myotonic dystrophy patients containing different degrees of (CTG){sub n} expansion demonstrated the identification of the site of trinucleotide instability in some affected individuals without any prior information regarding genetic map location. The same probe was used for fluorescent in situ hybridization and several regions of (CTG){sub n}/(CAG){sub n} repeats in the human genome were detected, including the myotonic dystrophy locus on chromosome 19q. Although limited at present to large trinucleotide repeat expansions, these strategies can be applied to directly clone genes involved in disorders caused by large expansions of unstable DNA.

The non-destructive nuclear material detection technique is one of the novel methods for hazardous environments, for example high-level radiation or landmine areas. In particular, the detection of landmines is a hot issue in the peaceful use of nuclear technology for human welfare. Generally, explosives contain specific elements such as {sup 14}N or {sup 35}Cl. Photo-nuclear resonance gamma-rays are produced by the nuclear reaction {sup 13}C(p, {gamma}){sup 14}N or {sup 34}S(p, {gamma}){sup 35}Cl, in which the target is bombarded by a proton beam of about 2 MeV extracted from the proton accelerator. To avoid other neighboring resonant gamma-rays, we selected a resonant energy above 5 MeV. The resonance gamma rays produced are absorbed or scattered when they react with the {sup 14}N or {sup 35}Cl contained in mines and explosives. We can determine the existence and position of mines or explosives by detecting the absorbed and scattered gamma-ray signals.

A highly accurate method for the determination of the detection efficiency of a silicon single-photon avalanche diode (Si-SPAD) is presented. This method is based on the comparison of the detected count rate of the Si-SPAD with the photon rate determined from a calibrated silicon diode using a modified attenuator technique, in which the total attenuation is measured in two attenuation steps. Furthermore, a validation of this two-step method is performed using attenuators of higher transmittance. The setup is a tabletop one, laser-based, and fully automated. The measurement uncertainty components are determined and analyzed in detail. The obtained standard measurement uncertainty is < 0.5%. The main contributions are the transmission of the neutral density filters used as attenuators and the spectral responsivity of the calibrated analog silicon diode. Furthermore, the dependence of the detection efficiency of the Si-SPAD on the mean photon number of the impinging laser radiation with Poissonian statistics is investigated.

Full Text Available DDOS stands for "distributed denial of service": many "zombie" or "daemon" computers performing a DOS (Denial of Service) attack on one computer, usually directed by one "master". In MANETs, DOS attacks not only consume scarce system resources, such as bandwidth, battery energy, or CPU cycles, but also isolate legitimate users from the network. DOS attacks may impact network connectivity seriously and may further undermine networking functions. In DRDOS attacks, the victim is bombarded by reflected response packets from legitimate communicating nodes, and thus it is difficult to distinguish attack packets from legitimate packets. In this paper, we propose a defense mechanism based on CARD-based DRDOS attack detection and prevention techniques in MANETs. The proposed rate limiting scheme penalizes the different attackers based on their rate limits and server load. The victim-end defense system decreases the rate limit exponentially and increases it linearly based on the attack traffic rate. Finally, this approach is discussed in three phases, detection, control and prevention, which are explained in the CARD detection architecture.
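The exponential-decrease / linear-increase policy described for the victim-end defense can be sketched as a toy update rule; the threshold, factors and names below are hypothetical, chosen only to illustrate the shape of the scheme:

```python
def update_rate_limit(current_limit, traffic_rate, threshold,
                      decrease_factor=0.5, increase_step=1.0,
                      min_limit=1.0, max_limit=100.0):
    """Decrease the rate limit exponentially while traffic exceeds the attack
    threshold; increase it linearly once traffic returns to normal."""
    if traffic_rate > threshold:
        new_limit = current_limit * decrease_factor   # multiplicative decrease
    else:
        new_limit = current_limit + increase_step     # additive increase
    return max(min_limit, min(max_limit, new_limit))

limit = 100.0
for rate in [500, 500, 500, 10, 10, 10]:   # burst of attack traffic, then calm
    limit = update_rate_limit(limit, rate, threshold=50)
print(limit)  # halved three times (100 -> 12.5), then +1 three times -> 15.5
```

The asymmetry (fast cut, slow recovery) is what throttles an attacker quickly while letting legitimate flows regain bandwidth gradually.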

We investigate the parameters that govern an unsupervised anomaly detection framework that uses nonlinear techniques to learn a better model of the non-anomalous data. A manifold or kernel-based model is learned from a small, uniformly sampled subset in order to reduce computational burden and under the assumption that anomalous data will have little effect on the learned model because their rarity reduces the likelihood of their inclusion in the subset. The remaining data are then projected into the learned space and their projection errors used as detection statistics. Here, kernel principal component analysis is considered for learning the background model. We consider spectral data from an 8-band multispectral sensor as well as panchromatic infrared images treated by building a data set composed of overlapping image patches. We consider detection performance as a function of patch neighborhood size as well as embedding parameters such as kernel bandwidth and dimension. ROC curves are generated over a range of parameters and compared to RX performance.
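A minimal sketch of the projection-error statistic described above, using scikit-learn's KernelPCA on synthetic data; the ring-shaped background, subset size and kernel parameters are illustrative assumptions, not the authors' pipeline:

```python
import numpy as np
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(1)
# Background: points on a noisy ring; an off-manifold point plays the anomaly.
theta = rng.uniform(0, 2 * np.pi, 300)
background = np.c_[np.cos(theta), np.sin(theta)] + rng.normal(0, 0.05, (300, 2))

# Learn the background model from a small uniformly sampled subset, as in the text.
subset = background[rng.choice(300, 50, replace=False)]
kpca = KernelPCA(n_components=2, kernel="rbf", gamma=2.0,
                 fit_inverse_transform=True).fit(subset)

def projection_error(points):
    """Input-space reconstruction error used as the detection statistic:
    project into the learned kernel space, map back, measure the residual."""
    recon = kpca.inverse_transform(kpca.transform(points))
    return np.linalg.norm(points - recon, axis=1)

print(projection_error(background[:5]))          # errors for background points
print(projection_error(np.array([[0.0, 0.0]])))  # error for the off-manifold point
```

Points poorly described by the learned background model typically project with larger residuals, which is what the ROC analysis in the text thresholds on.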

The main signature for anti-neutrino detection in reactor and geo-neutrino experiments based on scintillators is provided by the space–time coincidence of positron and neutron produced in the Inverse Beta Decay reaction. Such a signature strongly suppresses backgrounds and allows for measurements performed underground with a relatively high signal-to-background ratio. In an aboveground environment, however, the twofold coincidence technique is not sufficient to efficiently reject the high background rate induced by cosmogenic events. Enhancing the positron–neutron twofold coincidence efficiency may pave the way to future aboveground detectors for reactor monitoring. We propose a new detection scheme based on a threefold coincidence, among the positron ionization, the ortho-positronium (o-Ps) decay, and the neutron capture, in a sandwich detector with alternated layers of plastic scintillator and aerogel powder. We present the results of a set of dedicated measurements on the achievable light yield and on the o-Ps formation and lifetime. The efficiencies for signal detection and background rejection of a preliminary detector design are also discussed.

In this paper, a new readout strategy for CMOS image sensors is presented. It aims to overcome the excessive output dataflow bottleneck; this challenge is becoming more and more crucial along with technology miniaturization. The strategy is based on the suppression of spatial redundancies. It leads the sensor to perform edge detection and eventually provide a binary image. One of the main advantages of this readout technique compared to other techniques existing in the literature is that it does not affect the in-pixel circuitry. This means that all the analogue processing circuitry is implemented outside the pixel, which keeps the pixel area and fill factor unchanged. The main analogue block used in this technique is an event detector developed and designed in the CMOS 0.35 μm technology from Austria Micro Systems. The simulation results of this block, as well as those of a test bench composed of several pixels and column amplifiers using this readout mode, show its capability to reduce dataflow by controlling the ADCs. We must mention that this readout strategy is applicable to sensors that use a linear operating pixel element as well as to those based on logarithmic operating pixels. This readout technique is emulated by a MATLAB model which gives an idea of the expected functionality and dataflow reduction rates (DRR). Emulation results are shown later, giving the pre- and post-processed images as well as the DRR. The latter does not have a fixed value since it depends on the spatial frequency of the filmed scenes and the chosen threshold value.

Full Text Available Objective: The aim of the present study was to compare the performance of the laser fluorescence (LF) method with other conventional diagnostic techniques in the detection of small occlusal caries in permanent teeth. Materials and Methods: Prior to this in vitro diagnostic study, a pilot study assessed intra-examiner reliability and reproducibility. The occlusal surfaces of 90 extracted human premolars were examined with four diagnostic methods: probing, visual inspection, bitewing (BW) radiographs, and LF. The teeth were then sectioned for the purpose of histological examination. The data were analyzed using SPSS 15 software, and the sensitivity, specificity and other diagnostic criteria of the techniques were calculated. Results: The intra-examiner reproducibility for the probing and radiographic techniques was 100%. The corresponding figure for LF (88%) was higher than that for visual inspection (82%). The highest validity for the examiner was obtained with the probing technique (76.9%). The sensitivity of the visual inspection, probing, and LF methods was 54.5%, and that of BW radiography was 27.5%. The specificity and efficiency of the LF method were 84.8% and 81.1%, respectively. Probing and visual inspection showed the highest specificity (97.5% and 94.9%, respectively) and efficiency (92.2% and 90%, respectively) among the methods. Conclusion: The specificity and efficiency of the LF method were lower compared to those of the other methods. Among all the investigated methods, the most efficient in the diagnosis of small occlusal caries in permanent teeth were probing and visual inspection, respectively.
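The diagnostic criteria reported above follow from a 2x2 confusion table against the histological gold standard; a minimal sketch with hypothetical counts (for illustration only, not the study's raw data):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity and efficiency (overall accuracy) of a test
    judged against a gold-standard reference (here, histology)."""
    sensitivity = tp / (tp + fn)              # detected caries / all true caries
    specificity = tn / (tn + fp)              # sound ruled out / all truly sound
    efficiency = (tp + tn) / (tp + fp + tn + fn)  # all correct / all examined
    return sensitivity, specificity, efficiency

# Hypothetical counts for 90 teeth:
se, sp, eff = diagnostic_metrics(tp=30, fp=5, tn=45, fn=10)
print(round(se, 3), round(sp, 3), round(eff, 3))  # 0.75 0.9 0.833
```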

Photometric measurements are prone to systematic errors presenting a challenge to low-amplitude variability detection. In search for a general-purpose variability detection technique able to recover a broad range of variability types including currently unknown ones, we test 18 statistical characteristics quantifying scatter and/or correlation between brightness measurements. We compare their performance in identifying variable objects in seven time series data sets obtained with telescopes ranging in size from a telephoto lens to 1 m-class and probing variability on time-scales from minutes to decades. The test data sets together include light curves of 127 539 objects, among them 1251 variable stars of various types and represent a range of observing conditions often found in ground-based variability surveys. The real data are complemented by simulations. We propose a combination of two indices that together recover a broad range of variability types from photometric data characterized by a wide variety of sampling patterns, photometric accuracies and percentages of outlier measurements. The first index is the interquartile range (IQR) of magnitude measurements, sensitive to variability irrespective of a time-scale and resistant to outliers. It can be complemented by the ratio of the light-curve variance to the mean square successive difference, 1/η, which is efficient in detecting variability on time-scales longer than the typical time interval between observations. Variable objects have larger 1/η and/or IQR values than non-variable objects of similar brightness. Another approach to variability detection is to combine many variability indices using principal component analysis. We present 124 previously unknown variable stars found in the test data.
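The two proposed indices are straightforward to compute; a minimal sketch on synthetic light curves (the cadence, noise level and star parameters are invented for illustration):

```python
import numpy as np

def variability_indices(mag):
    """IQR of the magnitudes, and 1/eta: the light-curve variance divided by
    von Neumann's mean square successive difference."""
    mag = np.asarray(mag, dtype=float)
    iqr = np.percentile(mag, 75) - np.percentile(mag, 25)
    msd = np.mean(np.diff(mag) ** 2)   # mean square successive difference
    inv_eta = np.var(mag) / msd
    return iqr, inv_eta

rng = np.random.default_rng(2)
t = np.linspace(0, 10, 200)
constant = 15.0 + rng.normal(0, 0.01, t.size)                    # non-variable star
variable = 15.0 + 0.3 * np.sin(t) + rng.normal(0, 0.01, t.size)  # slow sinusoid

print(variability_indices(constant))
print(variability_indices(variable))  # larger IQR and 1/eta for the variable star
```

For pure white noise 1/η is near 0.5 (successive differences are large relative to the variance), while smooth variability on time-scales longer than the sampling interval drives 1/η well above that, which is exactly the behavior the text exploits.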

The SFS technology has already proved its analytical capabilities in a variety of industrial and environmental tasks. Recently it has been introduced for forensic applications. The key features of the SFS method, measuring a 3-dimensional fluorescence spectrum of the sample (intensity versus excitation and emission wavelengths) followed by recognition of the specific SFS spectral patterns corresponding to individual drugs, provide an effective tool for the analysis of untreated seized samples, without any separation of the substance of interest from its mixture with accompanying cutting agents and diluents as a preparatory step. In this approach the chemical analysis of the sample is substituted by the analysis of the SFS matrix visualized as an optical image. The SFS technology of drug detection is realized in the NarTest® NTX2000 analyzer, a compact device intended to measure suspicious samples in liquid, solid and powder forms. It simplifies the detection process due to fully automated SFS measurement procedures and an integrated expert system for recognition of spectral patterns. Presently the expert system of the NTX2000 is able to detect marijuana, cocaine, heroin, MDMA, amphetamine and methamphetamine, with a detection limit down to 5% drug concentration in various mixtures. Numerous tests with street samples confirmed that the SFS method provides reliable results with high sensitivity and selectivity for the identification of drugs of abuse. More than 3000 street samples of the aforesaid drugs were analyzed with the NTX2000 during the validation process, and the correspondence between the SFS results and the conclusions of standard forensic analyses with GC/MS techniques was 99.4%.

We present an investigation of charge-dependent physical properties of water-soluble synthetic polymers and polymer-based layered structures, using optical detection methods in the visible range. We apply in situ nanoscale optical techniques to study the response of polymer systems to changes in pH, polymer concentration, and the concentration and type of counterions. This work describes three optical techniques and custom-built instrumental setups for nanoscale polymer characterization in aqueous environments. Phase-modulated ellipsometry was applied to determine the refractive index and the thickness of a hydrogel-like polymer layer on a substrate. The present study describes the sensitivity of phase-modulated ellipsometry to measurement errors and determines conditions for decoupling the film thickness and refractive index. It is shown that, for a certain range of film thickness, both the thickness and the refractive index can be determined from a single measurement with high precision. This optimal range of film thickness is calculated for organic thin films, and the analysis is tested on crosslinked poly(methacrylic acid) polymer films in air and in water. Fluorescence correlation spectroscopy was used to investigate the diffusion of a synthetic polyelectrolyte in aqueous solutions. Translational diffusion of Alexa-labeled poly(methacrylic acid) chains was studied in very dilute, 10⁻⁴ mg/ml, solutions as a function of polymer charge density and counterion concentration. The results illustrate the utility of the technique for studying hydrodynamic sizes of polyelectrolyte coils in response to variation in solution pH or the concentration of salt and polyelectrolytes. We apply surface-enhanced Raman scattering (SERS) to study the enhancement capabilities of individual silver nanoparticles attached to glass and silicon substrates. Nanoparticles were electrostatically bound to a self-assembled polyallylamine hydrochloride (PAH) monolayer, which was deposited on

A new Compton X-ray backscatter imaging technique called lateral migration radiography (LMR) is applied to detecting a class of sub-surface defects in materials and structures of industrial importance. These include flaws and defects for which there is either no known detection method or no effective one. Examples are delamination in layered composite structures, defects in deposited coatings on metal surfaces such as in aircraft jet engine components, and geometrical structural/composition changes (e.g. due to corrosion) on the inside of shell-like components with only outside surface access. Research efforts include: the construction of simulated flawed test objects on which experimental measurements are performed to establish LMR flaw detection capabilities; Monte Carlo simulations of these measurements to assist in predicting optimum source-detector configurations and to help obtain a detailed understanding of the physics of lateral migration in small voids and how this impacts the resulting LMR image contrasts; the procurement of samples of materials of industrial importance with flaws and defects; the application of LMR to the detection of flaws and defects in these samples; the development of a multi-detector scanning system to provide faster, more effective flaw detection; and a determination, for the types of samples examined, of the limits and capabilities of flaw detection using LMR. LMR imaging measurements on the machined samples showed that the optimum contrast in flaw-to-background signal intensity occurred at an X-ray energy of 75 kVp for the aluminum samples and at 35 kVp for the Delrin sample. Monte Carlo simulations and experimental measurements on the aluminum samples showed that LMR is capable of detecting defects down to the tens-of-microns range. Measurements on the aluminum samples also showed that LMR is capable of detecting relatively small composition variations; a 30% difference in image intensity was

Concrete is prone to internal deterioration and defects during construction and operation. Compared with other nondestructive techniques, infrared thermography can detect subsurface delamination in a very short period of time, but accurately identifying its size and depth in concrete is a very challenging task. In this study, experimental testing was carried out on a concrete specimen having internal delaminations of various sizes at varying depths. Delaminations at 1 and 2 cm deep showed a good temperature contrast after only 5 minutes of heating, whereas delaminations at 3 cm deep required about 15 minutes of heating before a practical temperature contrast could be identified. In addition, the size of the delamination at 3 cm deep could be estimated with a difference of 10% to 28% for 20 minutes of heating. The depth of the delamination was linearly correlated with the increase in its size.

A goaf is a cavity left where coal has been removed or mined out. These cavities change the original geostress equilibrium of the stratigraphic system and cause local geostress focusing or concentration. Consequently, the surrounding rock of a goaf may be deformed, fractured, displaced and caved under the combined effect of gravity and geostress. With little or no effective mining control, widespread cracks, fractures and even subsidence of the rock mass above the goaf will not only lead to groundwater depletion, farmland destruction and deterioration of the ecological environment, but also present a serious threat to mining safety, engineering construction, and even people's lives and property. It is therefore important to locate the boundary of the goaf and to evaluate its stability in order to provide the basis for comprehensive control in the later period of mining. This article explores a new geophysical method, the 2D microtremor profiling technique, for goaf detection and mapping. The technique is based on microtremor array theory (Aki, 1957; Ling, 1994; Okada, 2003), utilizing spatial autocorrelation analysis to obtain Rayleigh-wave dispersion curves for apparent S-wave velocity (Vx) calculation (Ling & Miwa, 2006; Xu et al., 2012). A laterally continuous S-wave velocity section can then be obtained through data interpolation. The final result is used for interpreting lateral changes in lithology and geological structures. A case study from Henan Province, China, serves as an example. The coal seams in the survey area were about 150-250 m deep. A triple-circular array was adopted for acquiring microtremor data, with observation radii of 20, 40 and 80 m, respectively, and a sampling interval of 50 m. We observed the following characteristics of the goaf area from the microtremor Vx section: (1) an obvious low pseudo-velocity anomaly corresponding to the limestone layer below the goaf; (2

Nuclear quadrupole resonance (NQR) is a highly promising new technique for bulk explosives detection: relatively inexpensive, more compact than NMR, but with considerable selectivity. Since the NQR frequency is insensitive to long-range variations in composition, mixing explosives with other materials, such as the plasticizers in plastic explosives, makes no difference. The NQR signal strength varies linearly with the amount of explosive, and is independent of its distribution within the volume monitored. NQR spots explosive types in configurations missed by X-ray imaging methods. But if NQR is so good, why is it not used everywhere? Its main limitation is the low signal-to-noise ratio, particularly with the radio-frequency interference that exists in a field environment, NQR polarization being much weaker than that from an external magnetic field. The distinctive signatures are there, but are difficult to extract from the noise. In addition, the high selectivity is partly a disadvantage, as it is hard to bui...

Charge transport by tunnelling is one of the most ubiquitous elementary processes in nature. Small structural changes in a molecular junction can lead to significant differences in single-molecule electronic properties, offering a tremendous opportunity to examine a reaction on the single-molecule scale by monitoring conductance changes. Here, we explore the potential of the single-molecule break junction technique in the detection of photo-thermal reaction processes of a photochromic dihydroazulene/vinylheptafulvene system. Statistical analysis of the break junction experiments provides a quantitative approach for probing the reaction kinetics and reversibility, including the occurrence of isomerization during the reaction. The product ratios observed when switching the system in the junction do not follow those observed in solution studies (both experiment and theory), suggesting that the junction environment perturbs the process significantly. This study opens the possibility of using nano-structured environments like molecular junctions to tailor product ratios in chemical reactions.

Based on five years of tidal gravity observations recorded with a superconducting gravimeter at the Wuhan International Tidal Gravity Reference Station, special gravity signals in sub-tidal bands associated with possible translational oscillations of the Earth's solid inner core are detected and studied, using a wavelet transformation technique for the first time. The analysis is conducted on gravity residuals after removing the synthetic tidal gravity signals and air pressure perturbation from the original observations, demonstrating that gravity oscillation signals exist in the 4-6 h bands with amplitudes at the nGal level. However, it is found that the frequency and amplitude of this kind of oscillation signal change with time, and the analysis shows that these oscillation signals are probably excited by some non-continuous source with very low amplitude.

Potential approaches for reducing enteric methane (CH4) emissions from cattle will require verification of their efficacy at the paddock scale. We designed a micrometeorological approach to compare emissions from two groups of grazing cattle. The approach consists of measuring line-averaged CH4 mole fractions upwind and downwind of each group and using a backward-Lagrangian stochastic model to compute CH4 emission rates from the observed mole fractions, in combination with turbulence statistics measured by a sonic anemometer. With careful screening for suitable wind conditions, a difference of 10% in group emission rates could be detected. This result was corroborated by simultaneous measurements of daily CH4 emissions from each animal with the sulfur hexafluoride (SF6) tracer-ratio technique.

We report on a new optical gating technique used for the direct photoconductive detection of short pulses of terahertz radiation with resolution down to 250 femtoseconds. The femtosecond optical laser pulse, time delayed with respect to the THz pulse, generated a large concentration of electron-hole pairs in the AlGaAs/InGaAs High Electron Mobility Transistor (HEMT), drastically increasing the conductivity on the femtosecond scale and effectively shorting the source and drain. This optical gating quenched the response of the plasma waves launched by the THz pulse and allowed us to reproduce the waveform of the THz pulse by varying the time delay between the THz and quenching optical pulses. The results are in excellent agreement with electro-optic effect measurements and with our hydrodynamic model, which predicts an ultra-fast transistor plasmonic response on a time scale much shorter than the electron transit time, in full agreement with the measured data.

In telemedicine, tampering may be introduced while transferring medical images. Before making any diagnostic decisions, the integrity of the region of interest (ROI) of the received medical image must be verified to avoid misdiagnosis. In this paper, we propose a novel fragile block-based medical image watermarking technique to avoid embedding distortion inside the ROI, verify the integrity of the ROI, accurately detect tampered blocks inside the ROI, and recover the original ROI with zero loss. In the proposed method, the medical image is segmented into three sets of pixels: ROI pixels, region of non-interest (RONI) pixels, and border pixels. Then, authentication data and information about the ROI are embedded in the border pixels. Recovery data for the ROI is embedded into the RONI. Results of experiments conducted on a number of medical images reveal that the proposed method produces high-quality watermarked medical images, identifies the presence of tampering inside the ROI with 100% accuracy, and recovers the original ROI without any loss.
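
The core idea of embedding authentication data outside the ROI can be illustrated with a deliberately simplified sketch (this is not the authors' block-based scheme; the helper names, the SHA-256 digest, and the LSB embedding are illustrative assumptions): hash the ROI pixels, store the digest bits in the least-significant bits of border pixels, and recompute the hash at verification time.

```python
import hashlib
import numpy as np

def _roi_digest_bits(image, roi):
    """256-bit SHA-256 digest of the ROI pixels, as a list of bits."""
    digest = hashlib.sha256(image[roi].tobytes()).digest()
    return [(byte >> b) & 1 for byte in digest for b in range(8)]

def embed_roi_authentication(image, roi, border):
    """Embed the ROI digest into the LSBs of border pixels.
    `roi` is a (row_slice, col_slice); `border` coordinates must lie
    outside the ROI, so embedding never distorts the ROI itself."""
    wm = image.copy()
    for (r, c), bit in zip(border, _roi_digest_bits(wm, roi)):
        wm[r, c] = (wm[r, c] & 0xFE) | bit
    return wm

def verify_roi_integrity(image, roi, border):
    """Recompute the ROI digest and compare it with the stored bits;
    any change to a ROI pixel flips the digest and fails the check."""
    expected = _roi_digest_bits(image, roi)
    stored = [int(image[r, c]) & 1 for (r, c) in border[:len(expected)]]
    return stored == expected
```

The actual paper additionally embeds recovery data in the RONI so the original ROI can be restored; the sketch above covers only the tamper-detection half of the design.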

Fires in general, and forest fires in particular, are a major concern in terms of economic and biological losses. Remote sensing research has focused on developing algorithms, adapted to many kinds of sensors, platforms and regions, in order to obtain hotspots as fast as possible. The aim of this study is to establish an automatic methodology, based on machine learning techniques, for developing hotspot detection algorithms with the Spinning Enhanced Visible and Infrared Imager (SEVIRI) sensor on board the Meteosat Second Generation platform (MSG) that is exportable to other geostationary platforms and sensors and to any area of the Earth. The sensitivity (SE), specificity (SP) and accuracy (AC) parameters have been analyzed in order to develop the final machine learning algorithm, taking into account the preferences and final use of the predicted data.
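
The three evaluation parameters named above are standard confusion-matrix metrics. As a minimal sketch (the function name and the binary hotspot encoding are assumptions, not from the paper):

```python
def confusion_metrics(y_true, y_pred):
    """Sensitivity (SE), specificity (SP) and accuracy (AC) for a
    binary hotspot / no-hotspot classification (1 = hotspot)."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    se = tp / (tp + fn)   # fraction of real hotspots detected
    sp = tn / (tn + fp)   # fraction of non-fire pixels correctly rejected
    ac = (tp + tn) / len(y_true)
    return se, sp, ac
```

For fire detection, SE and SP pull in opposite directions: a more permissive threshold raises SE (fewer missed fires) at the cost of SP (more false alarms), which is exactly the trade-off the final algorithm must balance against the intended use of the predictions.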

To date, significant exploitation of coal mines has left behind a considerable number of undetermined empty spaces, also known as gob areas. The existence of these areas can make the overlying terrane lose its gravity support. Inhomogeneous sinkage of the overlying terrane can dramatically destroy buildings constructed on it, which is a classic geological disaster. In the current study, the crosswell seismic mechanism was addressed and applied to detect the gob area distribution and, especially, to measure the compaction extent of the gob areas. The results clearly show that the crosswell seismic technique is a very powerful method for discovering the distribution and compaction degree of gob areas. More importantly, the current findings provide a novel way of evaluating the compaction extent of gob areas.

This paper investigates the possibility of improving target detection capability in a DVB-T-based passive radar sensor by jointly exploiting multiple digital television channels broadcast by the same transmitter of opportunity. Based on the remarkable results obtained by such a multi-frequency approach using other signals of opportunity (i.e., FM radio broadcast transmissions), we propose appropriate modifications to the previously devised signal processing techniques for them to be effective in the newly considered scenarios. The resulting processing schemes are extensively applied against experimental DVB-T-based passive radar data pertaining to different surveillance applications. The obtained results clearly show the effectiveness of the proposed multi-frequency approaches and demonstrate their suitability for application in the considered scenarios.

Remote sensing and GIS techniques have been used to detect the shoreline changes along the Miankaleh peninsula promontory at the Gorgan Bay entrance over the last three decades (1975-2002). For this purpose satellite data including LANDSAT ETM+, TM, SPOT, ASTER L1A and RADARSAT have been analyzed. SPOT-Pan data were georeferenced with respect to 1:50 000 topographic maps using a Universal Transverse Mercator (UTM) projection, and then all the needed data sets were registered to the SPOT-Pan image. The hydrological data showed a rapid rise of the Caspian Sea level by 2.6 m between 1975 and 1996.

The Center for Cave and Karst Studies, CCKS, has been using microgravity to locate caves from the ground surface since 1985. The geophysical subsurface investigations began during a period when explosive and toxic vapors were rising from the karst aquifer under Bowling Green into homes, businesses, and schools. The USEPA provided the funding for this Superfund Emergency, and the CCKS was able to drill numerous wells into low-gravity anomalies to confirm and even map the route of caves in the underlying limestone bedrock. In every case, a low-gravity anomaly indicated a bedrock cave, a cave with a collapsed roof, or a location where a bedrock cave had collapsed and filled with alluvium. At numerous locations, several wells were cored into microgravity anomalies, and in every case additional wells were drilled on both sides of the anomalies to confirm that the technique was in fact reliable. The wells cored on both sides of the anomalies did not intersect caves but instead intersected virtually solid limestone. Microgravity also easily detected storm sewers and even sanitary sewers, sometimes six meters (twenty feet) beneath the surface. Microgravity has also been used on many occasions to investigate sinkhole collapses. It identified potential collapse areas by detecting voids in the unconsolidated material above bedrock. The system will soon be tested over known tunnels and then during a blind test along a section of the U.S. border at Nogales, Arizona. The CCKS has experimented with other geophysical techniques, particularly ground penetrating radar, seismic and electrical resistivity. In the late 1990s the CCKS started using the Swift/Sting resistivity meter to perform karst geophysical subsurface investigations. The system provides good depth-to-bedrock data, but it is often difficult to interpret bedrock caves from the modeled data. The approach now typically used by the CCKS for karst subsurface investigations is electrical resistivity traverses

Natural crude-oil seepages, together with the oil released into seawater as a consequence of oil exploration/production/transportation activities and operational discharges from tankers (i.e., oil dumped during cleaning actions), represent the main sources of sea oil pollution. Satellite remote sensing can be a useful tool for the management of such types of marine hazards, namely oil spills, mainly owing to the synoptic view and the good trade-off between spatial and temporal resolution, depending on the specific platform/sensor system used. In this paper, an innovative satellite-based technique for oil spill detection, based on the general robust satellite technique (RST) approach, is presented. It exploits the multi-temporal analysis of data acquired in the visible channels of the Moderate Resolution Imaging Spectroradiometer (MODIS) on board the Aqua satellite in order to automatically and quickly detect the presence of oil spills on the sea surface, while attempting to minimize “false detections” caused by spurious effects associated with, for instance, cloud edges, sun/satellite geometries, sea currents, etc. The oil spill event that occurred in June 2007 off the south coast of Cyprus in the Mediterranean Sea has been considered as a test case. The resulting data, the reliability of which has been evaluated both by carrying out a confutation analysis and by comparing them with those provided by another independent MODIS-based method, showcase the potential of RST in identifying the presence of oil with a high level of accuracy.

In this paper, a nondestructive technique is used as a tool to control cracks and microcracks in materials. Simulation by a numerical approach such as the finite element method is employed to detect cracks and, eventually, to study their propagation using a crucial parameter, the stress intensity factor. This approach has been used in the aircraft industry to control cracks. In addition, it makes it possible to highlight defects in parts while preserving the integrity of the controlled products. On the other hand, the demonstrated reliability of defect control gives convincing results for improving the quality and safety of the material. Eddy current testing (ECT) is a standard technique in industry for the detection of surface-breaking flaws in magnetic materials such as steels. In this context, simulation tools can be used to improve the understanding of experimental signals, optimize the design of sensors or evaluate the performance of ECT procedures. CEA-LIST has for many years developed semi-analytical models embedded in the simulation platform CIVA dedicated to non-destructive testing. The developments presented herein address the case of flaws located inside a planar and magnetic medium. Simulation results are obtained through the application of the Volume Integral Method (VIM). When considering the ECT of a single flaw, a system of two differential equations is derived from the Maxwell equations. The numerical resolution of the system is carried out using the classical Galerkin variant of the Method of Moments. In addition, the probe response is calculated by application of the Lorentz reciprocity theorem. Finally, the approach itself as well as comparisons between simulation results and measured data are presented.

Investigating the intensity of atmospheric water deposition and its diurnal distribution is essential from the ecological perspective, especially in dry geographic regions. It is also important in the context of monitoring the amount of moisture present within building materials in order to protect them from excessive humidity. The objective of this study was to test a constructed sensor and determine whether it could detect and track changes in the intensity of atmospheric water deposition. The operating principle of the device is based on the time-domain reflectometry technique. Two sensors of different plate volumes were manufactured. They were calibrated at several temperatures and tested during field measurements. The calibration turned out to be temperature independent. The outdoor measurements indicated that the upper limits of the measurement ranges of the sensors depended on the volumes of the plates and were equal to 1.2 and 2.8 mm H2O. The respective sensitivities were equal to 3.2 × 10⁻³ and 7.5 × 10⁻³ g·ps⁻¹. The conducted experiments showed that the construction of the designed device and the time-domain reflectometry technique were appropriate for detecting and tracing the dynamics of atmospheric water deposition. The obtained outcomes were also collated with readings taken in an actual soil sample. For this purpose, an open-container sensor, which allows investigating atmospheric water deposition in soil, was manufactured. It turned out that the readings taken by the porous ceramic plate sensor reflected the outcomes of the measurements performed in a soil sample.

The Toba Caldera, a large depression formed by Quaternary volcanism, is a remarkable feature at the Earth's surface. The last Toba super-eruption was recorded around 73 ka and produced the Youngest Toba Tuff of about 2,800 km3. Since then, there is no record of significant volcanic seismicity at the Toba Volcanic Complex (TVC). However, hydrothermal activities are still ongoing, as evidenced by the existence of hot springs and alteration zones at the northwest caldera. The hydrothermal fluids, probably containing some chemical compositions, mixed with surficial water pollutants and contaminated Toba Lake. Therefore, environmental issues related to the existence of chemical compositions and the degradation of water clearness in the lake have been raised in the local community. The pollutant sources are debatable between natural and anthropogenic influences because some human activities, such as hotels, tourism, husbandry, aquaculture, as well as urbanization, grow rapidly at and around the lake. Therefore, obtaining correct information about the source materials floating at the surface of Toba Lake is crucial for environmental and hazard mitigation purposes. To address this problem, this paper assesses the possible sources of floating materials at Toba Lake, especially natural sources such as hydrothermal activities of the TVC and river stream sediments. The Spectral Angle Mapper (SAM) technique, using atmospherically corrected Landsat-8 data, and colour composites of Polarimetric Synthetic Aperture Radar (PolSAR) were used to map the distribution of floating materials. Seven ground-truth points were used to confirm the correctness of the proposed method. Based on the SAM and PolSAR techniques, we could detect the interface of hydrothermal fluid at the lake surface. Various distributions of stream sediment were also detected from the river mouth to the lake. The influence possibilities of the upwelling process from the bottom floor of Toba Lake were also
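
The Spectral Angle Mapper used here classifies a pixel by the angle between its spectrum and a reference spectrum, which makes it insensitive to illumination scaling. A minimal sketch of the standard SAM computation (function names and the angle threshold are illustrative assumptions, not the study's implementation):

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Spectral angle (radians) between a pixel spectrum and a
    reference spectrum; invariant to overall brightness scaling."""
    pixel = np.asarray(pixel, float)
    reference = np.asarray(reference, float)
    cos_a = pixel @ reference / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos_a, -1.0, 1.0))

def sam_classify(pixel, references, max_angle=0.1):
    """Assign the pixel to the reference class with the smallest
    angle, or None if no reference lies within max_angle."""
    angles = [spectral_angle(pixel, ref) for ref in references]
    best = int(np.argmin(angles))
    return best if angles[best] <= max_angle else None
```

Because a shaded and a sunlit pixel of the same material differ mainly by a scale factor, their spectral angle to the reference is nearly identical, which is why SAM is popular for mapping surface materials such as hydrothermal products and stream sediments.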

We discuss the optical-heterodyne-detection technique to study the absorption and dispersion of a probe beam propagating through a medium with a narrow resonance. The technique has been demonstrated for Rydberg electromagnetically induced transparency in rubidium thermal vapor, and the optical nonlinearity of a probe beam with variable intensity has been studied. A quantitative comparison of the experimental results with a suitable theoretical model is presented. The limitations and working regime of the technique are discussed.

In this paper, the broken rotor bar (BRB) fault is investigated by utilizing the Motor Current Signature Analysis (MCSA) method. In an industrial environment, no induction motor is perfectly symmetrical, and it may show obvious electrical signal components at different fault frequencies due to manufacturing errors, inappropriate motor installation, and other influencing factors. The misalignment experiments revealed that improper motor installation could lead to an unexpected frequency peak, which will affect the motor fault diagnosis process. Furthermore, a noisy manufacturing and operating environment could also disturb the motor fault diagnosis process. This paper presents an efficient supervised Artificial Neural Network (ANN) learning technique that is able to identify the fault type when the diagnostic situation is uncertain. Significant features are extracted from the electric current, based on the different frequency points and the amplitude values associated with each fault type. The simulation results showed that the proposed technique was able to diagnose the target fault type. The ANN architecture worked well with the selected number of significant feature data sets. According to the results, accuracy in fault detection with the feature vector has been achieved through classification performance, and the confusion error percentage is acceptable between healthy and faulty conditions of the motor.
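
The characteristic BRB signature that MCSA looks for is a pair of sidebands around the supply frequency at f_b = (1 ± 2ks)·f_s, where s is the per-unit slip and k = 1, 2, … This is the standard textbook relation, not code from the paper; a small helper to tabulate where those peaks should appear might look like:

```python
def brb_sideband_frequencies(f_supply, slip, k_max=3):
    """Expected broken-rotor-bar sideband frequencies (Hz):
    f_b = (1 +/- 2*k*slip) * f_supply for k = 1..k_max."""
    freqs = []
    for k in range(1, k_max + 1):
        freqs.append((1.0 - 2.0 * k * slip) * f_supply)  # lower sideband
        freqs.append((1.0 + 2.0 * k * slip) * f_supply)  # upper sideband
    return sorted(freqs)
```

For a 50 Hz supply and 3% slip the first sideband pair falls at 47 and 53 Hz; features such as the current-spectrum amplitudes at these predicted frequencies are natural inputs for the ANN classifier described above.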

To assess the feasibility of using multi-optical-path-length spectroscopy of serum for measuring the concentrations of human blood components, an automatic micro-displacement measuring device was designed that can acquire near-infrared spectra of 200 serum samples at optical path lengths from 0 to 4.0 mm (at 0.2 mm intervals). The multi-optical-path-length spectra of serum were used to build quantitative analysis models of four components of human blood: glucose (GLU), total cholesterol (TC), total protein (TP) and albumin (ALB), by means of the significant non-linear spectral characteristics of blood. Partial least squares (PLS) was used to set up calibration models relating the multi-optical-path-length near-infrared absorption spectra of 160 experimental samples to their biochemical analysis results. The blood components of another 40 samples were predicted according to the model. The prediction effect for the four blood components was favorable, with correlation coefficients (r) between the predicted values and the biochemical analysis values of 0.9320, 0.9712, 0.9462 and 0.9483, respectively. All of the results proved the feasibility of the multi-optical-path-length spectroscopy technique for blood component analysis, and this technique establishes a foundation for detecting the components of blood and other liquids conveniently and rapidly.
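
PLS calibration of this kind regresses a single analyte concentration on many correlated spectral channels. As an illustrative sketch (a generic NIPALS PLS1 implementation, not the authors' modeling code), the calibration and prediction steps can be written as:

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """NIPALS PLS1: regress a scalar response y on spectra X.
    Returns regression coefficients B and intercept b0."""
    X = np.asarray(X, float)
    y = np.asarray(y, float)
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, Q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc                    # weight vector from covariance
        w /= np.linalg.norm(w)
        t = Xc @ w                       # score vector
        tt = t @ t
        p = Xc.T @ t / tt                # X loading
        q = (yc @ t) / tt                # y loading
        Xc = Xc - np.outer(t, p)         # deflate X
        yc = yc - q * t                  # deflate y
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    B = W @ np.linalg.solve(P.T @ W, Q)  # coefficients in original space
    return B, y_mean - x_mean @ B

def pls1_predict(X, B, b0):
    return np.asarray(X, float) @ B + b0
```

In the study's setting, X would hold the absorbances at all path-length/wavelength combinations for the 160 calibration samples and y the biochemically measured GLU, TC, TP or ALB values, with one model fitted per component; the number of latent components is normally chosen by cross-validation.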

This paper proposes an intelligent intrusion detection system (IDS) which is an integrated approach that employs fuzziness and two well-known data mining techniques: namely, classification and association rule mining. By using these two techniques, we adopted the idea of using iterative rule learning that extracts rules from the data set. Our final intention is to predict different behaviors in networked computers. To achieve this, we propose to use a fuzzy-rule-based genetic classifier. Our approach has two main stages. First, fuzzy association rule mining is applied and a large number of candidate rules are generated for each class. The rules then pass through a pre-screening mechanism in order to reduce the fuzzy rule search space. Candidate rules obtained after pre-screening are used in the genetic fuzzy classifier to generate rules for the specified classes. Classes are defined as Normal, PRB-probe, DOS-denial of service, U2R-user to root and R2L-remote to local. Second, an iterative rule learning mechanism is employed for each class to find its fuzzy rules required to classify data, each time a fuzzy rule is extracted and included in the system. A boosting mechanism evaluates the weight of each data item in order to help the rule extraction mechanism focus more on data having relatively higher weight. Finally, extracted fuzzy rules with the corresponding weight values are aggregated on a class basis to find the vote of each class label for each data item.

Recent estimates of bioenergy-induced land use and land cover change (LULCC) have large uncertainty due to misclassification errors in the LULC datasets used for analysis. These uncertainties are further compounded when data are modified by merging classes, aggregating pixels, and changing classification methods over time. Hence the LULCC computed using these derived datasets reflects changes in classification methods, changes in input data and data manipulation more than actual changes on the ground. Furthermore, results are constrained by the geographic extent, update frequency and resolution of the dataset. To overcome this limitation we have developed a change detection system to identify yearly as well as seasonal changes in LULC patterns. Our method uses hierarchical clustering, which works by grouping objects into a hierarchy based on the phenological similarity of different vegetation types. The algorithm explicitly models vegetation phenology to reduce spurious changes. We apply our technique to globally available Moderate Resolution Imaging Spectroradiometer (MODIS) NDVI data at 250-meter resolution. We analyze 10 years of bi-weekly data to predict changes in the mid-western US as a case study. The results of our analysis are presented and its advantages over existing techniques are discussed.

The kidney is a very important and complicated filtering organ of the body. When the kidney reaches stage 5 chronic kidney disease, end-stage renal failure, the preeminent therapy is renal transplantation. Although it is the best form of treatment, the lack of kidney donors remains challenging. Therefore, all efforts should be employed to prolong the survival of the transplanted kidney. However, graft dysfunction (e.g., acute rejection) is one of the serious barriers to long-term kidney transplant survival. Currently, the gold standard for diagnosing graft dysfunction is renal biopsy. Although renal biopsy is helpful, it is not preferred due to its invasive nature, high morbidity rates, and expense. Therefore, noninvasive imaging techniques have become the subject of extensive research and interest, offering strong promise to replace, or at least reduce, biopsy usage in diagnosing graft dysfunction. This survey discusses not only the current diagnosis and treatment of graft dysfunction but also state-of-the-art imaging techniques for detecting acute renal transplant rejection.

Structural characteristic deflection shapes (CDSs), such as mode shapes and operational deflection shapes, are highly sensitive to damage in beam- or plate-type structures. Nevertheless, they are vulnerable to measurement noise, which can lead to unacceptable identification errors. To increase the accuracy and noise robustness of CDS-based damage identification using vibration responses under random excitation, the joint approximate diagonalization (JAD) technique and the gapped smoothing method (GSM) are combined to form a sensitive and robust damage index (DI) that can simultaneously detect the existence of damage and localize its position. This approach can also be applied to damage identification of structures under ambient excitation. First, the JAD method, a core technique of blind source separation, is used to simultaneously diagonalize a set of power spectral density matrices at frequencies near a given natural frequency, yielding a joint unitary diagonalizer whose columns contain the dominant CDSs. With the dominant CDSs identified around different natural frequencies, GSM is then used to extract damage features, and a robust damage identification index is proposed. Numerical and experimental examples of cracked beams verify the validity and noise robustness of JAD-based CDS estimation and the proposed DI. Furthermore, damage identification using dominant CDSs estimated by the JAD method is shown to be more accurate and noise-robust than the commonly used singular value decomposition method.
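The gapped smoothing step can be sketched as follows (a minimal illustration only: the JAD stage that estimates the CDS from power spectral density matrices is omitted, and the mode shape, node count, and damage magnitude are synthetic assumptions, not the paper's data):

```python
import numpy as np

def gsm_damage_index(shape, window=6, degree=3):
    """Gapped smoothing: at each node, fit a cubic through the nearest
    neighbors with the node itself left out ("gapped"); the squared
    residual between measured and smoothed values is the damage index."""
    n = len(shape)
    x = np.arange(n, dtype=float)
    di = np.empty(n)
    for i in range(n):
        # Six nearest nodes, excluding node i itself (first sorted entry).
        idx = sorted(range(n), key=lambda j: abs(j - i))[1:window + 1]
        coeffs = np.polyfit(x[idx], shape[idx], degree)
        di[i] = (shape[i] - np.polyval(coeffs, x[i])) ** 2
    return di

# Synthetic first-mode shape of a simply supported beam; a small local
# irregularity at node 25 imitates the kink a crack introduces in the CDS.
mode = np.sin(np.linspace(0, np.pi, 51))
mode[25] += 0.01
di = gsm_damage_index(mode)
print(di.argmax())  # → 25
```

A smooth polynomial fits an undamaged deflection shape almost exactly, so the residual peaks only where the shape deviates locally, which is what makes the index a damage localizer.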

In the present study, SAR interferometric techniques (stacking of conventional interferograms and Permanent Scatterers), using images from the ERS-1 and ERS-2 satellites, have been applied to the region of Thessaloniki (northern Greece). The period covered by the images is 1992–2000. Both techniques gave good quantitative and qualitative results. The interferometric products were used to study ground surface deformation phenomena that could be related to the local tectonic context, the exploitation of underground water, and sediment compaction.

The city of Thessaloniki shows relatively stable ground conditions. Subsidence in four locations, mainly in the area surrounding the city of Thessaloniki, has been detected and assessed. Two of the sites (Sindos-Kalochori and Langadhas) were already known as subsiding areas from previous studies based on ground-based measurements. In contrast, the other two sites, in the northern suburbs of Thessaloniki (Oreokastro) and in the south-east (airport area), were previously unknown as areas of subsidence; further investigation based on fieldwork is needed in these two areas. Finally, an attempt has been made to interpret the observed deformation in terms of the geological regime of the area and its anthropogenic activities.

In the optical quality measuring process of an optical system including diamond-turned components, the use of a laser light source can produce an undesirable speckle effect in a Shack-Hartmann (SH) CCD sensor. This speckle noise can degrade the precision and accuracy of the wavefront sensor measurement. Here we present an SH centroid detection method founded on computer-based techniques and capable of measurement in the presence of strong speckle noise. The method extends the dynamic-range imaging capabilities of the SH sensor through the use of a set of different CCD integration times. The resulting extended-range spot map is normalized to accurately obtain the spot centroids. The proposed method has been applied to measure the optical quality of the main optical system (MOS) of the mid-infrared instrument telescope simulator. The wavefront at the exit of this optical system is affected by speckle noise when illuminated by a laser source, and by air turbulence because of its long back focal length (3017 mm). Using the proposed technique, the MOS wavefront error was measured and satisfactory results were obtained.
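The multi-exposure idea can be sketched in a few lines (a simplified illustration, not the authors' processing chain: the spot model, saturation level, and exposure times are invented, and real SH data would carry speckle noise that this toy spot omits):

```python
import numpy as np

def extended_range_image(frames, exposures, saturation=4095):
    """Merge frames taken with different CCD integration times into one
    extended-dynamic-range image: saturated pixels are rejected and the
    remaining ones are scaled to a common (unit) exposure and averaged."""
    acc = np.zeros(frames[0].shape)
    cnt = np.zeros(frames[0].shape)
    for img, t in zip(frames, exposures):
        valid = img < saturation
        acc[valid] += img[valid] / t
        cnt[valid] += 1.0
    return acc / np.maximum(cnt, 1)

def centroid(img):
    """Intensity-weighted center of mass of a background-subtracted spot."""
    img = np.clip(img - np.median(img), 0.0, None)
    y, x = np.indices(img.shape)
    total = img.sum()
    return (x * img).sum() / total, (y * img).sum() / total

# Synthetic SH spot centered at (12.3, 8.7): the long exposure saturates
# the core but resolves the wings; the short exposure keeps the core valid.
yy, xx = np.indices((24, 24))
spot = 3000.0 * np.exp(-((xx - 12.3) ** 2 + (yy - 8.7) ** 2) / (2 * 2.0 ** 2))
frames = [np.minimum(spot * 0.5, 4095), np.minimum(spot * 4.0, 4095)]
hdr = extended_range_image(frames, exposures=[0.5, 4.0])
cx, cy = centroid(hdr)
```

Rejecting saturated pixels before combining is what lets the long exposure contribute well-measured spot wings without the clipped core biasing the centroid.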

Isolating methanogenic archaea from an extreme environment such as El Tatio (high altitude, arid climate) gives insight into the methanogenic taxa able to adapt and grow under extreme conditions. The hydrothermal waters at the El Tatio geyser field exhibit extreme geochemical conditions, with discharge water from springs and geysers at the local boiling temperature (85 °C), high levels of arsenic, and low DIC levels. Despite these challenges, many of El Tatio's hundred-plus hydrothermal features host extensive microbial mat communities, many showing evidence of methanogenesis. Various approaches and techniques were used in attempting to isolate methanogens unique to this area. To detect the presence of methanogens in field samples, dissolved methane concentrations were determined via gas chromatography (GC). Samples were then selected for culturing and most probable number (MPN) enumeration, where growth was assessed using both methane production and observations of fluorescence under UV light. PCR was used to determine whether archaeal DNA could be detected directly in field samples, and shotgun cloning was performed to determine phylogenetic affiliation. Several culturing techniques were carried out in an attempt to isolate methanogens from samples that showed evidence of methanogenesis. The slant culturing method was used because of its increased surface area for colonization combined with the relative ease of keeping cultures anaerobic. After a few weeks, when colonies were apparent, some were aseptically selected and inoculated into a liquid medium containing ampicillin to inhibit bacterial growth. Culturing proved successful after inoculation, with slow growth of methanogens confirmed via GC and autofluorescence. Further PCR tests and subsequent sequencing were performed to confirm and identify isolates.

Concrete-encased composite structures combine the advantages of steel and concrete, showing improved strength, ductility, and fire resistance compared to traditional reinforced concrete structures. The interface between the concrete and the steel profile governs the interaction between the two materials under loading; however, debonding damage at this interface can severely degrade the load-transfer capacity and significantly affect structural performance. In this paper, the electro-mechanical impedance (EMI) technique using piezoceramic transducers was experimentally investigated to detect the occurrence of bond slip in a concrete-encased composite structure. The root-mean-square deviation (RMSD) is used to quantify the variations of the impedance signatures caused by bond-slip damage. To verify the validity of the proposed method, finite element analysis was performed to simulate concrete-steel debonding based on a 3D finite element concrete-steel bond model. The impedance signatures computed from the numerical model are compared with those obtained in the experimental study, and both the numerical and experimental results confirm the ability of the proposed EMI method to detect bond slip in a concrete-encased composite structure.
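The RMSD metric itself is simple enough to sketch (the spectra below are hypothetical stand-ins for measured PZT impedance signatures, not data from the study; real bond-slip damage manifests as the kind of resonance shift and amplitude change imitated here):

```python
import numpy as np

def rmsd_percent(baseline, measured):
    """Root-mean-square deviation (%) between two impedance signatures:
    the scalar metric used to flag changes caused by bond-slip damage."""
    b, m = np.asarray(baseline, float), np.asarray(measured, float)
    return 100.0 * np.sqrt(((m - b) ** 2).sum() / (b ** 2).sum())

# Hypothetical real-part impedance spectra (kOhm) of a PZT patch over a
# frequency sweep; debonding typically shifts resonances and amplitudes.
f = np.linspace(100e3, 150e3, 400)
baseline = 10.0 + 2.0 * np.sin(f / 4e3)
damaged = 10.0 + 1.8 * np.sin((f + 1.5e3) / 4e3)  # shifted, attenuated peaks
print(f"RMSD = {rmsd_percent(baseline, damaged):.1f}%")
```

In practice a threshold on the RMSD (calibrated against the healthy baseline) separates benign measurement scatter from genuine interface damage.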