The application of artificial intelligence principles to the processing of radar signals is considered theoretically. The main capabilities required are learning and adaptation in a changing environment, processing and modeling information (especially dynamics and uncertainty), and decision-making based on all available information (taking its reliability into account). For the application to combat-aircraft radar systems, the tasks include the combination of data from different types of sensors, reacting to electronic counter-countermeasures, evaluation of how much data should be acquired (energy and radiation management), control of the radar, tracking, and identification. Also discussed are related uses such as monitoring the avionics systems, supporting pilot decisions with respect to the radar system, and general applications in radar-system R&D.

Identifies types and distributions of errors in text produced by optical character recognition (OCR) and proposes a process using machine learning techniques to recognize and correct errors in OCR texts. Results of experiments indicating that this strategy can reduce human interaction required for error correction are reported. (25 references)…

This paper discusses Business Intelligence technology, a powerful tool not only for decision-making support but one with considerable potential in other fields of application. The necessary fundamental definitions are offered and explained to clarify the basic principles and the role of this technology in company management. The article is logically divided into five main parts. The first part gives the definition of the technology and lists its main advantages. The second part presents an overview of the system architecture with a brief description of the individual building blocks and shows the hierarchical nature of that architecture. The third part describes the technology life cycle, which consists of four steps that are mutually interconnected into a ring. The fourth part summarises the analytical methods incorporated in online analytical processing and data mining used within business intelligence, as well as the related data mining methodologies, and introduces some typical applications of these methods. The final part outlines a proposal for a knowledge discovery system for hierarchical process control. The focus of this paper is to provide a comprehensive view and to familiarize the reader with Business Intelligence technology and its utilisation.

including Open Source (OSINT), in the production of intelligence. All-source intelligence is both a separate intelligence discipline and the name of the...and signature intelligence (MASINT), Technical intelligence (TECHINT), Open source intelligence (OSINT), and Biometric intelligence (BIOINT)) to...characterizing the processes for each discipline (HUMINT, SIGINT, OSINT, IMINT, GEOINT, etc.). Then, we analyzed the All-Source activities and processes

Optimizing performance on work activities and processes requires metrics of performance for management to monitor and analyze in order to support further improvements in efficiency, effectiveness, safety, reliability and cost. Information systems are therefore required to assist management in making timely, informed decisions regarding these work processes and activities. Currently, information systems regarding Space Shuttle maintenance and servicing do not exist to support such timely decisions. The work to be presented details a system which incorporates various automated and intelligent processes and analysis tools to capture, organize and analyze work process related data in order to make the decisions necessary to meet KSC organizational goals. The advantages and disadvantages of design alternatives for the development of such a system will be discussed, including technologies which would need to be designed, prototyped and evaluated.

The Advanced Research and Development Activity initiated the Novel Intelligence from Massive Data (NIMD) program to develop advanced analytic technologies and methodologies. In order to support this objective, researchers and developers need to understand what analysts do and how they do it. In the past, this knowledge generally was acquired through subjective feedback from analysts. NIMD established the innovative Glass Box Analysis (GBA) Project to instrument a live intelligence mission and unobtrusively capture and objectively study the analysis process. Instrumenting the analysis process requires tailor-made software hooks that grab data from a myriad of disparate application operations and feed it into a complex relational database and hierarchical file store to collect, store, retrieve, and distribute analytic data in a manner that maximizes researchers’ understanding. A key to success is determining the correct data to collect and how to aggregate low-level data into meaningful analytic events. This paper will examine how the GBA team solved some of these challenges, continues to address others, and supports a growing user community in establishing their own GBA environments and/or studying the data generated by GBA analysts working in the Glass Box.

We present an observational case study in which we investigate and analyze the analytical processes of intelligence analysts. Participating analysts in the study carry out two scenarios where they organize and triage information, conduct intelligence analysis, report results, and collaborate with one another. Through a combination of artifact analyses, group interviews, and participant observations, we explore the space and boundaries in which intelligence analysts work and operate. We also assess the implications of our findings on the use and application of relevant information technologies.

Forensic science is generally defined as the application of science to address questions related to the law. Too often, this view restricts the contribution of science to one single process which eventually aims at bringing individuals to court while minimising risk of miscarriage of justice. In order to go beyond this paradigm, we propose to refocus the attention towards traces themselves, as remnants of a criminal activity, and their information content. We postulate that traces contribute effectively to a wide variety of other informational processes that support decision making in many situations. In particular, they inform actors of new policing strategies who place the treatment of information and intelligence at the centre of their systems. This contribution of forensic science to these security oriented models is still not well identified and captured. In order to create the best condition for the development of forensic intelligence, we suggest a framework that connects forensic science to intelligence-led policing (part I). Crime scene attendance and processing can be envisaged within this view. This approach gives indications about how to structure knowledge used by crime scene examiners in their effective practice (part II).

Research and development projects involving intelligent processing equipment within the following U.S. agencies are addressed: Department of Agriculture, Department of Commerce, Department of Energy, Department of Defense, Environmental Protection Agency, Federal Emergency Management Agency, NASA, National Institutes of Health, and the National Science Foundation.

Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.

The aim of this study is to find out the effect of the course materials based on Multiple Intelligence Theory upon the intelligence groups' learning process. In conclusion, the results proved that the materials prepared according to Multiple Intelligence Theory have a considerable effect on the students' learning process. This effect was particularly seen on the student groups of the musical-rhythmic, verbal-linguistic, interpersonal-social and naturalist intelligence.

Clinical intelligence (CI) is an emerging field in health care arising from the proliferation of electronic data being generated through the increasing use of healthcare information technology (HIT). It is defined as the electronic aggregation of accurate, relevant and timely data into meaningful information and actionable knowledge in order to achieve optimal healthcare structures, processes and outcomes. Clinical intelligence will dramatically change the discipline of nursing informatics in the future. This presentation describes the role of nurse informaticists (NI) today in building clinical intelligence, including communicating the vision, designing systems, educating clinicians, preparing themselves, and supporting research. PMID:24199076

The Defense Logistics Agency is successfully incorporating Intelligent Processing Equipment (IPE) into each of its Manufacturing Technology thrust areas. Several IPE applications are addressed in the manufacturing of two 'soldier support' items: combat rations and military apparel. In combat rations, in-line sensors for food processing are being developed or modified from other industries. In addition, many process controls are being automated to achieve better quality and to gain higher user (soldier) acceptance. IPE applications in military apparel include: in-process quality controls for identification of sewing defects, use of robots in the manufacture of shirt collars, and automated handling of garments for pressing.

A reproducible and transparent quality of clinical treatments plays an important role in the performance of a hospital. In liver transplantation (LT), this is particularly important for patient safety, resource planning, documentation, and quality management. Thus, the clinical pathway for LT was documented in an electronic format within our research project PIGE. Data from clinical information systems were linked to this pathway, which allows for process monitoring (the assessment of the current state for every patient in the LT process) and a retrospective analysis of all treatments in addition to all data pertaining to the treatment, for example, cost, time, number of personnel, etc.

The ground processing and launch of Shuttle vehicles and their payloads is the primary task of Kennedy Space Center. It is a process which is largely manual and contains little inherent automation. Business is conducted today much as it was during previous NASA programs such as Apollo. In light of new programs and decreasing budgets, NASA must find more cost effective ways in which to do business while retaining the quality and safety of activities. Advanced technologies including artificial intelligence could cut manpower and processing time. This paper is an overview of the research and development in AI technology at KSC with descriptions of the systems which have been implemented, as well as a few under development which are promising additions to ground processing software. Projects discussed cover many facets of ground processing activities, including computer sustaining engineering, subsystem monitor and diagnosis tools and launch team assistants. The deployed AI applications have proven effective, which has helped to demonstrate the benefits of utilizing intelligent software in the ground processing task.

Defence technologies, which have been developing and changing rapidly, today make it difficult to foresee the future environment and spectrum of warfare. When this change and development is examined specifically for naval operations, the possible battlefields and scenarios of the near and medium terms (5-20 years) can be said to be clearer than those of other force components. The Network Centric Naval Warfare concept, developed for the floating, diving and flying platforms of a fleet serving miles away from its own mainland, will keep its significance in the future. Accordingly, a Network Centric Intelligence structure fully integrated with command and control systems will become relatively more important. This study first traces the transition from the traditional intelligence cycle, still used in conventional war, to a Network Centric Intelligence Production Process. The final part examines the use of this new approach, based on UAVs as an alternative to satellite-based command, control and data transfer systems, in joint operations in narrow seas, and proposes a model for employing the operative and strategic UAVs procured within the scope of the NATO AGS2 for this purpose.

In order to improve the compatibility, security and expandability of the Automatic Fare Collection System in rail transit, and to reduce maintenance costs, an intelligent card processing terminal is proposed in this paper. The operation flow and features of the intelligent card processing terminal are analyzed in detail, and the software and hardware structures and the business treatment process are designed. Finally, the security mechanism of the intelligent card processing terminal is summarized. The application results show that the intelligent card processing terminal makes interconnection among lines easier, creates considerable economic and social benefits, and can be widely used.

This book deals with the basis for design of intelligent systems to support human decision-making in supervisory control, and provides a view of how human and artificial cognitive systems can interact. It covers the design and development of intelligent decision aiding systems, as well as the testing and evaluation. Topics discussed include: decision theory; cognitive engineering; systems engineering; and artificial intelligence.

Our approach to onboard processing will enable a quicker return and improved quality of processed data from small, remote-sensing satellites. We describe an intelligent payload concept which processes RF lightning signal data onboard the spacecraft in a power-aware manner. Presently, onboard processing is severely curtailed due to the conventional management of limited resources and power-unaware payload designs. Delays of days to weeks are commonly experienced before raw data is received, processed into a human-usable format, and finally transmitted to the end-user. We enable this resource-critical technology of onboard processing through the concept of Algorithm Power Modulation (APM). APM is a decision process used to execute a specific software algorithm, from a suite of possible algorithms, to make the best use of the available power. The suite of software algorithms chosen for our application is intended to reduce the probability of false alarms through postprocessing. Each algorithm however also has a cost in energy usage. A heuristic decision tree procedure is used which selects an algorithm based on the available power, time allocated, algorithm priority, and algorithm performance. We demonstrate our approach to power-aware onboard processing through a preliminary software simulation.
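A minimal sketch of the kind of power-aware selection APM performs is given below. The first two candidate names echo the LMS and ML algorithms mentioned for this payload elsewhere; the third candidate, and all energy costs, run times, priorities and the scoring rule, are illustrative assumptions rather than values from the payload design.

```python
from dataclasses import dataclass

@dataclass
class Algorithm:
    name: str          # candidate post-processing algorithm
    energy_cost: float # joules needed per execution (assumed)
    run_time: float    # seconds per execution (assumed)
    priority: float    # mission-assigned priority, 0..1 (assumed)
    performance: float # expected false-alarm reduction, 0..1 (assumed)

def select_algorithm(candidates, available_energy, time_budget):
    """Pick the algorithm that fits the power/time budget and scores best.

    A simple greedy stand-in for the heuristic decision tree described in
    the paper; the real APM procedure is not reproduced here.
    """
    feasible = [a for a in candidates
                if a.energy_cost <= available_energy and a.run_time <= time_budget]
    if not feasible:
        return None  # not enough power or time: skip onboard processing
    # Trade off performance and priority against the energy drawn.
    return max(feasible,
               key=lambda a: (a.performance * a.priority) / a.energy_cost)

suite = [
    Algorithm("least-mean-squares", energy_cost=0.5, run_time=0.1, priority=0.6, performance=0.70),
    Algorithm("maximum-likelihood", energy_cost=2.0, run_time=0.4, priority=0.8, performance=0.85),
    Algorithm("matched-filter-bank", energy_cost=5.0, run_time=1.5, priority=0.9, performance=0.95),
]
print(select_algorithm(suite, available_energy=1.0, time_budget=0.5))
```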

The application of Artificial Intelligence to electronic and process control can help attain the autonomy and safety requirements of manned space systems. An overview of documented applications within various industries is presented. The development process is discussed along with associated issues for implementing an intelligent process control system.

Chemical industries such as resin or soap manufacturing industries have reaction systems which work with at least two chemicals. Mixing of chemicals, even at room temperature, can trigger an exothermic reaction. This process produces a sudden increase of heat energy within the mixture. The quantity of heat and the dynamics of heat generation are unknown, unpredictable and time varying. Proper control of heat has to be accomplished in order to achieve a high quality product. Uncontrolled or poorly controlled heat yields an unusable product, may damage materials and systems, and may even harm people. Heat due to the exothermic reaction cannot be controlled using conventional methods such as PID control or identification-based control, because all of these methods require at least an approximate mathematical model of the exothermic process, and such a model has yet to be properly established. This paper discusses a design methodology for controlling such a process. A pilot plant of a reaction system has been constructed and used for designing and incorporating the proposed fuzzy logic based intelligent controller. Both the conventional and then an adaptive form of fuzzy logic control were used in testing the performance. The test results confirm the effectiveness of the controllers in controlling exothermic heat.
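The fuzzy-logic control idea can be illustrated with a small sketch; the membership ranges, rule base and valve-opening levels below are illustrative assumptions, not the tuned values used on the pilot plant.

```python
# Minimal sketch of a fuzzy temperature controller for an exothermic batch
# reactor. Inputs are the temperature error above setpoint and its rate of
# change; output is a coolant valve opening (Mamdani-style rules with
# centre-of-gravity defuzzification over singleton outputs).

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def coolant_command(temp_error, temp_rate):
    """Map error (degC above setpoint) and rate (degC/min) to a valve opening in [0, 1]."""
    # Fuzzify inputs over assumed universes of discourse.
    err_small = tri(temp_error, -5, 0, 5)
    err_large = tri(temp_error, 2, 10, 25)
    rate_slow = tri(temp_rate, -2, 0, 2)
    rate_fast = tri(temp_rate, 1, 5, 15)

    # Rule base: (firing strength, valve-opening singleton).
    rules = [
        (min(err_small, rate_slow), 0.1),  # near setpoint, stable -> little cooling
        (min(err_small, rate_fast), 0.5),  # heat rising fast -> moderate cooling
        (min(err_large, rate_slow), 0.6),  # hot but stable -> strong cooling
        (min(err_large, rate_fast), 1.0),  # runaway onset -> full cooling
    ]
    total = sum(w for w, _ in rules)
    return sum(w * u for w, u in rules) / total if total else 0.0

print(coolant_command(temp_error=8.0, temp_rate=4.0))
```

An adaptive variant, as tested in the paper, would additionally adjust the membership ranges or output singletons online from the observed plant response.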

Materials science and engineering provides a vast arena for applications of artificial intelligence. Advanced materials research is an area in which challenging requirements confront the researcher, from the drawing board through production and into service. Advanced techniques result in the development of new materials for specialized applications. Hand-in-hand with these new materials are also requirements for state-of-the-art inspection methods to determine the integrity or fitness for service of structures fabricated from these materials. Two problems of current interest to the Materials Processing Laboratory at UAH are an expert system to assist in eddy current inspection of graphite epoxy components for aerospace and an expert system to assist in the design of superalloys for high temperature applications. Each project requires a different approach to reach the defined goals. Results to date are described for the eddy current analysis, but only the original concepts and approaches considered are given for the expert system to design superalloys.

According to the Healthcare Information Management and Systems Society, "Clinical & Business Intelligence (C&BI) is the use and analysis of data captured in the healthcare setting to directly inform decision-making" (http://www.himss.org/library/clinical-business-intelligence). Some say that it is the right information given to the right person at the right time in the right way. No matter how you define it, the fact remains that timely access, synthesis, and visualization of clinical data have become key to how health professionals make patient care decisions and improve care delivery.

Medicine is one of the fields of knowledge that would most benefit from a closer interaction with Computer studies and Mathematics by optimizing complex, imperfect processes such as differential diagnosis; this is the domain of Machine Learning, a branch of Artificial Intelligence that builds and studies systems capable of learning from a set of training data, in order to optimize classification and prediction processes. In Mexico during the last few years, progress has been made on the implementation of electronic clinical records, so that the National Institutes of Health already have accumulated a wealth of stored data. For those data to become knowledge, they need to be processed and analyzed through complex statistical methods, as it is already being done in other countries, employing: case-based reasoning, artificial neural networks, Bayesian classifiers, multivariate logistic regression, or support vector machines, among other methodologies; to assist the clinical diagnosis of acute appendicitis, breast cancer and chronic liver disease, among a wide array of maladies. In this review we sift through concepts, antecedents, current examples and methodologies of machine learning-assisted clinical diagnosis.

Discusses the use of artificial intelligence systems in process engineering. Describes a new program at the Massachusetts Institute of Technology which attempts to advance process engineering through technological advances in the areas of artificial intelligence and computers. Identifies the program's hardware facilities, software support,…

The Processing Speed Index (PSI) was first introduced on the Wechsler Intelligence Scale, Third Edition (WISC-III; D. Wechsler, 1991), and little is known about its clinical significance. In a referred sample (N = 980), children with neurological disorders (ADHD, autism, bipolar disorder, and LD) had mean PSI and Freedom from Distractibility Index…

Emotional intelligence as conceptualized by Mayer and Salovey consists of perceiving emotions, using emotions to facilitate thoughts, understanding emotions, and managing emotions to enhance personal growth. The Multifactor Emotional Intelligence Scale has proven a valid and reliable measure that can be used to explore the implications of…

The consolidated business office of the Allegheny Health Education Research Foundation (AHERF), a large integrated healthcare system based in Pittsburgh, Pennsylvania, sought to improve its cash-related business office activities by implementing an automated remittance processing system that uses artificial intelligence. The goal was to create a completely automated system whereby all monies it processed would be tracked, automatically posted, analyzed, monitored, controlled, and reconciled through a central database. Using a phased approach, the automated payment system has become the central repository for all of the remittances for seven of the hospitals in the AHERF system and has allowed for the complete integration of these hospitals' existing billing systems, document imaging system, and intranet, as well as the new automated payment posting, and electronic cash tracking and reconciling systems. For such new technology, which is designed to bring about major change, factors contributing to the project's success were adequate planning, clearly articulated objectives, marketing, end-user acceptance, and post-implementation plan revision.

Protection of the environment and environmental remediation requires the cooperation, at all levels, of government and industry. Intelligent processing equipment, in addition to other artificial intelligence based tools, was used by the Environmental Protection Agency to provide personnel safety and improve the efficiency of those responsible for protection and remediation of the environment. These exploratory efforts demonstrate the feasibility and utility of expanding development and widespread use of these tools. A survey of current intelligent processing equipment applications in the Agency is presented and is followed by a brief discussion of possible uses in the future.

This document addresses two issues in the original paper entitled 'An Intelligent, Onboard Signal Processing Payload Concept' submitted to the SPIE AeroSense 2003 Conference. Since the original paper submission, and prior to the scheduled presentation, a correction has been made to one of the figures in the original paper and an update has been performed to the software simulation of the payload concept. The figure, referred to as Figure 8, Simulation Results, in the original paper, contains an error in the voltage versus the capacity drained chart. This chart does not correctly display the voltage changes experienced by the battery module due to the varying discharge rates. This error is an artifact of the procedure used to graph the data. Additionally, the original version of the simulation related the algorithm execution rate to the lightning event rate regardless of the number of events in the ring buffer. This feature was mentioned in section 5, Simulation Results, of the original paper. A correction was also made to the size of the ring buffer. Incorrect information was provided to the authors that placed the number of possible events at 18,310. Corrected information has since been obtained that specifies the ring buffer can typically hold only 1,000 events. This has a significant impact on the APM process and the number of events lost when the size of the ring buffer is exceeded. Also, upon further analysis, it was realized that the simulation contained an error in the recording of the number of events in the ring buffer. The faster algorithms, LMS and ML, should have been able to process all events during the simulation time interval, but the initial results did not reflect this characteristic. The updated version of the simulation appropriately handles the number of algorithm executions and recording of events in the ring buffer as well as uses the correct size for the ring buffer. These improvements to the simulation and subsequent results are discussed in

Computer-aided diagnosis (CAD) is rapidly entering the radiology mainstream. It has already become a part of the routine clinical work for the detection of breast cancer with mammograms. The computer output is used as a "second opinion" in assisting radiologists' image interpretations. The computer algorithm generally consists of several steps that may include image processing, image feature analysis, and data classification via the use of tools such as artificial neural networks (ANN). In this article, we will explore these and other current processes that have come to be referred to as "artificial intelligence." One element of CAD, temporal subtraction, has been applied for enhancing interval changes and for suppressing unchanged structures (eg, normal structures) between 2 successive radiologic images. To reduce misregistration artifacts on the temporal subtraction images, a nonlinear image warping technique for matching the previous image to the current one has been developed. Development of the temporal subtraction method originated with chest radiographs, with the method subsequently being applied to chest computed tomography (CT) and nuclear medicine bone scans. The usefulness of the temporal subtraction method for bone scans was demonstrated by an observer study in which reading times and diagnostic accuracy improved significantly. An additional prospective clinical study verified that the temporal subtraction image could be used as a "second opinion" by radiologists with negligible detrimental effects. ANN was first used in 1990 for computerized differential diagnosis of interstitial lung diseases in CAD. Since then, ANN has been widely used in CAD schemes for the detection and diagnosis of various diseases in different imaging modalities, including the differential diagnosis of lung nodules and interstitial lung diseases in chest radiography, CT, and positron emission tomography/CT. It is likely that CAD will be integrated into picture archiving and

modelled as zero mean Gaussian noise. The state estimate provided by a Kalman filter is statistically optimal in that it minimizes the mean squared error...the fusion methods above contain logical and mathematical algorithms based on either continuous or discrete quantifiable data, so to use these methods...method for capturing statistics about the performance of different architectures, it fails to capture the synergy of intelligence or information fusion
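The excerpt's claim about the Kalman filter, that under zero-mean Gaussian noise its state estimate minimizes the mean squared error, can be illustrated with a minimal one-dimensional sketch; all constants below are illustrative assumptions.

```python
import numpy as np

# One-dimensional Kalman filter with identity dynamics: the estimate blends
# prediction and measurement by the Kalman gain, which is what makes it the
# minimum-mean-squared-error estimator under zero-mean Gaussian noise.

def kalman_1d(measurements, q=0.01, r=0.5):
    x, p = 0.0, 1.0  # initial state estimate and its variance (assumed)
    estimates = []
    for z in measurements:
        p = p + q                # predict: uncertainty grows by process noise q
        k = p / (p + r)          # Kalman gain given measurement noise r
        x = x + k * (z - x)      # update toward the measurement
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

rng = np.random.default_rng(0)
truth = 3.0
noisy = truth + rng.normal(0.0, np.sqrt(0.5), size=50)  # zero-mean Gaussian noise
print(kalman_1d(noisy)[-1])  # converges toward 3.0
```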

Studies of the causes of cancer, early detection, prevention or treatment need accurate, comprehensive, and timely cancer data. The clinical laboratory provides important cancer information needed by physicians which influences clinical decisions regarding treatment, diagnosis and patient monitoring. Poor communication between health care providers and clinical laboratory personnel can lead to medical errors and wrong decisions in providing cancer care. Because of the key impact of laboratory information on cancer diagnosis and treatment, the quality of the tests, lab reports, and appropriate lab management are very important. A laboratory information management system (LIMS) can play an important role in diagnosis, fast and effective access to cancer data, decreased redundancy and costs, and facilitation of the integration and collection of data from different types of instruments and systems. In spite of these significant advantages, LIMS is limited by factors such as problems in adapting to new instruments that may change existing work processes. Applying intelligent software alongside existing information systems not only removes these restrictions but also offers important benefits, including adding non-laboratory-generated information to reports, facilitating decision making, and improving the quality and productivity of cancer care services. Laboratory systems must be flexible enough to change and must be able to develop and benefit from intelligent devices. Intelligent laboratory information management systems need to benefit from informatics tools and the latest technologies, such as open sources. The aim of this commentary is to survey the application, opportunities and necessity of the intelligent clinical laboratory as a tool to increase the productivity of cancer care management.

Objective: This exploratory, quantitative, descriptive study was undertaken to explore the relationship between clinical performance and anticipated retention in nursing students. Methods: After approval by the university's Human Subjects Committee, a sample of 104 nursing students were recruited for this study, which involved testing with a valid and reliable emotional intelligence (EI) instrument and a self-report survey of clinical competencies. Results: Statistical analysis revealed that although the group average for total EI score and the 6 score subsets were in the average range, approximately 30% of the individual total EI scores and 30% of two branch scores, identifying emotions correctly and understanding emotions, fell in the less than average range. These data, as well as the analysis of correlation with clinical self-report scores, suggest recommendations applicable to educators of clinical nursing students. Conclusions: Registered nurses make up the largest segment of the ever-growing healthcare workforce. Yet, retention of new graduates has historically been a challenge for the profession. Given the projected employment growth in nursing, it is important to identify factors which correlate with high levels of performance and job retention among nurses. There is preliminary evidence that EI, a nontraditional intelligence measure, relates positively not only with retention of clinical staff nurses, but with overall clinical performance as well. PMID:27981096

The purpose of this grant was to develop frequency dependent dielectric measurements, often called FDEMS (frequency dependent electromagnetic sensing), to monitor and intelligently control the cure process in PMR-15, a stoichiometric mixture of a nadic ester, dimethyl ester, and methylenedianiline in a monomer ratio.

The field of intelligence testing has been revolutionized by Alan S. Kaufman. He developed the Wechsler Intelligence Scale for Children-Revised (WISC-R) with David Wechsler, and his best-selling book, Intelligent Testing with the WISC-R, introduced the phrase "intelligent testing." Kaufman, with his wife, Nadeen, then created his own…

The development of a highly efficient and thus truly intelligent processing environment for real-time general purpose simulation of continuous systems is described. Such an environment can be created by mapping the simulation process directly onto the University of Alabama's OPERA architecture. To facilitate this effort, the field of continuous simulation is explored, highlighting areas in which efficiency can be improved. Areas in which parallel processing can be applied are also identified, and several general OPERA type hardware configurations that support improved simulation are investigated. Three direct execution parallel processing environments are introduced, each of which greatly improves efficiency by exploiting distinct areas of the simulation process. These suggested environments are candidate architectures around which a highly intelligent real-time simulation configuration can be developed.

The relationship between intelligence quotient (IQ) and cognitive control processes has been extensively established. Several studies have shown that IQ correlates with cognitive control abilities, such as interference suppression, as measured with experimental tasks like the Stroop and Flanker tasks. By contrast, there is a debate about the role of Emotional Intelligence (EI) in individuals' cognitive control abilities. The aim of this study is to examine the relation between IQ and EI, and cognitive control abilities evaluated by a typical laboratory control cognitive task, the Stroop task. Results show a negative correlation between IQ and the interference suppression index, the ability to inhibit processing of irrelevant information. However, the Managing Emotions dimension of EI measured by the Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT), but not self-reported of EI, negatively correlates with the impulsivity index, the premature execution of the response. These results suggest that not only is IQ crucial, but also competences related to EI are essential to human cognitive control processes. Limitations and implications of these results are also discussed. PMID:26648901

The article describes the processing of information obtained from sensors in intelligent systems. The paper analyzes the need for advanced processing in a parallel-operation computing unit that reduces the response time to input events. A speculative processing algorithm is realized in an FPGA using streaming control based on a data flow model. This solution can be used in applications related to telecommunications networks of distributed control systems.

A wavelet-neural network signal processing method has demonstrated approximately tenfold improvement over traditional signal-processing methods for the detection limit of various nitrogen and phosphorus compounds from the output of a thermionic detector attached to a gas chromatograph. A blind test was conducted to validate the lower detection limit. All fourteen of the compound spikes were detected when above the estimated threshold, including all three within a factor of two above the threshold. In addition, two of six spikes were detected at levels of 1/2 the concentration of the nominal threshold. Another two of the six would have been detected correctly if we had allowed human intervention to examine the processed data. One apparent false positive in five nulls was traced to a solvent impurity, whose presence was subsequently identified by analyzing a solvent aliquot evaporated to 1% residual volume, while the other four nulls were properly classified. We view this signal processing method as broadly applicable in analytical chemistry, and we advocate that advanced signal processing methods should be applied as directly as possible to the raw detector output so that less discriminating preprocessing and post-processing does not throw away valuable signal.
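The general wavelet-denoise-then-threshold idea can be sketched as below; the paper couples the wavelet stage with a neural network, for which a simple soft-threshold and peak finder stand in here, and the wavelet choice, noise estimate, detection threshold and synthetic trace are all assumptions.

```python
import numpy as np
import pywt
from scipy.signal import find_peaks

# Hedged sketch only: denoise a raw chromatographic detector trace with a
# wavelet soft threshold, then flag peaks that exceed a detection limit.

def wavelet_denoise(trace, wavelet="db4", level=4):
    coeffs = pywt.wavedec(trace, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # robust noise estimate
    thresh = sigma * np.sqrt(2.0 * np.log(len(trace)))  # universal threshold
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(trace)]

def detect_spikes(trace, detection_limit):
    clean = wavelet_denoise(trace)
    peaks, _ = find_peaks(clean, height=detection_limit)
    return peaks

# Synthetic trace: two compound peaks buried in detector noise.
t = np.linspace(0, 10, 2048)
raw = (0.8 * np.exp(-((t - 3) ** 2) / 0.01)
       + 0.3 * np.exp(-((t - 7) ** 2) / 0.01)
       + np.random.default_rng(1).normal(0, 0.05, t.size))
print(detect_spikes(raw, detection_limit=0.2))
```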

By combining artificial intelligence concepts with the human information processing model of Rasmussen, a conceptual framework was developed for real time artificial intelligence systems which provides a foundation for system organization, control and validation. The approach is based on the description of system processing in terms of an abstraction hierarchy of states of knowledge. The states of knowledge are organized along one dimension which corresponds to the extent to which the concepts are expressed in terms of the system inputs or in terms of the system response. Thus organized, the useful states form a generally triangular shape with the sensors and effectors forming the lower two vertices and the fully evaluated set of courses of action the apex. Within the triangle boundaries are numerous processing paths which shortcut the detailed processing by connecting incomplete levels of analysis to partially defined responses. Shortcuts at different levels of abstraction include reflexes, sensory motor control, rule based behavior, and satisficing. This approach was used in the design of a real time tactical decision aiding system, and in defining an intelligent aiding system for transport pilots.

Background: Emotional intelligence (EI) helps humans to perceive their own and others’ emotions. It helps to make better interpersonal communication that consequently leads to an increase in everyday performance and professional career. Teaching, particularly teaching in the clinical environment, is among the professions that need a high level of EI due to its relevance to human interactions. Materials and Methods: We adopted EI competencies with characteristics of a good clinical teacher. As a result, we extracted 12 strategies and then reviewed the literature relevant to these strategies. Results: In the present article, 12 strategies that a clinical teacher should follow to use EI in her/his teaching are described. Conclusion: To apply EI in clinical settings, a teacher should consider all the factors that can bring about a more positive emotional environment and social interactions. These factors will increase students’ learning, improve patients’ care, and maintain her/his well-being. In addition, he/she will be able to evaluate her/his teaching to improve its effectiveness. PMID:27904573

An integrated system for the fusion of product and process sensors and controls for production of flat glass was envisioned, having as its objective the maximization of throughput and product quality subject to emission limits, furnace refractory wear, and other constraints. Although the project was prematurely terminated, stopping the work short of its goal, the tasks that were completed show the value of the approach and objectives. Though the demonstration was to have been done on a flat glass production line, the approach is applicable to control of production in the other sectors of the glass industry. Furthermore, the system architecture is also applicable in other industries utilizing processes in which product uniformity is determined by ability to control feed composition, mixing, heating and cooling, chemical reactions, and physical processes such as distillation, crystallization, drying, etc. The first phase of the project, with Visteon Automotive Systems as industrial partner, was focused on simulation and control of the glass annealing lehr. That work produced the analysis and computer code that provide the foundation for model-based control of annealing lehrs during steady state operation and through color and thickness changes. In the second phase of the work, with PPG Industries as the industrial partner, the emphasis was on control of temperature and combustion stoichiometry in the melting furnace, to provide a wider operating window, improve product yield, and increase energy efficiency. A program of experiments with the furnace, CFD modeling and simulation, flow measurements, and sensor fusion was undertaken to provide the experimental and theoretical basis for an integrated, model-based control system utilizing the new infrastructure installed at the demonstration site for the purpose. In spite of the fact that the project was terminated during the first year of the second phase of the work, the results of these first steps toward implementation

An overview of artificial intelligence (AI), its core ingredients, and its applications is presented. The knowledge representation, logic, problem solving approaches, languages, and computers pertaining to AI are examined, and the state of the art in AI is reviewed. The use of AI in expert systems, computer vision, natural language processing, speech recognition and understanding, speech synthesis, problem solving, and planning is examined. Basic AI topics, including automation, search-oriented problem solving, knowledge representation, and computational logic, are discussed.

In the modern era, the increase in the number of shopping malls and industrial buildings has led to an exponential increase in the usage of elevator systems. Thus there is an increased need for an effective control system to manage the elevator system. This paper is aimed at introducing an effective method to control the movement of the elevators by considering various cases wherein the location of each person is found and the elevators are controlled based on various conditions such as load, proximity, etc. This method continuously monitors the weight limit of each elevator while also making use of image processing to determine the number of persons waiting for an elevator on each floor. The Canny edge detection technique is used to find the number of persons waiting for an elevator. Hence the algorithm takes many cases into account and locates the correct elevator to serve the persons waiting on different floors.
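A rough sketch of the image-processing step is shown below; counting Canny edge contours by area is a crude stand-in for the paper's person-counting logic, and the file name, thresholds and minimum blob area are assumptions that would need tuning per camera.

```python
import cv2

# Estimate how many people are waiting in a lobby camera frame by applying
# Canny edge detection and counting sufficiently large edge contours.

def count_waiting_people(frame_path, min_area=1500):
    gray = cv2.imread(frame_path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise FileNotFoundError(frame_path)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    edges = cv2.Canny(blurred, 50, 150)            # Canny edge map
    edges = cv2.dilate(edges, None, iterations=2)  # close gaps in outlines
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # Treat each sufficiently large contour as one waiting person.
    return sum(1 for c in contours if cv2.contourArea(c) >= min_area)

# The dispatcher could weight each floor's hall call by this count together
# with the car load reading before assigning an elevator.
print(count_waiting_people("floor3_lobby.jpg"))  # hypothetical frame path
```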

Previous work suggested that individuals with low working memory capacity may be at a disadvantage in adverse listening environments, including situations with background noise or substantial modification of the acoustic signal. This study explored the relationship between patient factors (including working memory capacity) and intelligibility and quality of modified speech for older individuals with sensorineural hearing loss. The modification was created using a combination of hearing aid processing [wide-dynamic range compression (WDRC) and frequency compression (FC)] applied to sentences in multitalker babble. The extent of signal modification was quantified via an envelope fidelity index. We also explored the contribution of components of working memory by including measures of processing speed and executive function. We hypothesized that listeners with low working memory capacity would perform more poorly than those with high working memory capacity across all situations, and would also be differentially affected by high amounts of signal modification. Results showed a significant effect of working memory capacity for speech intelligibility, and an interaction between working memory, amount of hearing loss and signal modification. Signal modification was the major predictor of quality ratings. These data add to the literature on hearing-aid processing and working memory by suggesting that the working memory-intelligibility effects may be related to aggregate signal fidelity, rather than to the specific signal manipulation. They also suggest that for individuals with low working memory capacity, sensorineural loss may be most appropriately addressed with WDRC and/or FC parameters that maintain the fidelity of the signal envelope. PMID:25999874

Petabytes of remote sensing data are now available from Earth-observing satellites to help measure, understand and forecast changes in the Earth system, but using these data effectively can be surprisingly hard. The volume and variety of data files and formats are daunting. Simple data management activities, such as locating and transferring files, changing file formats, gridding point data, and scaling and reprojecting gridded data, can consume far more personnel time and resources than the actual data analysis. We address this problem by developing a planner-based agent for data production, called IMAGEbot, that takes data product requests as high-level goals and executes the commands needed to produce the requested data products. IMAGEbot is based on automated constraint-based planning and a flexible component-based architecture. Unlike more traditional approaches, where the instruction sequences for managing and processing data are hand-coded, in our agent-based approach the instruction sequences are automatically generated based on user requests and available data sources. New data sources, models or data-processing programs can be added in a plug-and-play fashion, and the planner can adapt to errors or data dropouts by trying alternative ways of achieving the same goal, such as using other, possibly lesser quality, data sources. We have demonstrated this technology in the Terrestrial Observation and Prediction System (TOPS), an ecological forecasting system that assimilates data from Earth-orbiting satellites and ground weather stations to model and forecast conditions on the surface, such as soil moisture, vegetation growth and plant stress. The planner identifies the appropriate input files and sequences of operations needed to satisfy a data request, executes those operations on a remote TOPS server, and displays the results, quickly and reliably. Whereas TOPS is concerned with geospatial data measuring specific variables of the Earth system, such as

The performance of 100 patients with traumatic brain injury (TBI) on the Wechsler Adult Intelligence Scale-Fourth Edition (WAIS-IV) was compared with that of 100 demographically matched neurologically healthy controls. Processing Speed was the only WAIS-IV factor index that was able to discriminate between persons with moderate-severe TBI on the one hand and persons with either less severe TBI or neurologically healthy controls on the other hand. The Processing Speed index also had acceptable sensitivity and specificity when differentiating between patients with TBI who either did or did not have scores in the clinically significant range on the Trail Making Test. It is concluded that WAIS-IV Processing Speed has acceptable clinical utility in the evaluation of patients with moderate-severe TBI but that it should be supplemented with other measures to assure sufficient accuracy in the diagnostic process.

Forensic intelligence has recently gathered increasing attention as a potential expansion of forensic science that may contribute in a wider policing and security context. Whilst the new avenue is certainly promising, relatively few attempts to incorporate models, methods and techniques into practical projects are reported. This work reports a practical application of a generalised and transversal framework for developing forensic intelligence processes, referred to here as the Transversal model, adapted from previous work. Visual features present in the images of four datasets of false identity documents were systematically profiled and compared using image processing for the detection of a series of modus operandi (M.O.) actions. The nature of these series and their relation to the notion of common source was evaluated with respect to alternative known information, and inferences were drawn regarding the respective crime systems. 439 documents seized by police and border guard authorities across 10 jurisdictions in Switzerland with known and unknown source level links formed the datasets for this study. Training sets were developed based on both known source level data and visually supported relationships. Performance was evaluated through the use of intra-variability and inter-variability scores drawn from over 48,000 comparisons. The optimised method exhibited significant sensitivity combined with strong specificity and demonstrates its ability to support forensic intelligence efforts.
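The evaluation idea, pairwise comparison scores split into intra-variability (pairs linked to a common source) and inter-variability (pairs from different sources), can be sketched as follows; the feature vectors and similarity measure are illustrative assumptions, not the profiling method used in the study.

```python
import numpy as np

# Compare document feature profiles pairwise and separate the scores into
# intra-source and inter-source distributions, from which sensitivity and
# specificity trade-offs can be examined.

def similarity(a, b):
    """Cosine-style similarity between two feature profiles."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def score_distributions(profiles, source_labels):
    intra, inter = [], []
    n = len(profiles)
    for i in range(n):
        for j in range(i + 1, n):
            s = similarity(profiles[i], profiles[j])
            (intra if source_labels[i] == source_labels[j] else inter).append(s)
    return intra, inter

profiles = [[1, 0, 3, 2], [1, 0, 3, 1], [0, 4, 1, 0], [0, 4, 0, 1]]
labels = ["A", "A", "B", "B"]  # hypothetical source-level links
intra, inter = score_distributions(profiles, labels)
print(np.mean(intra), np.mean(inter))  # intra scores should sit above inter scores
```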

An intelligent fracture-setting system based on biomechanics and bone fracture therapy can make fracture setting minimally invasive, intelligent and highly efficient. X-ray images grabbed by a C-arm X-ray machine supply the key data for intelligent setting. Processing, analysis and secure transmission of these images are the core of the system. According to the characteristics shown in the three-dimensional gray-level distribution and the frequency spectrum of the image, histogram equalization in the spatial domain and homomorphic filtering in the frequency domain are proposed to enhance contrast and sharpness, respectively. Building on the mined experience of orthopedics experts, setting of a femoral-neck fracture is decomposed into three discrete operations that are reflected in the X-ray images through nine points, six lines, two angles and one distance, and that can be implemented by the mechanical manipulator and control device in the system. A master-slave reference frame is put forward to supply a stable reference standard for calculating parameters. An encryption method based on a chaotic dynamical system is brought forward to ensure image information security in the process of telemedicine intelligent fracture setting. Clinical experience has shown that the system can help orthopedists complete fracture setting correctly and reliably.
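The two enhancement steps named above can be sketched as follows; cv2.equalizeHist provides spatial-domain histogram equalization, while the homomorphic filter is a generic log/FFT high-emphasis implementation whose gains and cutoff are illustrative assumptions rather than the system's parameters, and the input file name is hypothetical.

```python
import cv2
import numpy as np

def homomorphic_filter(gray, gamma_low=0.5, gamma_high=1.8, cutoff=30.0):
    """Frequency-domain sharpening: damp low frequencies (illumination),
    boost high frequencies (detail) on the log image."""
    img = np.log1p(gray.astype(np.float64))
    rows, cols = img.shape
    u = np.fft.fftfreq(rows)[:, None] * rows
    v = np.fft.fftfreq(cols)[None, :] * cols
    d2 = u ** 2 + v ** 2
    # Gaussian high-emphasis transfer function.
    h = (gamma_high - gamma_low) * (1 - np.exp(-d2 / (2 * cutoff ** 2))) + gamma_low
    filtered = np.real(np.fft.ifft2(np.fft.fft2(img) * h))
    out = np.expm1(filtered)
    return cv2.normalize(out, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)

xray = cv2.imread("femoral_neck.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file
contrast_enhanced = cv2.equalizeHist(xray)          # spatial-domain contrast
sharpened = homomorphic_filter(contrast_enhanced)   # frequency-domain sharpness
cv2.imwrite("enhanced.png", sharpened)
```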

Most existing sensing systems are designed as passive, independent observers. They are rarely aware of the phenomena they observe, and are even less likely to be aware of what other sensors are observing within the same environment. Increasingly, intelligent processing of sensor data is taking place in real-time, using computing resources on-board the sensor or the platform itself. One can imagine a sensor network consisting of intelligent and autonomous space-borne, airborne, and ground-based sensors. These sensors will act independently of one another, yet each will be capable of both publishing and receiving sensor information, observations, and alerts among other sensors in the network. Furthermore, these sensors will be capable of acting upon this information, perhaps altering acquisition properties of their instruments, changing the location of their platform, or updating processing strategies for their own observations to provide responsive information or additional alerts. Such autonomous and intelligent sensor networking capabilities provide significant benefits for collections of heterogeneous sensors within any environment. They are crucial for multi-sensor observations and surveillance, where real-time communication with external components and users may be inhibited, and the environment may be hostile. In all environments, mission automation and communication capabilities among disparate sensors will enable quicker response to interesting, rare, or unexpected events. Additionally, an intelligent network of heterogeneous sensors provides the advantage that all of the sensors can benefit from the unique capabilities of each sensor in the network. The University of Alabama in Huntsville (UAH) is developing a unique approach to data processing, integration and mining through the use of the Adaptive On-Board Data Processing (AODP) framework. AODP is a key foundation technology for autonomous internetworking capabilities to support situational awareness by

To address the issues of excessive inventories and increasing procurement lead times, the Navy is actively pursuing flexible computer integrated manufacturing (FCIM) technologies, integrated by communication networks to respond rapidly to its requirements for parts. The Rapid Acquisition of Manufactured Parts (RAMP) program, initiated in 1986, is an integral part of this effort. The RAMP program's goal is to reduce the current average production lead times experienced by the Navy's inventory control points by 90 percent. The manufacturing engineering component of the RAMP architecture utilizes an intelligent processing technology built around a knowledge-based shell provided by ICAD, Inc. Rules and data bases in the software simulate an expert manufacturing planner's knowledge of shop processes and equipment. This expert system can use Product Data Exchange using STEP (PDES) data to determine what features the required part has, what material is required to manufacture it, what machines and tools are needed, and how the part should be held (fixtured) for machining, among other factors. The program's rule base then indicates, for example, how to make each feature, in what order to make it, and to which machines on the shop floor the part should be routed for processing. This information becomes part of the shop work order. The process planning function under RAMP greatly reduces the time and effort required to complete a process plan. Since the PDES file that drives the intelligent processing is 100 percent complete and accurate to start with, the potential for costly errors is greatly diminished.
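The flavour of such feature-to-operation rules can be conveyed with a small sketch; the feature names, operations and work centres below are hypothetical, since the actual RAMP rule base runs inside the ICAD knowledge-based shell and is not reproduced here.

```python
# Hedged sketch of rule-based process planning: expand part features taken
# from a product-data (PDES/STEP-like) description into ordered machining
# operations and a shop routing that becomes part of the work order.

FEATURE_RULES = {
    "through_hole":   [("drill", "CNC-drill-01")],
    "threaded_hole":  [("drill", "CNC-drill-01"), ("tap", "CNC-drill-01")],
    "pocket":         [("rough_mill", "mill-03"), ("finish_mill", "mill-03")],
    "outer_diameter": [("turn", "lathe-02")],
}

def plan_process(part_features):
    """Return (operations, routing) for the listed features."""
    operations, routing = [], []
    for feature in part_features:
        for op, machine in FEATURE_RULES.get(feature, []):
            operations.append((feature, op, machine))
            if machine not in routing:
                routing.append(machine)
    return operations, routing

ops, route = plan_process(["outer_diameter", "threaded_hole", "pocket"])
for step in ops:
    print(step)
print("route part to:", " -> ".join(route))
```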

Aim: To survey person-centered survival rates in a population-based screening program using an intelligent clinical decision support system. Background: Colorectal cancer is the most common malignancy and a major cause of morbidity and mortality throughout the world. Colorectal cancer is the sixth leading cause of cancer death in Iran. In this survey, we used cosine similarity as a data mining technique and an intelligent system for estimating the survival of at-risk groups in the screening plan. Methods: In the first step, we determined the minimum data set (MDS). The MDS was approved by experts and a review of the literature. In the second step, the MDS was coded in the Python language and matched with the cosine similarity formula. Finally, the survival rate, as a percentage, was illustrated in the user interface of the national intelligent system. The national intelligent system was designed in the PyCharm environment. Results: The main data elements of the intelligent system consist of demographic information, age, referral type, risk group, recommendation and survival rate. The minimum data set related to survival comprises clinical status, past medical history and socio-demographic information. Information on the covered population, as a comprehensive database, was connected to the intelligent system and the survival rate was estimated for each patient. The mean survival rates of HNPCC and FAP patients were 77.7% and 75.1%, respectively. The mean survival rate and other calculations are updated in real time as new patients enter the CRC registry. Conclusion: The national intelligent system monitors the entire at-risk group, reports survival rates using electronic guidelines and data mining techniques, and operates according to the clinical process. This web-based software plays a critical role in estimating survival rates for health care planning. PMID:28331566
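
As an illustration of the similarity step described above, the following minimal Python sketch estimates a survival rate from a coded minimum data set using cosine similarity; the feature coding, registry values, and the similarity-weighted averaging are illustrative assumptions, not the published system.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical coded minimum data set for a new patient and for registry
# patients with known survival (values are illustrative only).
new_patient = np.array([1, 0, 1, 62, 2])        # e.g. risk group, referral type, history, age, stage
registry = [
    (np.array([1, 0, 1, 58, 2]), 0.78),         # (coded MDS, observed survival rate)
    (np.array([0, 1, 0, 45, 1]), 0.91),
]

# Estimate survival as a similarity-weighted average over registry cases.
weights = np.array([cosine_similarity(new_patient, v) for v, _ in registry])
survival = np.array([s for _, s in registry])
estimate = float(np.dot(weights, survival) / weights.sum())
print(f"Estimated survival rate: {estimate:.1%}")
```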

In this article, the author provides a framework to guide research in emotional intelligence. Studies conducted up to the present bear on a conception of emotional intelligence as pertaining to the domain of consciousness and investigate the construct with a correlational approach. As an alternative, the author explores processes underlying emotional intelligence, introducing the distinction between conscious and automatic processing as a potential source of variability in emotionally intelligent behavior. Empirical literature is reviewed to support the central hypothesis that individual differences in emotional intelligence may be best understood by considering the way individuals automatically process emotional stimuli. Providing directions for research, the author encourages the integration of experimental investigation of processes underlying emotional intelligence with correlational analysis of individual differences and fosters the exploration of the automaticity component of emotional intelligence.

The incorporation of information-processing technology into analytical systems in the form of standard computing software has recently been advanced by the introduction of artificial intelligence (AI), both as expert systems and as neural networks. This paper considers the role of software in system operation, control and automation, and attempts to define intelligence. AI is characterized by its ability to deal with incomplete and imprecise information and to accumulate knowledge. Expert systems, building on standard computing techniques, depend heavily on the domain experts and knowledge engineers that have programmed them to represent the real world. Neural networks are intended to emulate the pattern-recognition and parallel processing capabilities of the human brain and are taught rather than programmed. The future may lie in a combination of the recognition ability of the neural network and the rationalization capability of the expert system. In the second part of the paper, examples are given of applications of AI in stand-alone systems for knowledge engineering and medical diagnosis and in embedded systems for failure detection, image analysis, user interfacing, natural language processing, robotics and machine learning, as related to clinical laboratories. It is concluded that AI constitutes a collective form of intellectual property, and that there is a need for better documentation, evaluation and regulation of the systems already being used in clinical laboratories. PMID:18924784

Intelligence is the ability to learn from experience and to adapt to, shape, and select environments. Intelligence as measured by (raw scores on) conventional standardized tests varies across the lifespan, and also across generations. Intelligence can be understood in part in terms of the biology of the brain—especially with regard to the functioning in the prefrontal cortex—and also correlates with brain size, at least within humans. Studies of the effects of genes and environment suggest that the heritability coefficient (ratio of genetic to phenotypic variation) is between .4 and .8, although heritability varies as a function of socioeconomic status and other factors. Racial differences in measured intelligence have been observed, but race is a socially constructed rather than biological variable, so such differences are difficult to interpret. PMID:22577301
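
For reference, the heritability coefficient mentioned in this abstract can be written, in its broad-sense form, as the ratio of genetic to total phenotypic variance:

```latex
% Broad-sense heritability: genetic variance as a share of phenotypic variance.
H^2 = \frac{\sigma^2_G}{\sigma^2_P}, \qquad \sigma^2_P = \sigma^2_G + \sigma^2_E
```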

The information contained in a sensory signal plays a critical role in determining what neural processes are engaged. Here we used interleaved silent steady-state (ISSS) functional magnetic resonance imaging (fMRI) to explore how human listeners cope with different degrees of acoustic richness during auditory sentence comprehension. Twenty-six healthy young adults underwent scanning while hearing sentences that varied in acoustic richness (high vs. low spectral detail) and syntactic complexity (subject-relative vs. object-relative center-embedded clause structures). We manipulated acoustic richness by presenting the stimuli as unprocessed full-spectrum speech, or noise-vocoded with 24 channels. Importantly, although the vocoded sentences were spectrally impoverished, all sentences were highly intelligible. These manipulations allowed us to test how intelligible speech processing was affected by orthogonal linguistic and acoustic demands. Acoustically rich speech showed stronger activation than acoustically less-detailed speech in a bilateral temporoparietal network with more pronounced activity in the right hemisphere. By contrast, listening to sentences with greater syntactic complexity resulted in increased activation of a left-lateralized network including left posterior lateral temporal cortex, left inferior frontal gyrus, and left dorsolateral prefrontal cortex. Significant interactions between acoustic richness and syntactic complexity occurred in left supramarginal gyrus, right superior temporal gyrus, and right inferior frontal gyrus, indicating that the regions recruited for syntactic challenge differed as a function of acoustic properties of the speech. Our findings suggest that the neural systems involved in speech perception are finely tuned to the type of information available, and that reducing the richness of the acoustic signal dramatically alters the brain's response to spoken language, even when intelligibility is high.

Ultrasound examination (US) plays a key role in the diagnosis and management of patients with clinically suspected appendicitis, which is the most common abdominal surgical emergency. Among the various sonographic findings of appendicitis, the outer diameter of the appendix is most important. Therefore, clear delineation of the appendix on US images is essential. In this paper, we propose a new intelligent method to extract the appendix automatically from abdominal sonographic images as a basic building block in developing such an intelligent tool for medical practitioners. Knowing that the appendix is located in the lower organ area below the bottom fascia line, we apply a series of image processing techniques to find the fascia line correctly. We then apply the fuzzy ART learning algorithm to the organ area in order to extract the appendix accurately. The experiment verifies that the proposed method is highly accurate (successful in 38 out of 40 cases) in extracting the appendix. PMID:26089963
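
A minimal sketch of the fuzzy ART learning step that such a method might use to group feature vectors from the organ area is shown below; the feature encoding, parameter values, and the simplified category search are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def fuzzy_art(inputs, rho=0.75, alpha=0.001, beta=1.0):
    """Minimal fuzzy ART clustering: inputs are feature vectors scaled to [0, 1].
    Note: the category search is simplified (vigilance checked inline, no reset loop)."""
    categories = []                                   # learned weight vectors
    labels = []
    for x in inputs:
        i = np.concatenate([x, 1.0 - x])              # complement coding
        best, best_t = None, -1.0
        for j, w in enumerate(categories):
            t = np.minimum(i, w).sum() / (alpha + w.sum())    # choice function
            match = np.minimum(i, w).sum() / i.sum()          # vigilance test
            if t > best_t and match >= rho:
                best, best_t = j, t
        if best is None:                              # no category passed vigilance
            categories.append(i.copy())
            best = len(categories) - 1
        else:                                         # fast/partial learning update
            w = categories[best]
            categories[best] = beta * np.minimum(i, w) + (1.0 - beta) * w
        labels.append(best)
    return labels, categories

# Illustrative use on random feature vectors standing in for organ-area pixels.
rng = np.random.default_rng(0)
labels, cats = fuzzy_art(rng.random((20, 3)))
print(len(cats), "categories found")
```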

The method developed by the DFKI-IFS for extending the functionality of intelligent field devices through the use of reloadable software applications (so-called Apps) is to be further augmented with a methodology and communication concept for process orchestration. The concept allows individual Apps from different manufacturers to decentrally share information. This way of communicating forms the basis for the dynamic orchestration of Apps to complete processes, in that it allows the actions of one App (e.g. detecting a component part with a sensor App) to trigger reactions in other Apps (e.g. triggering the processing of that component part). A holistic methodology and its implementation as a configuration tool allows one to model the information flow between Apps, as well as automatically introduce it into physical production hardware via available interfaces provided by the Field Device Middleware. Consequently, configuring industrial facilities is made simpler, resulting in shorter changeover and shutdown times.

The purpose of this thesis was to examine the employment and adherence of the intelligence cycle process model within the National Network of Fusion...Centers and the greater Homeland Security Enterprise by exploring the customary intelligence cycle process model established by the United States...training program that ensures consistent and clear intelligence cycle process model employment. Finally, this thesis offers an overview pertinent to

Our work has been concerned with the construction of intelligent systems for production management and control. This paper focuses on the reactive...floor, identifying deviations from predicted production schedules, and intelligent schedule repair. Keywords include: Intelligent systems and Production management .

We developed a new machine learning-based method in order to facilitate the manufacturing processes of pharmaceutical products, such as tablets, in accordance with the Process Analytical Technology (PAT) and Quality by Design (QbD) initiatives. Our approach combines the data, available from prior production runs, with machine learning algorithms that are assisted by a human operator with expert knowledge of the production process. The process parameters encompass those that relate to the attributes of the precursor raw materials and those that relate to the manufacturing process itself. During manufacturing, our method allows the production operator to inspect the impacts of various settings of process parameters within their proven acceptable range with the purpose of choosing the most promising values in advance of the actual batch manufacture. The interaction between the human operator and the artificial intelligence system provides improved performance and quality. We successfully implemented the method on data provided by a pharmaceutical company for a particular product, a tablet, under development. We tested the accuracy of the method in comparison with some other machine learning approaches. The method is especially suitable for analyzing manufacturing processes characterized by a limited amount of data.
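
The paragraph above suggests a workflow of fitting a model on prior production runs and scoring candidate settings inside the proven acceptable range; a hedged sketch of that idea follows. The model choice (random forest), parameter names, ranges, and data values are invented for illustration and are not the paper's method.

```python
from itertools import product

import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical historical runs: process parameters -> measured quality attribute.
# Columns: compression force [kN], granulation moisture [%], mixing time [min].
X_hist = np.array([[12, 2.1, 10], [15, 2.5, 12], [18, 3.0, 15], [14, 2.2, 11], [16, 2.8, 14]])
y_hist = np.array([0.82, 0.88, 0.79, 0.85, 0.90])   # e.g. dissolution score

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_hist, y_hist)

# Candidate settings inside the proven acceptable ranges (illustrative bounds).
force = np.linspace(12, 18, 4)
moisture = np.linspace(2.0, 3.0, 3)
mix_time = np.linspace(10, 15, 3)
candidates = np.array(list(product(force, moisture, mix_time)))

pred = model.predict(candidates)
best = candidates[np.argmax(pred)]
print("Most promising setting (force, moisture, time):", best, "predicted quality:", pred.max())
```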

There has been an increasing body of research to uncover the relationship between creativity and intelligence. This relationship usually has been examined using traditional measures of intelligence and seldom using new approaches (i.e. Ferrando et al. 2005). In this work, creativity is measured by tools developed based on Sternberg's successful…

The Department of Energy (DOE) uses intelligent processing equipment (IPE) technologies to conduct research and development and manufacturing for energy and nuclear weapons programs. This paper highlights several significant IPE efforts underway in DOE. IPE technologies are essential to the accomplishment of DOE's missions, because of the need for small lot production, precision, and accuracy in manufacturing, hazardous waste management, and protection of the environment and the safety and health of the workforce and public. Applications of IPE technologies include environmental remediation and waste handling, advanced manufacturing, and automation of tasks carried out in hazardous areas. DOE laboratories have several key programs that integrate robotics, sensor, and control technologies. These programs embody a considerable technical capability that also may be used to enhance U.S. industrial competitiveness. DOE encourages closer cooperation with U.S. industrial partners based on mutual benefits. This paper briefly describes technology transfer mechanisms available for industrial involvement.

Darwin argued that between-species differences in intelligence were differences of degree, not of kind. The contemporary ecological approach to animal cognition argues that animals have evolved species-specific and problem-specific processes to solve problems associated with their particular ecological niches: thus different species use different processes, and within a species, different processes are used to tackle problems involving different inputs. This approach contrasts both with Darwin's view and with the general process view, according to which the same central processes of learning and memory are used across an extensive range of problems involving very different inputs. We review evidence relevant to the claim that the learning and memory performance of non-human animals varies according to the nature of the stimuli involved. We first discuss the resource distribution hypothesis, olfactory learning-set formation, and the 'biological constraints' literature, but find no convincing support from these topics for the ecological account of cognition. We then discuss the claim that the performance of birds in spatial tasks of learning and memory is superior in species that depend heavily upon stored food compared to species that either show less dependence upon stored food or do not store food. If it could be shown that storing species enjoy a superiority specifically in spatial (and not non-spatial) tasks, this would argue that spatial tasks are indeed solved using different processes from those used in non-spatial tasks. Our review of this literature does not find a consistent superiority of storing over non-storing birds in spatial tasks, and, in particular, no evidence of enhanced superiority of storing species when the task demands are increased, by, for example, increasing the number of items to be recalled or the duration of the retention period. We discuss also the observation that the hippocampus of storing birds is larger than that of non

In this study, an artificial intelligence technique is proposed for the parameter tuning of a PVD process. Due to its previous application to similar optimization problems, a genetic algorithm (GA) is selected to optimize the parameter tuning of the RF magnetron sputtering process. The most optimized parameter combination obtained from the GA is expected to produce the desired zinc oxide (ZnO) thin film from the sputtering process. The parameters involved in this study were RF power, deposition time and substrate temperature. The algorithm was tested on 25 datasets of parameter combinations. The results from the computational experiment were then compared with the actual results from the laboratory experiment. Based on the comparison, GA proved reliable for optimizing the parameter combination before tuning the RF magnetron sputtering machine. To verify the GA results, the algorithm was also compared with other well-known optimization algorithms, namely particle swarm optimization (PSO) and the gravitational search algorithm (GSA). The results showed that GA was reliable in solving this RF magnetron sputtering parameter tuning problem and achieved better optimization accuracy based on the fitness evaluation.
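
A minimal GA sketch over the three parameters named above (RF power, deposition time, substrate temperature) might look as follows; the bounds, the surrogate fitness function, and the GA settings are illustrative assumptions rather than the study's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Parameter bounds (illustrative): RF power [W], deposition time [min], substrate temp [degC].
BOUNDS = np.array([[100.0, 400.0], [10.0, 120.0], [25.0, 300.0]])

def fitness(p):
    """Hypothetical surrogate for thin-film quality; in practice this would be
    a model fitted to the experimental parameter combinations."""
    power, time, temp = p
    return -(power - 250) ** 2 / 1e4 - (time - 60) ** 2 / 1e3 - (temp - 200) ** 2 / 1e4

def genetic_algorithm(pop_size=30, generations=50, mutation=0.1):
    pop = rng.uniform(BOUNDS[:, 0], BOUNDS[:, 1], size=(pop_size, 3))
    for _ in range(generations):
        scores = np.array([fitness(p) for p in pop])
        # Tournament selection of parents.
        idx = [max(rng.choice(pop_size, 2, replace=False), key=lambda i: scores[i])
               for _ in range(pop_size)]
        parents = pop[idx]
        # Uniform crossover with a shifted copy of the parents, then Gaussian mutation.
        mask = rng.random((pop_size, 3)) < 0.5
        children = np.where(mask, parents, np.roll(parents, 1, axis=0))
        children += rng.normal(0, mutation * (BOUNDS[:, 1] - BOUNDS[:, 0]), children.shape)
        pop = np.clip(children, BOUNDS[:, 0], BOUNDS[:, 1])
    return pop[np.argmax([fitness(p) for p in pop])]

print("Suggested (power, time, temperature):", genetic_algorithm())
```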

The Writing Pal is an intelligent tutoring system that provides writing strategy training. A large part of its artificial intelligence resides in the natural language processing algorithms to assess essay quality and guide feedback to students. Because writing is often highly nuanced and subjective, the development of these algorithms must consider a broad array of linguistic, rhetorical, and contextual features. This study assesses the potential for computational indices to predict human ratings of essay quality. Past studies have demonstrated that linguistic indices related to lexical diversity, word frequency, and syntactic complexity are significant predictors of human judgments of essay quality but that indices of cohesion are not. The present study extends prior work by including a larger data sample and an expanded set of indices to assess new lexical, syntactic, cohesion, rhetorical, and reading ease indices. Three models were assessed. The model reported by McNamara, Crossley, and McCarthy (Written Communication 27:57-86, 2010) including three indices of lexical diversity, word frequency, and syntactic complexity accounted for only 6% of the variance in the larger data set. A regression model including the full set of indices examined in prior studies of writing predicted 38% of the variance in human scores of essay quality with 91% adjacent accuracy (i.e., within 1 point). A regression model that also included new indices related to rhetoric and cohesion predicted 44% of the variance with 94% adjacent accuracy. The new indices increased accuracy but, more importantly, afford the means to provide more meaningful feedback in the context of a writing tutoring system.
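
As a rough sketch of how such indices can be regressed onto human ratings and evaluated with adjacent accuracy, consider the following Python example; the data are synthetic and the six-index feature set is only a stand-in for the indices used in the study.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)

# Hypothetical data: rows are essays, columns are computational indices
# (lexical diversity, word frequency, syntactic complexity, cohesion, rhetoric, ...).
X = rng.normal(size=(200, 6))
human_scores = np.clip(np.round(3 + X @ np.array([0.5, -0.3, 0.4, 0.2, 0.3, 0.1])
                                + rng.normal(0, 0.8, 200)), 1, 6)

model = LinearRegression().fit(X, human_scores)
pred = model.predict(X)

r2 = model.score(X, human_scores)                                # variance explained
adjacent = np.mean(np.abs(np.round(pred) - human_scores) <= 1)   # within 1 point of human score
print(f"R^2 = {r2:.2f}, adjacent accuracy = {adjacent:.0%}")
```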

The Java Intelligent Tutoring System (JITS) was designed and developed to support the growing trend of Java programming around the world. JITS is an advanced web-based personalized tutoring system that is unique in several ways. Most programming Intelligent Tutoring Systems require the teacher to author problems with corresponding solutions. JITS,…

The computerization of patient data is proceeding and the amount of patient records has greatly increased. However, physicians have limited time to review and process patient records. In order to use these records effectively, medical institutions need to access the information in a variety of forms. This paper comparatively describes two intelligent viewers, SAKURA-Viewer and FUJI-Viewer, which reorganize order-related data. SAKURA-Viewer is based on a concept hierarchy method and focuses on a consolidated view of information. This viewer presents order history from two viewpoints simultaneously to eliminate semantic redundancies. FUJI-Viewer is based on a two-dimensional mapping method and focuses on the flow of periodic information. This viewer presents differences between the plan history and the order history so that a long-term test order history and test plan history can be managed concurrently. These viewers also support data entry methods that input order-related data efficiently and accurately. This interface reduces the workload of the medical staff. These intelligent viewers are incorporated into a clinical information system.

Ambient Intelligence (AmI) provides extended but unobtrusive sensing and computing devices and ubiquitous networking for human/environment interaction. It is a new paradigm in information technology compliant with the international Integrating Healthcare Enterprise board (IHE) and eHealth HL7 technological standards in the functional integration of biomedical domotics and informatics in hospital and home care. AmI allows real-time automatic recording of biological/medical information and environmental data. It is extensively applicable to patient monitoring, medicine and neuroscience research, which require large biomedical data sets; for example, in the study of spontaneous or condition-dependent variability or chronobiology. In this respect, AmI is equivalent to a traditional laboratory for data collection and processing, with minimal dedicated equipment, staff, and costs; it benefits from the integration of artificial intelligence technology with traditional/innovative sensors to monitor clinical or functional parameters. A prototype AmI platform (MIMERICA*) has been implemented and is operated in a semi-intensive unit for the vegetative and minimally conscious states, to investigate the spontaneous or environment-related fluctuations of physiological parameters in these conditions.

Some of the 12 conference papers presented in this proceedings focus on the present and potential capabilities of artificial intelligence and expert systems as they relate to a wide range of library applications, including descriptive cataloging, technical services, collection development, subject indexing, reference services, database searching,…

In the present study, the relationship between performance on temporal and pitch discrimination and psychometric intelligence was investigated in a sample of 164 participants by means of an experimental dissociation paradigm. Performance on both temporal and pitch discrimination was substantially related to psychometric intelligence (r = .43 and r = .39). Regression analysis and structural equation modeling suggested that both psychophysical domains can be considered as valid predictors of psychometric intelligence. Both predictor variables contributed substantial portions of both shared and unique variance to the prediction of individual differences in psychometric intelligence. Thus, the present study yielded further evidence for a functional relationship between psychometric intelligence and temporal as well as pitch discrimination acuity. Finally, the findings are consistent with the notion that temporal discrimination, in addition to general aspects of sensory discrimination shared with pitch discrimination, reflects specific intelligence-related aspects of neural information processing.

The aim of the study was to determine whether the general intelligence, cognitive processes, school achievement, and intelligence-achievement relationship of adolescents with subclinical levels of obsessive-compulsive symptoms differed from those of their normal counterparts. From an initial large pool of 14-year-old Bengali students in eighth…

In this paper, we describe an incrementally generated fuzzy neural network (FNN) for intelligent data processing. This FNN combines the features of initial fuzzy model self-generation, fast input selection, partition validation, parameter optimization and rule-base simplification. A small FNN is created from scratch: there is no need to specify the initial network architecture, initial membership functions, or initial weights. Fuzzy IF-THEN rules are constantly combined and pruned to minimize the size of the network while maintaining accuracy; irrelevant inputs are detected and deleted, and membership functions and network weights are trained with a gradient descent algorithm, i.e., error backpropagation. Experimental studies on synthesized data sets demonstrate that the proposed fuzzy neural network is able to achieve accuracy comparable to or higher than both a feedforward crisp neural network, i.e., NeuroRule, and a decision tree, i.e., C4.5, with more compact rule bases for most of the data sets used in our experiments. The FNN has achieved outstanding results for cancer classification based on microarray data. An excellent classification result for the Small Round Blue Cell Tumors (SRBCT) data set is shown. Compared with other published methods, we used far fewer genes to achieve perfect classification, which will help researchers focus their attention directly on specific genes and may lead to the discovery of the underlying mechanisms of cancer development and of new drugs.
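
A heavily simplified sketch of gradient-descent training of fuzzy rules is given below; it trains only rule consequents over fixed Gaussian membership functions and omits the incremental rule generation, pruning, and input selection described in the abstract. All data and parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy data: two inputs in [0, 1], binary target.
X = rng.random((100, 2))
y = (X[:, 0] + X[:, 1] > 1.0).astype(float)

# Fixed Gaussian membership functions (two per input); only the rule
# consequents are trained here, by gradient descent on squared error.
centers = np.array([0.25, 0.75])
sigma = 0.2
n_rules = 4                                    # all combinations of memberships

def firing_strengths(x):
    m = np.exp(-((x[:, None] - centers) ** 2) / (2 * sigma ** 2))   # shape: (2 inputs, 2 MFs)
    w = np.array([m[0, i] * m[1, j] for i in range(2) for j in range(2)])
    return w / w.sum()

consequents = np.zeros(n_rules)
lr = 0.5
for _ in range(200):                           # plain gradient descent on 0.5*(y - pred)^2
    for xi, yi in zip(X, y):
        w = firing_strengths(xi)
        pred = w @ consequents
        consequents += lr * (yi - pred) * w

preds = np.array([firing_strengths(xi) @ consequents for xi in X])
print("training accuracy:", np.mean((preds > 0.5) == (y > 0.5)))
```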

BJC HealthCare (BJC) uses a number of industry standard indicators to monitor the quality of services provided by each of its hospitals. By establishing an enterprise data warehouse as a central repository of clinical quality information, BJC is able to monitor clinical quality performance in a timely manner and improve clinical outcomes.

The purpose of this study was to explore self-efficacy, an individual's beliefs about his or her ability to perform a series of tasks, and emotional intelligence, an individual's ability to perceive, use, understand, and manage emotions, as predictors for successful clinical performance in nursing students. The participants were 49 female and 7…

The final goal of this project was the development of a system that is capable of controlling an industrial process effectively through the integration of information obtained through intelligent sensor fusion and intelligent control technologies. The industry of interest in this project was the metal casting industry as represented by cupola iron-melting furnaces. However, the developed technology is of a generic type and hence applicable to several other industries. The system was divided into the following four major interacting components: (1) an object-oriented generic architecture to integrate the developed software and hardware components; (2) generic algorithms for intelligent signal analysis and sensor and model fusion; (3) development of a supervisory structure for integration of intelligent sensor fusion data into the controller; and (4) hardware implementation of intelligent signal analysis and fusion algorithms.

Clinical cases are primary and vital evidence for Traditional Chinese Medicine (TCM) clinical research. A great deal of medical knowledge is hidden in the clinical cases of highly experienced TCM practitioners. With a deep Chinese cultural background and years of clinical experience, an experienced TCM specialist usually has his or her own unique clinical pattern and diagnostic approach. Preserving the vast body of clinical cases of experienced TCM practitioners, as well as exploring the knowledge inherent in them, is therefore an important but arduous task. The novel system ISMAC (Intelligent System for Management and Analysis of Clinical Cases in TCM) is designed and implemented for customized management and intelligent analysis of TCM clinical data. Customized templates with standard and expert-standard symptoms, diseases, syndromes, and Chinese Medicine Formulas (CMF) are constructed in ISMAC, according to the clinical diagnosis and treatment characteristics of each TCM specialist. With these templates, clinical cases are archived in order to maintain their original characteristics. Various data analysis and mining methods, grouped as Basic Analysis, Association Rule, Feature Reduction, Cluster, Pattern Classification, and Pattern Prediction, are implemented in the system. With a flexible dataset retrieval mechanism, ISMAC is a powerful and convenient system for clinical case analysis and clinical knowledge discovery. PMID:26495425

Diagrams are presented on an intelligent model based system for the design and validation of welding processes. Key capabilities identified include `right the first time` manufacturing, continuous improvement, and on-line quality assurance.

Neural networks are a relatively new artificial intelligence technique that emulates the behavior of biological neural systems in digital software or hardware. These networks can "learn," automatically, complex relationships among data. This feature makes the technique very useful in modeling processes for which mathematical modeling is difficult or impossible. The work described here outlines some examples of the application of neural networks with artificial intelligence technique in the modeling of industrial processes.

Auditory processing disorders (APDs) are of interest to educators and clinicians, as they impact school functioning. Little work has been completed to demonstrate how children with APDs perform on clinical tests. In a series of studies, standard clinical (psychometric) tests from the Wechsler Intelligence Scale for Children, Fourth Edition…

To demonstrate to computer programmers that the programming language Ada provides superior facilities for use in artificial intelligence applications, the three papers included in this report investigate the capabilities that exist within Ada for "pattern-directed" programming. The first paper (Larry H. Reeker, Tulane University) is…

Emotional intelligence has emerged as a key factor in differentiating average from outstanding performers in managerial and leadership positions across multiple business settings, but relatively few studies have examined the role of emotional intelligence in the health care professions. The purpose of this study was to examine the relationship between emotional intelligence (EI) and dental student clinical performance. All third- and fourth-year students at a single U.S. dental school were invited to participate. Participation rate was 74 percent (100/136). Dental students' EI was assessed using the Emotional Competence Inventory-University version (ECI-U), a seventy-two-item, 360-degree questionnaire completed by both self and other raters. The ECI-U measured twenty-two EI competencies grouped into four clusters (Self-Awareness, Self-Management, Social Awareness, and Relationship Management). Clinical performance was assessed using the mean grade assigned by clinical preceptors. This grade represents an overall assessment of a student's clinical performance including diagnostic and treatment planning skills, time utilization, preparation and organization, fundamental knowledge, technical skills, self-evaluation, professionalism, and patient management. Additional variables were didactic grade point average (GPA) in Years 1 and 2, preclinical GPA in Years 1 and 2, Dental Admission Test academic average and Perceptual Ability Test scores, year of study, age, and gender. Multiple linear regression analyses were conducted. The Self-Management cluster of competencies (b=0.448, p<0.05) and preclinical GPA (b=0.317, p<0.01) were significantly correlated with mean clinical grade. The Self-Management competencies were emotional self-control, achievement orientation, initiative, trustworthiness, conscientiousness, adaptability, and optimism. In this sample, dental students' EI competencies related to Self-Management were significant predictors of mean clinical grade

We examined life span changes in 5 domains of cognitive performance as predictive of mortality risk. Data came from the Manchester Longitudinal Study of Cognition, a 20-plus-year investigation of 6,203 individuals ages 42-97 years. Cognitive domains were general crystallized intelligence, general fluid intelligence, verbal memory, visuospatial memory, and processing speed. Life span decrements were evident across these domains, controlling for baseline performance at age 70 and adjusting for retest effects. Survival analyses stratified by sex and conducted independently by cognitive domain showed that lower baseline performance levels in all domains-and larger life span decrements in general fluid intelligence and processing speed-were predictive of increased mortality risk for both women and men. Critically, analyses of the combined predictive power of cognitive performance variables showed that baseline levels of processing speed (in women) and general fluid intelligence (in men), and decrements in processing speed (in women and in men) and general fluid intelligence (in women), accounted for most of the explained variation in mortality risk. In light of recent evidence from brain-imaging studies, we speculate that cognitive abilities closely linked to cerebral white matter integrity (such as processing speed and general fluid intelligence) may represent particularly sensitive markers of mortality risk. In addition, we presume that greater complexity in cognition-survival associations observed in women (in analyses incorporating all cognitive predictors) may be a consequence of longer and more variable cognitive declines in women relative to men.
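
Analyses of this kind are commonly carried out with Cox proportional hazards models; the sketch below, using the lifelines library on simulated data with invented column names, illustrates the general form rather than the study's actual models.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 300

# Hypothetical data set: baseline cognitive scores, life span decrements,
# follow-up time, and a death indicator (all values simulated for illustration).
df = pd.DataFrame({
    "processing_speed_baseline": rng.normal(0, 1, n),
    "fluid_intelligence_baseline": rng.normal(0, 1, n),
    "processing_speed_decrement": rng.normal(0, 1, n),
    "followup_years": rng.exponential(10, n),
    "died": rng.integers(0, 2, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="died")
cph.print_summary()    # hazard ratios for each cognitive predictor
```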

This research examined if mothers' day-to-day praise of children's success in school plays a role in children's theory of intelligence and motivation. Participants were 120 children (mean age = 10.23 years) and their mothers who took part in a 2-wave study spanning 6 months. During the first wave, mothers completed a 10-day daily interview in which they reported on their use of person (e.g., "You are smart") and process (e.g., "You tried hard") praise. Children's entity theory of intelligence and preference for challenge in school were assessed with surveys at both waves. Mothers' person, but not process, praise was predictive of children's theory of intelligence and motivation: The more person praise mothers used, the more children subsequently held an entity theory of intelligence and avoided challenge over and above their earlier functioning on these dimensions.

Clinical trials usually have a demand to collect, track and analyze multimedia data according to the workflow. Currently, the clinical trial data management requirements are normally addressed with custom-built systems. Challenges occur in the workflow design within different trials. The traditional pre-defined custom-built system is usually limited to a specific clinical trial and normally requires time-consuming and resource-intensive software development. To provide a solution, we present a user customizable imaging informatics-based intelligent workflow engine system for managing stroke rehabilitation clinical trials with intelligent workflow. The intelligent workflow engine provides flexibility in building and tailoring the workflow in various stages of clinical trials. By providing a solution to tailor and automate the workflow, the system will save time and reduce errors for clinical trials. Although our system is designed for clinical trials for rehabilitation, it may be extended to other imaging based clinical trials as well.

The incorporation of information-processing technology into analytical systems in the form of standard computing software has recently been advanced by the introduction of artificial intelligence (AI) both as expert systems and as neural networks. This paper considers the role of software in system operation, control and automation and attempts to define intelligence. AI is characterized by its ability to deal with incomplete and imprecise information and to accumulate knowledge. Expert systems, building on standard computing techniques, depend heavily on the domain experts and knowledge engineers that have programmed them to represent the real world. Neural networks are intended to emulate the pattern-recognition and parallel-processing capabilities of the human brain and are taught rather than programmed. The future may lie in a combination of the recognition ability of the neural network and the rationalization capability of the expert system. In the second part of this paper, examples are given of applications of AI in stand-alone systems for knowledge engineering and medical diagnosis and in embedded systems for failure detection, image analysis, user interfacing, natural language processing, robotics and machine learning, as related to clinical laboratories. It is concluded that AI constitutes a collective form of intellectual property and that there is a need for better documentation, evaluation and regulation of the systems already being used widely in clinical laboratories.

The purpose of this study is to explore intelligence processes and procedures as they apply to organizational learning in higher education settings. This exploration seeks to identify key components and processes in higher education institutions that were previously identified in the research as important and integral to the discipline of…

Background: Many people with mobility impairments, who require the use of powered wheelchairs, have difficulty completing basic maneuvering tasks during their activities of daily living (ADL). In order to provide assistance to this population, robotic and intelligent system technologies have been used to design an intelligent powered wheelchair (IPW). This paper provides a comprehensive overview of the design and validation of the IPW. Methods: The main contributions of this work are three-fold. First, we present a software architecture for robot navigation and control in constrained spaces. Second, we describe a decision-theoretic approach for achieving robust speech-based control of the intelligent wheelchair. Third, we present an evaluation protocol motivated by a meaningful clinical outcome, in the form of the Robotic Wheelchair Skills Test (RWST). This allows us to perform a thorough characterization of the performance and safety of the system, involving 17 test subjects (8 non-PW users, 9 regular PW users), 32 complete RWST sessions, 25 total hours of testing, and 9 kilometers of total running distance. Results: User tests with the RWST show that the navigation architecture reduced collisions by more than 60% compared to other recent intelligent wheelchair platforms. On the tasks of the RWST, we measured an average decrease of 4% in performance score and 3% in safety score (not statistically significant), compared to the scores obtained with the conventional driving model. This analysis was performed with regular users that had over 6 years of wheelchair driving experience, compared to approximately one half-hour of training with the autonomous mode. Conclusions: The platform tested in these experiments is among the most experimentally validated robotic wheelchairs in realistic contexts. The results establish that proficient powered wheelchair users can achieve the same level of performance with the intelligent command mode as with the conventional command mode.

The use of computers has been growing rapidly in manufacturing since computer technology started to be used in machine controls. However, due to historical reasons, these technologies are isolated and their information cannot be shared. For example, to use information from the CAD system for machining, a human expert engineer has to interpret the CAD meanings and then enter them into a CAM format. Integration of these manufacturing functions requires a technology breakthrough in the automation of manufacturing technology. This paper describes the development of an integrated web-based CIM environment called J-MOPS which can intelligently transform CAD information into machine programs. The use of Java significantly enhanced the system with the advantages of robustness, object orientation, network orientation, and platform independence.

This article addresses the practical use of intelligent relays and PLC systems, their architecture, and the principles of programming and simulation for educational purposes at all levels of school, from secondary schools to universities. The aim of the article is to propose simple example applications that demonstrate a programming methodology on realistic exercises and illustrate the use of selected instructions. The practical part describes the process of creating schematics and defining function blocks, along with methodologies for building programs and simulating output reactions to changing inputs for intelligent relays.

Background: Clinical Intelligence, as a research and engineering discipline, is dedicated to the development of tools for data analysis for the purposes of clinical research, surveillance, and effective health care management. Self-service ad hoc querying of clinical data is one desirable type of functionality. Since most of the data are currently stored in relational or similar form, ad hoc querying is problematic as it requires specialised technical skills and the knowledge of particular data schemas. Results: A possible solution is semantic querying where the user formulates queries in terms of domain ontologies that are much easier to navigate and comprehend than data schemas. In this article, we are exploring the possibility of using SADI Semantic Web services for semantic querying of clinical data. We have developed a prototype of a semantic querying infrastructure for the surveillance of, and research on, hospital-acquired infections. Conclusions: Our results suggest that SADI can support ad hoc, self-service, semantic queries of relational data in a Clinical Intelligence context. The use of SADI compares favourably with approaches based on declarative semantic mappings from data schemas to ontologies, such as query rewriting and RDFizing by materialisation, because it can easily cope with situations when (i) some computation is required to turn relational data into RDF or OWL, e.g., to implement temporal reasoning, or (ii) integration with external data sources is necessary. PMID:23497556
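
The kind of ontology-level query such an infrastructure enables can be illustrated with plain rdflib and SPARQL (rather than SADI services); the file name, namespace, and ontology terms below are hypothetical.

```python
from rdflib import Graph

# Hypothetical RDF export of infection-surveillance records; the ontology terms
# (ex:Patient, ex:hasInfection, ex:admittedTo) are illustrative only.
g = Graph()
g.parse("hospital_infections.ttl", format="turtle")

query = """
PREFIX ex: <http://example.org/clinical#>
SELECT ?ward (COUNT(?patient) AS ?cases)
WHERE {
    ?patient a ex:Patient ;
             ex:hasInfection ex:HospitalAcquiredInfection ;
             ex:admittedTo ?ward .
}
GROUP BY ?ward
"""

# Count hospital-acquired infections per ward in domain terms, not table/column names.
for ward, cases in g.query(query):
    print(ward, cases)
```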

One hundred and sixteen third-year dental students participating in a consultation skills course in Dunedin, New Zealand, completed a standardized psychometric Social Skills Inventory (SSI) and were assessed by tutors, simulated patients, and themselves. Students with higher social skills abilities obtained higher performance scores and demonstrated better interview structure. Patients reported being more likely to return to students for a dental consultation following the second interview, and students' consultation skills were rated (by tutors, patients, and students) higher at the end of the course than the beginning. Female students had higher global social skills abilities and were more emotionally expressive and sensitive than male students, while the latter had better emotional control. Female students performed better in the first interview than male students, but there was no significant gender difference in the second interview. Tutor and simulated patient ratings suggested that a consultation skills course can increase the ability of students in general, and English as a second language students in particular, to relate to their patients, manage anxiety, identify ethical issues, and recognize significant psychosocial issues that lead to more accurate diagnosis and treatment processes, ensuring the effective delivery of patient-centered dental education.

Reviews the evolution of competitive intelligence since 1994, including terminology and definitions and analytical techniques. Addresses the issue of ethics; explores how information technology supports the competitive intelligence process; and discusses education and training opportunities for competitive intelligence, including core competencies…

According to mental speed theory of intelligence, the speed of information processing constitutes an important basis for cognitive abilities. However, the question, how mental speed relates to real world criteria, like school, academic, or job performance, is still unanswered. The aim of the study is to test an indirect speed-factor model in…

Investigated the validity of the Luria-Nebraska Intellectual Processes Scale (IPS) as a substitute for the Wechsler Adult Intelligence Scale (WAIS). IPS scores were correlated with the three WAIS IQs, and regression equations were computed to obtain estimated Verbal IQ, Performance IQ, and Full Scale IQ. (Author)

The relationships between processing speed, intelligence, and school achievement were analyzed on a sample of 184 Russian 16-year-old students. Two speeded tasks required the discrimination of simple geometrical shapes and the recognition of the presented meaningless figures. Raven's Advanced Progressive Matrices and the verbal subtests of…

Domain-specific contributions of working memory (WM), short-term memory (STM), and executive functioning (EF) to individual differences in intelligence were analysed using a latent variable approach. A sample of 345 participants completed a battery of 24 tests tapping the constructs of interests as comprehensively as possible. Visuospatial and verbal STM and WM tasks were administered along with three subcomponents of EF, namely inhibition, planning, and shifting. Intelligence was assessed by non-verbal/abstract/fluid intelligence (Gf) and verbal/crystallised intelligence (Gc) standardised tests. Structural equation modelling results show that EF is the main predictor of Gf, whereas verbal STM is the main predictor of Gc. Storage and processing providing different contributions to the prediction of Gf and Gc supports the view that both short-term storage and executive functioning account for the relationship between WM and intelligence. This main conclusion stresses the importance of acknowledging core cognitive constructs as being hierarchical systems with general and domain-specific mechanisms.

Nontechnical competencies identified as essential to the health professional's success include ethical behavior, interpersonal, self-management, leadership, business, and thinking competencies. The literature regarding such diverse topics, and the literature regarding "professional success" is extensive and wide-ranging, crossing educational, psychological, business, medical and vocational fields of study. This review is designed to introduce ways of viewing nontechnical competence from the psychology of human capacity to current perspectives, initiatives and needs in practice. After an introduction to the tensions inherent in educating individuals for both biomedical competency and "bedside" or "cageside" manner, the paper presents a brief overview of the major lines of inquiry into intelligence theory and how theories of multiple intelligences can build a foundation for conceptualizing professional and life skills. The discussion then moves from broad concepts of intelligence to more specific workplace skill sets, with an emphasis on professional medical education. This section introduces the research on noncognitive variables in various disciplines, the growing emphasis on competency based education, and the SKA movement in veterinary education. The next section presents the evidence that nontechnical, noncognitive or humanistic skills influence achievement in academic settings, medical education and clinical performance, as well as the challenges faced when educational priorities must be made.

An intelligent sensor model system has been built and used for automated control of the PMR-15 cure process in the autoclave. The system uses frequency-dependent electromagnetic sensing (FDEMS), the Loos processing model, and the Air Force QPAL intelligent software shell. The Loos model is used to predict and optimize the cure process including the time-temperature dependence of the extent of reaction, flow, and part consolidation. The FDEMS sensing system in turn monitors, in situ, the removal of solvent, changes in the viscosity, reaction advancement and cure completion in the mold continuously throughout the processing cycle. The sensor information is compared with the optimum processing conditions from the model. The QPAL composite cure control system allows comparison of the sensor monitoring with the model predictions to be broken down into a series of discrete steps and provides a language for making decisions on what to do next regarding time-temperature and pressure.

In this project, process analysis of Resin Transfer Molding (RTM) was carried out and adaptive process control models were developed. In addition, a...aforementioned work in three separate sections: (1) process analysis and adaptive control modeling, (2) manufacturing of a non-invasive sensor, and (3) a list of publications resulting from this project.

Impairments in emotional intelligence (EI) have been found in individuals with high general and social anxiety; however, no studies have examined this relationship in a clinically depressed population. Thirty-one patients (11 male, 20 female) with a DSM-IV-TR diagnosis of a major affective disorder and 28 non-clinical controls (5 male, 23 female) completed self-report instruments assessing EI, depression and social anxiety. Compared to a control group, the clinical group scored lower on the EI dimensions of Emotional Recognition and Expression, Understanding Emotions, Emotional Management, and Emotional Control. Regression analyses revealed Emotional Control was a significant predictor of interaction, performance, and generalised social anxiety. Self-report measures of EI may have predictive value in terms of early identification of those at risk of developing social anxiety and depression. The current study points to the potential value of conducting further studies of a prospective nature.

A scheduling system incorporating AI is described and applied to the automated processing of the Space Shuttle. The unique problem of addressing the temporal, resource, and orbiter-configuration requirements of shuttle processing is described with comparisons to traditional project management for manufacturing processes. The present scheduling system is developed to handle the late inputs and complex programs that characterize shuttle processing by incorporating fixed preemptive scheduling, constraint-based simulated annealing, and the characteristics of an 'anytime' algorithm. The Space-Shuttle processing environment is modeled with 500 activities broken down into 4000 subtasks and with 1600 temporal constraints, 8000 resource constraints, and 3900 state requirements. The algorithm is shown to scale to very large problems and maintain anytime characteristics suggesting that an automated scheduling process is achievable and potentially cost-effective.
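
A toy version of constraint-based simulated annealing for activity ordering is sketched below; the activities, precedence pairs, resource figures, and cost weights are invented to show the mechanism, not the shuttle-processing model itself.

```python
import math
import random

random.seed(0)

# Hypothetical mini-problem: order 6 processing activities to minimise a penalty
# combining precedence violations and resource over-subscription.
PRECEDENCE = [(0, 1), (1, 2), (3, 4)]        # (a, b): activity a must precede b
RESOURCE = [2, 1, 3, 2, 1, 2]                # crew demand per activity
CAPACITY = 4                                 # crews available across adjacent slots

def cost(order):
    pos = {a: i for i, a in enumerate(order)}
    violations = sum(1 for a, b in PRECEDENCE if pos[a] > pos[b])
    overload = sum(max(0, RESOURCE[order[i]] + RESOURCE[order[i + 1]] - CAPACITY)
                   for i in range(len(order) - 1))
    return 10 * violations + overload

def anneal(n=6, temp=5.0, cooling=0.995, steps=5000):
    order = list(range(n))
    best = order[:]
    for _ in range(steps):
        i, j = random.sample(range(n), 2)    # propose swapping two activities
        cand = order[:]
        cand[i], cand[j] = cand[j], cand[i]
        delta = cost(cand) - cost(order)
        if delta <= 0 or random.random() < math.exp(-delta / temp):
            order = cand
            if cost(order) < cost(best):
                best = order[:]
        temp *= cooling                      # cooling schedule
    return best, cost(best)

print(anneal())
```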

Although, in the course of the last 50 years, the achievements in the medical field have been astonishing, at the beginning of the third millennium a number of clinical pictures are still left without a precise nosographic origin. In the past, the delay in scientific communication was the main explanation presented for the lack of understanding of clinical pictures of unknown nosographic origin. The history of medicine provides excellent examples of this dispersion of human capital, even if the history of clinical neurology presents "exceptions" (the pictures that we now call de la Tourette's syndrome and Parkinson's disease) that indicate that major clinical syndromes could be clearly detected and relatively rapidly diffused even in the 19th century. Contrary to the past, the delay in scientific communication no longer seems an obstacle to the sharing of medical knowledge. Nevertheless, the problem of the in-depth comprehension of clinical pictures of unknown nosographic origin still remains dominant, mainly because of the limited spread of ample and flexible online accessible databases of unknown nosographic origin clinical syndromes. The need for interactive electronic archives and other artificial intelligence resources in order to promote progress in clinical knowledge is discussed in this paper.

Studied the interrelationships among general fluid intelligence, short-term memory capacity, working memory capacity, and processing speed in 120 young adults and used structural equation modeling to determine the best predictor of general fluid intelligence. Results suggest that working memory capacity, but not short-term memory capacity or…

To reduce mating time and improve the probability of successful mating, it is useful to coordinate the motion control and the mating process of an underwater vehicle. Because the mating process is difficult to control with conventional methods, hierarchical intelligent control is introduced. A timed fuzzy Petri net (TFPN), which integrates Petri nets with fuzzy reasoning, is used in the design of the coordination level of the hierarchical intelligent controller, allowing the control process to better reflect the time-driven, event-driven, and fuzzy characteristics of the task. Tests show that the TFPN approach can shorten mating time and improve efficiency.

Software quality assurance and planning for an astronomy project is a complex task, especially if it is a distributed collaborative project such as ALMA, where the development centers are spread across the globe. When you execute a software project, there is much valuable information about the process itself that you may be able to collect. One of the ways you can receive this input is via an issue tracking system that gathers the problem reports relating to software bugs captured during testing of the software, during the integration of the different components, or, even worse, problems occurring in production. Usually little time is spent analyzing them, but with some multidimensional processing you can extract valuable information that helps with long-term planning and resource allocation. We present an analysis of the information collected at ALMA from a collection of key unbiased indicators. We describe here the extraction, transformation and load process and how the data was processed. The main goal is to assess a software process and gain insights from this information.
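
A minimal example of the transform-and-aggregate step on problem-report data might look like the following pandas sketch; the column names and indicator choices are assumptions, not ALMA's actual schema.

```python
import pandas as pd

# Hypothetical extract of the issue tracker; column names are illustrative.
reports = pd.DataFrame({
    "subsystem": ["correlator", "control", "control", "pipeline", "correlator"],
    "phase": ["testing", "integration", "production", "testing", "production"],
    "severity": ["major", "minor", "critical", "minor", "major"],
    "days_to_close": [12, 3, 30, 5, 21],
})

# Transform/load step: one cube-style summary per subsystem and project phase.
summary = (reports
           .groupby(["subsystem", "phase"])
           .agg(tickets=("severity", "size"),
                critical=("severity", lambda s: (s == "critical").sum()),
                mean_days_to_close=("days_to_close", "mean")))
print(summary)
```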

The present system for automatic learning and evaluation of novel heuristic methods applicable to the mapping of communication-process sets onto a computer network is based on testing a population of competing heuristic methods within a fixed time constraint. The TEACHER 4.1 prototype learning system, implemented for learning new postgame analysis heuristic methods, iteratively generates and refines the mappings of a set of communicating processes on a computer network. A systematic exploration of the space of possible heuristic methods is shown to promise significant improvement.

There currently exists a critical need for tools to enhance the industrial competitiveness and agility of US industries involved in deformation processing of structural alloys. In response to this need, Sandia National Laboratories has embarked upon the QuikForm Initiative. The goal of this program is the development of computer-based tools to facilitate the design of deformation processing operations. The authors are currently focusing their efforts on the definition/development of a comprehensive system for the design of sheet metal stamping operations. The overall structure of the proposed QuikForm system is presented, and the focus of their thrust in each technical area is discussed.

To improve efficiency, reduce cost, and ensure quality, researchers in CNC machining have focused on virtual machine tools, cloud manufacturing, and wireless manufacturing. However, the low level of information sharing among different systems is a common disadvantage. In this paper, a machining database with a data-evaluation module is set up to ensure data integrity and timely updates. An online monitoring system based on the Internet of Things and multiple sensors "feels" a variety of signal features to "perceive" the state of the CNC machining process. A high-efficiency and green machining-parameter optimization system "executes" service-oriented manufacturing, intelligent manufacturing, and green manufacturing. The intelligent CNC machining system is applied in production. The CNC machining database effectively shares and manages process data among different systems. The prediction accuracy of the online monitoring system reaches 98.8% by acquiring acceleration and noise in real time. The high-efficiency and green machining-parameter optimization system optimizes the original processing parameters, and the calculations indicate that the optimized parameters not only improve production efficiency but also reduce carbon emissions. The application shows that the shared and service-oriented CNC machining system is reliable and effective. This research presents a shared and service-oriented CNC machining system for the intelligent manufacturing process.

A knowledge-based system (KBS) was designed for automated system identification, process monitoring, and diagnosis of sensor faults. The real-time KBS consists of a supervisory system using G2 KBS development software linked with external statistical modules for system identification and sensor fault diagnosis. The various statistical techniques were prototyped in MATLAB, converted to ANSI C code, and linked with the G2 Standard Interface. The KBS automatically performs all operations of data collection, identification, monitoring, and sensor fault diagnosis with little or no input from the user. Navigation throughout the KBS is via menu buttons on each user-accessible screen. Selected process variables are displayed on charts showing the history of the variables over a period of time. Multivariate statistical tests and contribution plots are also shown graphically. The KBS was evaluated using simulation studies with a polymerization reactor through a nonlinear dynamic model. Both normal operation conditions as well as conditions of process disturbances were observed to evaluate the KBS performance. Specific user-defined disturbances were added to the simulation, and the KBS correctly diagnosed both process and sensor faults when present.
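As a rough illustration of the multivariate statistical monitoring such a KBS can perform, the sketch below computes a Hotelling T² statistic for a new observation against a reference model of normal operation. The data and variables are synthetic stand-ins, not the paper's polymerization-reactor model.

```python
# Hedged sketch: Hotelling T^2 distance of a new sample from the normal
# operating region estimated from reference (fault-free) data.
import numpy as np

rng = np.random.default_rng(0)
reference = rng.normal(size=(200, 4))          # normal-operation training data
mean = reference.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(reference, rowvar=False))

def hotelling_t2(sample):
    """T^2 distance of one observation from the reference operating region."""
    d = sample - mean
    return float(d @ cov_inv @ d)

new_sample = np.array([0.1, -0.2, 3.5, 0.0])   # a disturbed observation
print(hotelling_t2(new_sample))                # large values flag a potential fault
```

In a monitoring context, values exceeding a control limit would trigger the diagnosis logic (for example, contribution plots) mentioned in the abstract.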

Raven's matrices and inspection time (IT) were recorded from 56 subjects under five arousal levels. Raven's and IT correlated strongly (r = -0.7) as predicted by processing-speed theories of "g." In line with Eysenck's [Eysenck, H. J. (1967). "The biological basis of personality". Springfield, IL: Thomas] arousal theory of extraversion, there was…

We describe a prototype system for interactive image processing using Prolog, implemented by the first author on an Apple Macintosh computer. This system is inspired by Prolog+, but differs from it in two particularly important respects. The first is that whereas Prolog+ assumes the availability of dedicated image processing hardware, with which the Prolog system communicates, our present system implements image processing functions in software using the C programming language. The second difference is that although our present system supports Prolog+ commands, these are implemented in terms of lower-level Prolog predicates which provide a more flexible approach to image manipulation. We discuss the impact of the Apple Macintosh operating system upon the implementation of the image-processing functions, and the interface between these functions and the Prolog system. We also explain how the Prolog+ commands have been implemented. The system described in this paper is a fairly early prototype, and we outline how we intend to develop the system, a task which is expedited by the extensible architecture we have implemented.

This article examines the interconnection between national intelligence, political institutions, and the mismanagement of public resources (deforestation). The paper examines the reasons for deforestation and investigates the factors accountable for it. The analysis builds on an author-compiled cross-national dataset covering 185 countries over a twenty-year period, from 1990 to 2010. We find, first, that a nation's intelligence significantly reduces the level of deforestation in a state. Moreover, national IQ seems to play an offsetting role in natural resource conservation (forest management) in countries with weak democratic institutions. The analysis also reveals a U-shaped relationship between democracy and deforestation. Intelligence sheds more light on this interconnection and explains the results. Our results are robust to various sample selection strategies and model specifications. The main implication of our study is that intelligence not only shapes formal rules and informal regulations such as social trust, norms and traditions, but also has the ability to reverse the paradoxical process known as the "resource curse." The study contributes to a better understanding of the reasons for deforestation and sheds light on the debated impact of political regime on forest management.

The 2003 Iraq prewar intelligence failure was not simply a case of the U.S. intelligence community providing flawed data to policy-makers. It also involved subversion of the competitive intelligence analysis process, where unofficial intelligence boutiques "stovepiped" misleading intelligence assessments directly to policy-makers and…

assisted resin transfer molding (VARTM) and Seemann Composite Resin Infusion Molding Process (SCRIMP™). All variations of RTM are suitable for the...numerical simulations have been used to design the vent and gate locations for molds used for RTM, VARTM and SCRIMP™ [2,3,7-14]. Regardless of the research...To prevent dry spot formation in RTM, a control interface and four different adaptive control algorithms were developed and tested with

Perception operates on an immense amount of incoming information that greatly exceeds the brain's processing capacity. Because of this fundamental limitation, the ability to suppress irrelevant information is a key determinant of perceptual efficiency. Here, I will review a series of studies investigating suppressive mechanisms in visual motion processing, namely perceptual suppression of large, background-like motions. These spatial suppression mechanisms are adaptive, operating only when sensory inputs are sufficiently robust to guarantee visibility. Converging correlational and causal evidence links these behavioral results with inhibitory center-surround mechanisms, namely those in cortical area MT. Spatial suppression is abnormally weak in several special populations, including the elderly and those with schizophrenia—a deficit that is evidenced by better-than-normal direction discriminations of large moving stimuli. Theoretical work shows that this abnormal weakening of spatial suppression should result in motion segregation deficits, but direct behavioral support of this hypothesis is lacking. Finally, I will argue that the ability to suppress information is a fundamental neural process that applies not only to perception but also to cognition in general. Supporting this argument, I will discuss recent research that shows individual differences in spatial suppression of motion signals strongly predict individual variations in IQ scores. PMID:26299386

Clinical report writing in psychology and psychiatry is addressed. Audience/use analysis and the basic procedures of information gathering, diagnosis, and prognosis are described. Two interlinking processes are involved: the process of creation and the process of communication. Techniques for good report writing are presented.

Intelligent sensor technology and systems are increasingly becoming attractive means to serve as frameworks for intelligent rocket test facilities with embedded intelligent sensor elements, distributed data acquisition elements, and onboard data acquisition elements. Networked intelligent processors enable users and systems integrators to automatically configure their measurement automation systems for analog sensors. NASA and leading sensor vendors are working together to apply the IEEE 1451 standard for adding plug-and-play capabilities for wireless analog transducers through the use of a Transducer Electronic Data Sheet (TEDS) in order to simplify sensor setup, use, and maintenance, to automatically obtain calibration data, and to eliminate manual data entry and error. A TEDS contains the critical information needed by an instrument or measurement system to identify, characterize, interface, and properly use the signal from an analog sensor. A TEDS is deployed for a sensor in one of two ways. First, the TEDS can reside in embedded, nonvolatile memory (typically flash memory) within the intelligent processor. Second, a virtual TEDS can exist as a separate file, downloadable from the Internet. This concept of virtual TEDS extends the benefits of the standardized TEDS to legacy sensors and applications where embedded memory is not available. An HTML-based user interface provides a visual tool for interacting with the distributed sensors that a TEDS is associated with, to automate the sensor management process. Implementing and deploying the IEEE 1451.1-based Network-Capable Application Process (NCAP) can support intelligent processing in Integrated Systems Health Management (ISHM) for the purpose of monitoring, detection of anomalies, diagnosis of causes of anomalies, prediction of future anomalies, mitigation to maintain operability, and integrated awareness of system health by the operator. It can also support local data collection and storage. This

Nearly 2 million Traumatic Brain Injuries (TBIs) occur in the U.S. each year, with societal costs approaching $60 billion. Including mild TBI and concussion, TBIs are prevalent in soldiers returning from Iraq and Afghanistan as well as in domestic athletes. Long-term risks of single and cumulative head impact dosage may present in the form of post-traumatic stress disorder (PTSD), depression, suicide, Chronic Traumatic Encephalopathy (CTE), dementia, Alzheimer's and Parkinson's diseases. Quantifying head impact dosage and understanding associated risk factors for the development of long-term sequelae is critical toward developing guidelines for TBI exposure and post-exposure management. The current knowledge gap between head impact exposure and clinical outcomes limits the understanding of underlying TBI mechanisms, including effective treatment protocols and prevention methods for soldiers and athletes. In order to begin addressing this knowledge gap, Cleveland Clinic is developing the "Intelligent Mouthguard" head impact dosimeter. Current testing indicates the Intelligent Mouthguard can quantify linear acceleration with 3% error and angular acceleration with 17% error during impacts ranging from 10 g to 174 g and 850 rad/s² to 10,000 rad/s², respectively. Correlation was high (R² > 0.99 and R² = 0.98, respectively). Near-term development will be geared toward quantifying head impact dosages in vitro and longitudinally in athletes, and toward testing new sensors for possible improved accuracy and reduced bias. Long term, the IMG may be useful for soldiers when paired with neurocognitive clinical data quantifying resultant TBI functional deficits.

adjudication; or b) revocation/denial of one’s security clearance/access. • Under the reciprocity policy of Intelligence Community Policy Guidance...issue of conditions, deviations, and waivers in the security clearance process. While guidelines may support security clearance/access revocation...OS&CI) reports denials and revocations on contractors with [DoD] equities directly to the [DoD CAF]. All other cases involving derogatory information

A maximum return of science and products with a minimum expenditure of time and resources is a major goal of mission payload integration. A critical component then, in successful mission payload integration is the acquisition and analysis of experiment requirements from the principal investigator and payload element developer teams. One effort to use artificial intelligence techniques to improve the acquisition and analysis of experiment requirements within the payload integration process is described.

In extreme-scale data processing systems, fault tolerance is an essential and indispensable part. Proactive fault tolerance schemes (such as speculative execution in the MapReduce framework) are introduced to dramatically improve the response time of job executions when failure becomes a norm rather than an exception. Efficient proactive fault tolerance schemes require precise knowledge of task executions, which has been an open challenge for decades. To address this issue, in this paper we design and implement RiskI, a profile-based prediction algorithm in conjunction with a risk-aware task assignment algorithm, to accelerate task executions while taking the uncertain nature of tasks into account. Our design demonstrates that this inherent uncertainty brings not only great challenges but also new opportunities: with a careful design, we can benefit from such uncertainty. We implemented the idea in Hadoop 0.21.0, and the experimental results show that, compared with the traditional LATE algorithm, the response time can be improved by 46% with the same system throughput. PMID:24883392
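The core intuition, preferring nodes whose predicted runtimes are both fast and predictable, can be sketched as below. The per-node runtime profiles and the mean-plus-deviation scoring rule are illustrative assumptions, not RiskI's actual algorithm.

```python
# Minimal risk-aware assignment sketch: predict a task's runtime on each node
# from similar historical executions, then pick the node minimising an
# expectation-plus-risk score so that uncertainty is taken into account.
import statistics

history = {  # hypothetical per-node runtimes (seconds) of similar past tasks
    "node-a": [40, 42, 39, 41],
    "node-b": [30, 55, 28, 60],   # faster on average but far less predictable
}

def risk_aware_score(samples, risk_weight=1.0):
    mean = statistics.mean(samples)
    std = statistics.stdev(samples)
    return mean + risk_weight * std        # penalise uncertain nodes

best = min(history, key=lambda n: risk_aware_score(history[n]))
print(best)   # -> "node-a" despite node-b's lower mean runtime
```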

Segmentation of regions of interest (ROIs), such as suspect lesions, is a preliminary but vital step for computer-aided breast cancer diagnosis, but the task is quite challenging due to image quality and the complicated phenomena usually involved with the ROIs. On the one hand, imaging makes it possible for physicians and clinicians to extract more information; on the other hand, efficient, robust, and accurate segmentation of such anatomical lesions is often a difficult and open task for researchers and technical development. As a counterbalance between automatic methods, which are usually highly application dependent, and manual approaches, which are too time consuming, live wire, which provides full user control during segmentation while minimizing user interaction, is a promising option for assisting in breast lesion segmentation in ultrasound (US) images. This work proposes a live-wire-based adjustment method to further extend its potential in computer-aided diagnosis (CAD) applications. It allows for local boundary adjustment, based on the live-wire paradigm, of a given segmentation, and can be attached as a post-processing step to the live wire method or other segmentation approaches.
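For readers unfamiliar with the live-wire paradigm, the sketch below shows its core mechanism: pixels form a graph whose edge costs come from image features, and the boundary segment between two user-selected seeds is the minimum-cost path found with Dijkstra's algorithm. The random cost map and 4-connectivity are simplifications for the sketch, not the paper's formulation.

```python
# Live-wire core: minimum-cost path between two seed pixels on a cost image.
import heapq
import numpy as np

def live_wire_path(cost, start, end):
    """cost: 2-D array of per-pixel costs; start/end: (row, col) seed points."""
    h, w = cost.shape
    dist = np.full((h, w), np.inf)
    prev = {}
    dist[start] = 0.0
    heap = [(0.0, start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == end:
            break
        if d > dist[r, c]:
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w:
                nd = d + cost[nr, nc]
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(heap, (nd, (nr, nc)))
    # Walk back from the end seed to recover the boundary segment.
    path, node = [end], end
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]

image_cost = np.random.default_rng(1).random((20, 20))  # stand-in for a gradient-based cost map
print(live_wire_path(image_cost, (0, 0), (19, 19))[:5])
```

In practice the cost map is derived from gradients and edge features so that the cheapest path snaps to the lesion boundary, which is what makes interactive local adjustment possible.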

Advances in imaging technology and computer science have greatly enhanced interpretation of medical images, and contributed to early diagnosis. The typical architecture of a Computer Aided Diagnosis (CAD) system includes image pre-processing, definition of region(s) of interest, feature extraction and selection, and classification. In this paper, the principles of CAD systems design and development are demonstrated by means of two examples. The first one focuses on the differentiation between symptomatic and asymptomatic carotid atheromatous plaques. For each plaque, a vector of texture and motion features was estimated, which was then reduced to the most robust ones by means of ANalysis of VAriance (ANOVA). Using fuzzy c-means, the features were then clustered into two classes. Clustering performances of 74%, 79%, and 84% were achieved for texture only, motion only, and combinations of texture and motion features, respectively. The second CAD system presented in this paper supports the diagnosis of focal liver lesions and is able to characterize liver tissue from Computed Tomography (CT) images as normal, hepatic cyst, hemangioma, and hepatocellular carcinoma. Five texture feature sets were extracted for each lesion, while a genetic algorithm based feature selection method was applied to identify the most robust features. The selected feature set was fed into an ensemble of neural network classifiers. The achieved classification performance was 100%, 93.75% and 90.63% in the training, validation and testing set, respectively. It is concluded that computerized analysis of medical images in combination with artificial intelligence can be used in clinical practice and may contribute to more efficient diagnosis.

The paper presents the hardware implementation and initial tests of a low-power, high-speed reconfigurable sensor fusion processor. The Extended Logic Intelligent Processing System (ELIPS) is described, which combines rule-based systems, fuzzy logic, and neural networks to achieve parallel fusion of sensor signals in compact, low-power VLSI. The ELIPS concept is being developed to demonstrate interceptor functionality, which particularly underlines the high-speed and low-power requirements. The hardware programmability allows the processor to be reconfigured into different machines, taking the most efficient hardware implementation during each phase of information processing. Processing speeds of microseconds have been demonstrated using our test hardware.

The present study examined relationships between emotional intelligence, measured by the Mayer-Salovey-Caruso Emotional Intelligence Test, and right-hemisphere dominance on a free-vision chimeric face test. A sample of 122 ethnically diverse college students participated and completed online versions of the aforementioned tests. A hierarchical regression was performed to test for the hypothesized interaction between gender and EI on the right-hemisphere bias score. No significant main effects were found for gender or total EI score. However, when entered into the model, the interaction term contributed an additional 4.5% of the variance in right-hemisphere dominance for the processing of facial emotions. Descriptively, greater EI in men was associated with higher right-hemisphere dominance on the free-vision test, while no association was observed for women.

The U.S. Navy has had an active Manufacturing Technology (MANTECH) Program aimed at developing advanced production processes and equipment since the late 1960s. During the past decade, however, the resources of the MANTECH program were concentrated in Centers of Excellence. Today, the Navy sponsors four manufacturing technology Centers of Excellence: the Automated Manufacturing Research Facility (AMRF); the Electronics Manufacturing Productivity Facility (EMPF); the National Center for Excellence in Metalworking Technology (NCEMT); and the Center of Excellence for Composites Manufacturing Technology (CECMT). This paper briefly describes each of the centers and summarizes typical Intelligent Equipment Processing (IEP) projects that were undertaken.

Intelligent behavior is a complex adaptive phenomenon that has evolved to enable organisms to deal with variable environmental circumstances. Maximizing fitness requires skill in foraging for necessary resources (food) in competitive circumstances and is probably the activity in which intelligent behavior is most easily seen. Biologists suggest that intelligence encompasses the characteristics of detailed sensory perception, information processing, learning, memory, choice, optimisation of resource sequestration with minimal outlay, self-recognition, and foresight by predictive modeling. All these properties are concerned with a capacity for problem solving in recurrent and novel situations. Here I review the evidence that individual plant species exhibit all of these intelligent behavioral capabilities but do so through phenotypic plasticity, not movement. Furthermore it is in the competitive foraging for resources that most of these intelligent attributes have been detected. Plants should therefore be regarded as prototypical intelligent organisms, a concept that has considerable consequences for investigations of whole plant communication, computation and signal transduction.

In this study, a novel intelligent factory-wide operation system for a continuous production process is designed to optimise the entire production process, which consists of multiple units; furthermore, this system is developed using process operational data to avoid the complexity of mathematical modelling of the continuous production process. The data-driven approach aims to specify the structure of the optimal operation system; in particular, the operational data of the process are used to formulate each part of the system. In this context, the domain knowledge of process engineers is utilised, and a closed-loop dynamic optimisation strategy, which combines feedback, performance prediction, feed-forward, and dynamic tuning schemes into a framework, is employed. The effectiveness of the proposed system has been verified using industrial experimental results.

Several binaural audio signal enhancement algorithms were evaluated with respect to their potential to improve speech intelligibility in noise for users of bilateral cochlear implants (CIs). 50% speech reception thresholds (SRT50) were assessed using an adaptive procedure in three distinct, realistic noise scenarios. All scenarios were highly nonstationary, complex, and included a significant amount of reverberation. Other aspects, such as the perfectly frontal target position, were idealized laboratory settings, allowing the algorithms to perform better than in corresponding real-world conditions. Eight bilaterally implanted CI users, wearing devices from three manufacturers, participated in the study. In all noise conditions, a substantial improvement in SRT50 compared to the unprocessed signal was observed for most of the algorithms tested, with the largest improvements generally provided by binaural minimum variance distortionless response (MVDR) beamforming algorithms. The largest overall improvement in speech intelligibility was achieved by an adaptive binaural MVDR in a spatially separated, single competing talker noise scenario. A no-pre-processing condition and adaptive differential microphones without a binaural link served as the two baseline conditions. SRT50 improvements provided by the binaural MVDR beamformers surpassed the performance of the adaptive differential microphones in most cases. Speech intelligibility improvements predicted by instrumental measures were shown to account for some but not all aspects of the perceptually obtained SRT50 improvements measured in bilaterally implanted CI users.
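As background on the class of beamformers that performed best, the sketch below computes the classic MVDR weights w = R⁻¹d / (dᴴR⁻¹d) for a toy microphone array. The array size, noise covariance estimate, and frontal steering vector are illustrative assumptions, not the evaluated algorithms' actual configurations.

```python
# Minimal MVDR beamformer sketch: distortionless response toward the target
# steering vector while minimising output noise power.
import numpy as np

def mvdr_weights(noise_cov, steering):
    r_inv_d = np.linalg.solve(noise_cov, steering)
    return r_inv_d / (steering.conj() @ r_inv_d)

mics = 4
rng = np.random.default_rng(0)
noise = rng.normal(size=(1000, mics)) + 1j * rng.normal(size=(1000, mics))
R = noise.conj().T @ noise / noise.shape[0]      # estimated noise covariance
d = np.ones(mics, dtype=complex)                  # frontal target: equal delays
w = mvdr_weights(R, d)
print(np.abs(w.conj() @ d))                       # distortionless constraint: |w^H d| = 1
```

The "adaptive" variants referred to in the abstract re-estimate R (and possibly d) as the acoustic scene changes, which is what allows them to track nonstationary noise.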

Previous research overwhelmingly suggests that feelings of ease people experience while processing information lead them to infer that their comprehension is high, whereas feelings of difficulty lead them to infer that their comprehension is low. However, the inferences people draw from their experiences of processing fluency should also vary in accordance with their naive theories about why new information might be easy or difficult to process. Five experiments that involved reading novel texts showed that participants who view intelligence as a fixed attribute, and who tend to interpret experiences of processing difficulty as an indication that they are reaching the limits of their ability, reported lower levels of comprehension as fluency decreased. In contrast, participants who view intelligence as a malleable attribute that develops through effort, and who do not tend to interpret experiences of processing difficulty as pertaining to some innate ability, did not report lower levels of comprehension as fluency decreased. In fact, when these participants were particularly likely to view effort as leading to increased mastery, decreases in fluency led them to report higher levels of comprehension.

Structural and incremental validity of the Wechsler Adult Intelligence Scale-Fourth Edition (WAIS-IV; Wechsler, 2008a) was examined with a sample of 300 individuals referred for evaluation at a university-based clinic. Confirmatory factor analysis indicated that the WAIS-IV structure was best represented by 4 first-order factors as well as a general intelligence factor in a direct hierarchical model. The general intelligence factor accounted for the most common and total variance among the subtests. Incremental validity analyses indicated that the Full Scale IQ (FSIQ) generally accounted for medium to large portions of academic achievement variance. For all measures of academic achievement, the first-order factors combined accounted for significant achievement variance beyond that accounted for by the FSIQ, but individual factor index scores contributed trivial amounts of achievement variance. Implications for interpreting WAIS-IV results are discussed.

Contemporary creativity research views intelligence and creativity as essentially unrelated abilities, and many studies have found only modest correlations between them. The present research, based on improved approaches to creativity assessment and latent variable modeling, proposes that fluid and executive cognition is in fact central to…

Clinical outcome prediction, which has strong implications for health service delivery within clinical treatment processes (CTPs), is important for both patients and healthcare providers. Prior studies typically use a priori knowledge, such as demographics or patient physical factors, to estimate clinical outcomes at early stages of CTPs (e.g., admission). They lack the ability to deal with the temporal evolution of CTPs. In addition, most existing studies employ data mining or machine learning methods to generate a prediction model for a specific type of clinical outcome; a mathematical model that predicts multiple clinical outcomes simultaneously has not yet been established. In this study, a hybrid approach is proposed to provide a continuous predictive monitoring service for multiple clinical outcomes. More specifically, a probabilistic topic model is applied to discover underlying treatment patterns of CTPs from electronic medical records. Then the learned treatment patterns, as low-dimensional features of CTPs, are exploited for clinical outcome prediction across various stages of CTPs based on multi-label classification. The proposal is evaluated on the prediction of three typical classes of clinical outcomes, i.e., length of stay, readmission time, and type of discharge, using 3492 patients' medical records from the unstable angina CTP, extracted from a Chinese hospital. The stable model achieved 84.9% accuracy and 6.4% Hamming loss with 3 latent treatment patterns discovered from the data, which outperforms the benchmark multi-label classification algorithms for clinical outcome prediction. Our study indicates that the proposed approach can potentially improve the quality of clinical outcome prediction and assist physicians in understanding patient conditions, treatment interventions, and clinical outcomes in an integrated view.

The results presented in this article demonstrate the potential of artificial intelligence tools for predicting the endpoint of the granulation process in high-speed mixer granulators of different scales, from 25 L to 600 L. The combination of neurofuzzy logic and gene expression programming technologies allowed the modeling of the impeller power as a function of operating conditions and wet granule properties, establishing the critical variables that affect the response and obtaining a unique experimental polynomial equation (transparent model) of high predictability (R² > 86.78%) for all equipment sizes. Gene expression programming allowed the modeling of the granulation process for granulators of similar and dissimilar geometries and can be improved by incorporating additional characteristics of the process, such as composition variables or operating parameters (e.g., batch size, chopper speed). The principles and the methodology proposed here can be applied to understand and control manufacturing processes using any other granulation equipment, including continuous granulation processes.

This paper presents the design and implementation of an intelligent data processing system for a wireless sensor node for healthcare applications. The data processing system comprises front-end sensors and a data acquisition (DAQ) system for signal processing. A smart capability has been developed so that the system automatically selects the optimum method to 'condition' the biosignals, depending on the input channel requirements, for better system accuracy. Moreover, it correspondingly selects an optimal sampling speed for each input channel to reduce the system's power consumption, data storage, and cost. Results show that a 47% reduction in power consumption is achieved and the aliasing error is reduced by 31% when the smart data processing architecture is used instead of a traditional fixed-rate data processing system.
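One simple way to think about the per-channel rate selection is sketched below: each channel's sampling rate is derived from its signal bandwidth (Nyquist rate plus a safety margin) rather than running every channel at one fixed rate. The bandwidth figures and margin are illustrative assumptions, not values from the paper.

```python
# Hedged sketch: choose a per-channel sampling rate from the channel's bandwidth.
channel_bandwidth_hz = {"ECG": 150.0, "respiration": 10.0, "temperature": 1.0}

def select_sampling_rate(bandwidth_hz, margin=2.5):
    """Return a sampling rate comfortably above the Nyquist rate (2 x bandwidth)."""
    return margin * 2.0 * bandwidth_hz

for name, bw in channel_bandwidth_hz.items():
    print(f"{name}: {select_sampling_rate(bw):.0f} Hz")
```

Slow channels sampled at hundreds of hertz waste power and storage, which is why rate selection of this kind translates directly into the reported energy savings.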

Optimization of process planning is considered the key technology for computer-aided process planning, which is a rather complex and difficult procedure. A good process plan for a part is built up from two elements: (1) the optimized sequence of the operations of the part; and (2) the optimized selection of the machine, cutting tool and Tool Access Direction (TAD) for each operation. In the present work, the process planning is divided into preliminary planning and secondary/detailed planning. In the preliminary stage, based on the analysis of order and clustering constraints as a compulsory constraint aggregation in operation sequencing, and using an intelligent searching strategy, the feasible sequences are generated. Then, in the detailed planning stage, using a genetic algorithm that prunes the initial feasible sequences, the optimized operation sequence and the optimized selection of the machine, cutting tool and TAD for each operation are obtained, based on optimization constraints as an additive constraint aggregation. The main contribution of this work is the simultaneous optimization of the operation sequence and of the machine, cutting tool and TAD selection for each operation, using the intelligent search and the genetic algorithm. PMID:21845020
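A toy version of the detailed-planning stage's genetic search is sketched below: it evolves operation orders under hypothetical precedence and tool-change costs. The operations, constraints, penalty weights, and GA settings are invented for illustration and do not reproduce the paper's encoding.

```python
# Permutation GA sketch for operation sequencing with precedence constraints.
import random

ops = ["drill", "rough_mill", "finish_mill", "tap"]
precedence = [("drill", "tap"), ("rough_mill", "finish_mill")]  # (before, after)
tool = {"drill": "T1", "tap": "T2", "rough_mill": "T3", "finish_mill": "T3"}

def cost(seq):
    violations = sum(seq.index(a) > seq.index(b) for a, b in precedence)
    tool_changes = sum(tool[seq[i]] != tool[seq[i + 1]] for i in range(len(seq) - 1))
    return 100 * violations + tool_changes      # heavy penalty for infeasibility

def crossover(p1, p2):
    cut = random.randint(1, len(p1) - 1)
    head = p1[:cut]
    return head + [g for g in p2 if g not in head]   # simplified order crossover

random.seed(0)
population = [random.sample(ops, len(ops)) for _ in range(20)]
for _ in range(50):
    population.sort(key=cost)
    parents = population[:10]
    population = parents + [crossover(random.choice(parents), random.choice(parents))
                            for _ in range(10)]
best = min(population, key=cost)
print(best, cost(best))
```

In the paper's setting the fitness would additionally score machine and TAD choices; the sketch keeps only the sequencing part to stay short.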

Nanotechnology is the science and engineering of manipulating matter at the nanoscale, which can be used to create many new materials and devices with a vast range of applications. As nanotech products increasingly enter the commercial marketplace, nanometrology becomes a stringent and enabling technology for the manipulation and quality control of nanotechnology. However, many measuring instruments, for instance scanning probe microscopes, are limited to relatively small areas of hundreds of micrometers with very low efficiency. Therefore intelligent sampling strategies are required to improve the scanning efficiency when measuring large areas. This paper presents a Gaussian process based intelligent sampling method to address this problem. The method uses Gaussian process based Bayesian regression as a mathematical foundation to represent the surface geometry, and the posterior estimation of the Gaussian process is computed by combining the prior probability distribution with the maximum likelihood function. Each sampling point is then adaptively selected by determining the position that is most likely to lie outside the required tolerance zone among the candidates, and is inserted to update the model iteratively. Simulations on both nominal and manufactured nano-structured surfaces have been conducted to verify the validity of the proposed method. The results imply that the proposed method significantly improves the measurement efficiency when measuring large-area structured surfaces.
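A minimal sketch of the Gaussian-process-driven sampling loop described above: fit a GP to the points measured so far, then measure next where the posterior is most likely to fall outside a tolerance band. A 1-D profile stands in for the real 2-D structured surface; the kernel, tolerance, and scoring rule are assumptions for the sketch, not the paper's exact criterion.

```python
# GP-based adaptive sampling: measure next where the surface most plausibly
# violates the tolerance zone according to the current posterior.
import numpy as np

def rbf(a, b, length=0.1, var=1.0):
    return var * np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)

def gp_posterior(x_train, y_train, x_query, noise=1e-6):
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_train, x_query)
    Kss = rbf(x_query, x_query)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks.T @ alpha
    cov = Kss - Ks.T @ np.linalg.solve(K, Ks)
    return mean, np.sqrt(np.clip(np.diag(cov), 0, None))

surface = lambda x: np.sin(12 * x) * 0.05            # stand-in for the true profile
tolerance = 0.04                                      # half-width of the tolerance zone
x_grid = np.linspace(0, 1, 200)
sampled_x = np.array([0.0, 0.5, 1.0])                 # initial coarse samples

for _ in range(10):
    mean, std = gp_posterior(sampled_x, surface(sampled_x), x_grid)
    # Larger score = more plausibly outside the tolerance zone.
    score = np.abs(mean) + 2 * std - tolerance
    score[np.isin(x_grid, sampled_x)] = -np.inf       # do not re-measure a point
    next_x = x_grid[np.argmax(score)]
    sampled_x = np.append(sampled_x, next_x)

print(np.sort(sampled_x))
```

The printed sample locations cluster around the regions where the profile approaches the tolerance limit, which is the behaviour that saves measurement time on large surfaces.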

In an adaptive listening test, the bandwidth of speech in complementary notched noise was varied. The bandwidth (center frequency 1 kHz) required for 50% speech intelligibility is called Speech Reception Bandwidth Threshold (SRBT). The SRBT was measured for 10 normal-hearing and 30 hearing-impaired listeners. The average SRBT of the normal-hearing listeners is 1.4 octave. The performance of seven hearing-impaired listeners is considered normal, whereas 23 hearing-impaired listeners have a wider-than-normal SRBT. The SRBT of a hearing-impaired listener may be wider than normal, due to inaudibility of a part of the speech band, or to an impairment in the processing of speech. The Speech Intelligibility Index (SII) is used to separate these effects. The SII may be regarded as the proportion of the total speech information that is available to the listener. Each individual SRBT is converted to an SII value. For the normal-hearing listeners, the SII is about 0.3. For 21 hearing-impaired listeners, the SII is higher. This points to a speech-processing impairment in the 1-kHz frequency region. The deviation of an individual SII value from 0.3 can be used to "quantify" the degree of processing impairment.

Recent earthquakes have caused severe damage to existing buildings. Motivated by this, an important amount of research work has been conducted to determine the seismic risk of seismically active regions. For an accurate seismic risk assessment, processing of ground motions provides an advantage. Using current technology, it is not possible to precisely predict future earthquakes. Therefore, most current seismic risk assessment methodologies are based on statistical evaluation using the recurrence and magnitude of the earthquakes that hit the specified region. Because of the limited number of earthquake records, the quality of the definitions is questionable. A fuzzy logic algorithm can be used to improve the quality of the definition. In the present study, the ground motion data profile of western Turkey is defined using intelligent hybrid processing. The approach is presented in a practical way for easier and faster calculation. Earthquake data between 1970 and 1999 from the western part of Turkey have been used for training. The results are tested and validated with the earthquake data between 2000 and 2015 from the same region. Adequate agreement between the calculated values and the earthquake data was obtained using the intelligent hybrid processing.

Background Research on cognitive control suggests an age-related decline in proactive control abilities whereas reactive control seems to remain intact. However, the reason of the differential age effect on cognitive control efficiency is still unclear. This study investigated the potential influence of fluid intelligence and processing speed on the selective age-related decline in proactive control. Eighty young and 80 healthy older adults were included in this study. The participants were submitted to a working memory recognition paradigm, assessing proactive and reactive cognitive control by manipulating the interference level across items. Results Repeated measures ANOVAs and hierarchical linear regressions indicated that the ability to appropriately use cognitive control processes during aging seems to be at least partially affected by the amount of available cognitive resources (assessed by fluid intelligence and processing speed abilities). Conclusions This study highlights the potential role of cognitive resources on the selective age-related decline in proactive control, suggesting the importance of a more exhaustive approach considering the confounding variables during cognitive control assessment. PMID:24401034

The development of new processing technologies for the production, fabrication, and application of advanced materials proceeds through several complementary dimensions. The advanced materials dimension includes basic research on materials synthesis, composition, and properties; materials processing research; engineering characterization and materials applications; and product and process engineering. The health and environmental dimension includes identification of potential health and environmental constraints; characterization of candidate processes for waste and effluent quality; process optimization for both economic and environmental benefit; and development of control strategies to deal with health and environmental problems that cannot be solved through process modification. The intelligent processing dimension includes application of available sensors and the development of new diagnostics for real-time process measurements; development of control strategies and expert systems to use these process measurements for real-time process control; and development of capabilities to optimize working processes in real-time for both product quality and environmental acceptability. This paper discusses these issues in the context of the Laboratory's efforts to develop technologies based on the processing of the new high-temperature superconducting ceramic oxides.

Mammography is the most effective method for breast cancer screening available today. However, the low positive predictive value of breast biopsy resulting from mammogram interpretation leads to approximately 70% unnecessary biopsies with benign outcomes. To reduce the high number of unnecessary breast biopsies, several computer-aided diagnosis (CAD) systems have been proposed in the last several years. These systems help physicians in their decision to perform a breast biopsy on a suspicious lesion seen in a mammogram or to perform a short-term follow-up examination instead. We present two novel CAD approaches that both emphasize an intelligible decision process to predict breast biopsy outcomes from BI-RADS findings. An intelligible reasoning process is an important requirement for the acceptance of CAD systems by physicians. The first approach induces a global model based on decision-tree learning. The second approach is based on case-based reasoning and applies an entropic similarity measure. We have evaluated the performance of both CAD approaches on two large publicly available mammography reference databases using receiver operating characteristic (ROC) analysis, bootstrap sampling, and the ANOVA statistical significance test. Both approaches outperform the diagnosis decisions of the physicians. Hence, both systems have the potential to reduce the number of unnecessary breast biopsies in clinical practice. A comparison of the performance of the proposed decision-tree and CBR approaches with a state-of-the-art approach based on artificial neural networks (ANN) shows that the CBR approach performs slightly better than the ANN approach, which in turn results in slightly better performance than the decision-tree approach. The differences are statistically significant (p value < 0.001). On 2100 masses extracted from the DDSM database, the CBR approach, for example, resulted in an area under the ROC curve of A(z) = 0.89 ± 0.01, and the decision-tree approach in A(z) = 0

Global change study is an interdisciplinary and comprehensive research activity with international cooperation, arising in the 1980s, with the largest scope. The interaction between land use and cover change, as a research field at the crossing of natural science and social science, has become one of the core subjects of global change study as well as its front edge and hot point. It is necessary to develop research on land use and cover change in the urbanization process and to build an analog model of urbanization in order to describe, simulate and analyze the dynamic behaviors of urban development change as well as to understand the basic characteristics and rules of the urbanization process. This has positive practical and theoretical significance for formulating urban and regional sustainable development strategies. The effect of urbanization on land use and cover change is mainly embodied in the change of the quantity structure and space structure of urban space, and LUCC modeling in the urbanization process has been an important research subject of urban geography and urban planning. In this paper, based upon previous research achievements, the author systematically analyzes the research on land use/cover change in the urbanization process with the theories of complexity science and intelligent computation; builds a model for simulating and forecasting the dynamic evolution of urban land use and cover change, on the basis of the cellular automaton model of complexity science and multi-agent theory; expands the Markov model, the traditional CA model and the Agent model, introducing complexity science theory and intelligent computation theory into the LUCC research model to build an intelligent-computation-based LUCC model for analog research on land use and cover change in urbanization research; and performs case research. The concrete contents are as follows: 1. Complexity of LUCC research in the urbanization process. Analyze the urbanization process in combination with the contents
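A bare-bones illustration of the cellular-automaton ingredient of such a LUCC model is given below: a grid of land-use states in which a non-urban cell converts to urban with a probability that grows with the number of urban neighbours. The grid size and transition parameters are illustrative assumptions, not calibrated values from the study.

```python
# Toy urban-growth cellular automaton on a land-use grid.
import numpy as np

rng = np.random.default_rng(0)
URBAN, NON_URBAN = 1, 0
grid = np.zeros((50, 50), dtype=int)
grid[24:26, 24:26] = URBAN                      # initial urban seed

def step(grid, base_prob=0.02, neighbour_gain=0.08):
    padded = np.pad(grid, 1)
    # Count urban neighbours in the 3x3 Moore neighbourhood (excluding the cell itself).
    neighbours = sum(padded[1 + dr:51 + dr, 1 + dc:51 + dc]
                     for dr in (-1, 0, 1) for dc in (-1, 0, 1)) - grid
    p_convert = base_prob + neighbour_gain * neighbours
    converts = (grid == NON_URBAN) & (rng.random(grid.shape) < p_convert)
    return np.where(converts, URBAN, grid)

for year in range(20):
    grid = step(grid)
print("urban cells after 20 steps:", int(grid.sum()))
```

Multi-agent extensions of the kind discussed in the abstract would replace the fixed transition probabilities with decisions made by agents (households, developers, planners) interacting on the same grid.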

Emotional intelligence is increasingly recognized as being important for professional career success. Skills related to emotional intelligence (e.g. organizational commitment, public speaking, teamwork, and leadership) are considered essential. Human resource professionals have begun including tests of emotional intelligence (EI) in job applicant…

In this article, we revise and try to resolve some of the problems inherent in questionnaire screening of sleep apnea cases and apnea diagnosis based on attributes which are relevant and reliable. We present a way of learning information about the relevance of the data, comparing this with the definition of the information by the medical expert. We generate a predictive data model using a data aggregation operator which takes relevance and reliability information about the data into account to produce a diagnosis for each case. We also introduce a grade of membership for each question response which allows the patient to indicate a level of confidence or doubt in their own judgement. The method is tested with data collected from patients in a Sleep Clinic using questionnaires specially designed for the study. Other artificial intelligence predictive modeling algorithms are also tested on the same data and their predictive accuracy compared to that of the aggregation operator.

We propose to construct an intelligent system for clinical guidance on how to effectively use power wheelchair tilt and recline functions. The motivations fall into the following two aspects. (1) People with spinal cord injury (SCI) are vulnerable to pressure ulcers. SCI can lead to structural and functional changes below the injury level that may predispose individuals to tissue breakdown. As a result, pressure ulcers can significantly affect the quality of life, including pain, infection, altered body image, and even mortality. (2) Clinically, wheelchair power seat function, i.e., tilt and recline, is recommended for relieving sitting-induced pressures. The goal is to increase skin blood flow for the ischemic soft tissues to avoid irreversible damage. Due to variations in the level and completeness of SCI, the effectiveness of using wheelchair tilt and recline to reduce pressure ulcer risks has considerable room for improvement. Our previous study indicated that the blood flow of people with SCI may respond very differently to wheelchair tilt and recline settings. In this study, we propose to use the artificial neural network (ANN) to predict how wheelchair power seat functions affect blood flow response to seating pressure. This is regression learning because the predicted outputs are numerical values. Besides the challenging nature of regression learning, ANN may suffer from the overfitting problem which, when occurring, leads to poor predictive quality (i.e., cannot generalize). We propose using the particle swarm optimization (PSO) algorithm to train ANN to mitigate the impact of overfitting so that ANN can make correct predictions on both existing and new data. Experimental results show that the proposed approach is promising to improve ANN's predictive quality for new data.
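A bare-bones illustration of the PSO-trained network idea (not the study's actual model): each particle encodes the flattened weights of a tiny one-hidden-layer regression network, and the swarm searches weight space directly instead of using gradient descent. The synthetic data, network size, and PSO constants are assumptions for the sketch.

```python
# PSO training of a small regression network (global-best particle swarm).
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(80, 3))                 # stand-in seating/tilt features
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] - 0.2 * X[:, 2]  # stand-in blood-flow response
X_tr, y_tr, X_val, y_val = X[:60], y[:60], X[60:], y[60:]

HIDDEN = 5
N_WEIGHTS = 3 * HIDDEN + HIDDEN + HIDDEN + 1         # W1, b1, W2, b2

def predict(w, X):
    W1, b1 = w[:15].reshape(3, HIDDEN), w[15:20]
    W2, b2 = w[20:25], w[25]
    return np.tanh(X @ W1 + b1) @ W2 + b2

def mse(w, X, y):
    return float(np.mean((predict(w, X) - y) ** 2))

n_particles, iterations = 30, 200
pos = rng.normal(0.0, 0.5, size=(n_particles, N_WEIGHTS))
vel = np.zeros_like(pos)
pbest, pbest_err = pos.copy(), np.array([mse(p, X_tr, y_tr) for p in pos])
gbest = pbest[np.argmin(pbest_err)].copy()

for _ in range(iterations):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    errs = np.array([mse(p, X_tr, y_tr) for p in pos])
    improved = errs < pbest_err
    pbest[improved], pbest_err[improved] = pos[improved], errs[improved]
    gbest = pbest[np.argmin(pbest_err)].copy()

print("train MSE:", mse(gbest, X_tr, y_tr), "validation MSE:", mse(gbest, X_val, y_val))
```

Comparing training and validation error of the best particle is one simple way to watch for the overfitting problem the abstract aims to mitigate.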

The Defense Waste Processing Facility (DWPF) at the Savannah River Plant (SRP) is currently under construction and, when completed, will process high-level radioactive waste into a borosilicate glass wasteform. This facility will consist of numerous batch chemical processing steps as well as the continuous operation of a joule-heated melter and its off-gas treatment system. A real-time process advisor system based on Artificial Intelligence (AI) techniques has been developed and is currently in use at the semiworks facility, which is operating a 2/3-scale version of the DWPF joule-heated melter. The melter advisor system interfaces with the existing data collection and control system and monitors current operations of this facility. The advisor then provides advice to operators and engineers when it identifies process problems. The current system is capable of identifying process problems such as feed system pluggages and thermocouple failures and providing recommended actions. The system also provides facilities normally associated with distributed control systems. These include the ability to display process flowsheets, monitor alarm conditions, and check the status of process interlocks. 7 figs.

The relationship between speed of information processing and Intelligence Quotient (IQ) was examined in 62 college students using timed paper-and-pencil substitution tests to measure processing speed. A psychometrically better IQ test showed a strong linear relationship between mean time to code and its correlation with IQ; this relationship was…

The relationship between human fluid intelligence and social-emotional abilities has been a topic of considerable interest. The current study investigated whether adolescents with different intellectual levels had different automatic neural processing of facial expressions. Two groups of adolescent males were enrolled: a high IQ group and an average IQ group. Age and parental socioeconomic status were matched between the two groups. Participants counted the numbers of the central cross changes while paired facial expressions were presented bilaterally in an oddball paradigm. There were two experimental conditions: a happy condition, in which neutral expressions were standard stimuli (p = 0.8) and happy expressions were deviant stimuli (p = 0.2), and a fearful condition, in which neutral expressions were standard stimuli (p = 0.8) and fearful expressions were deviant stimuli (p = 0.2). Participants were required to concentrate on the primary task of counting the central cross changes and to ignore the expressions to ensure that facial expression processing was automatic. Event-related potentials (ERPs) were obtained during the tasks. The visual mismatch negativity (vMMN) components were analyzed to index the automatic neural processing of facial expressions. For the early vMMN (50–130 ms), the high IQ group showed more negative vMMN amplitudes than the average IQ group in the happy condition. For the late vMMN (320–450 ms), the high IQ group had greater vMMN responses than the average IQ group over frontal and occipito-temporal areas in the fearful condition, and the average IQ group evoked larger vMMN amplitudes than the high IQ group over occipito-temporal areas in the happy condition. The present study elucidated the close relationships between fluid intelligence and pre-attentive change detection on social-emotional information. PMID:26375031

Practical disassembly process planning is extremely important for efficient material recycling and component reuse. The research on process planning in the literature focuses on the generation of optimal sequences based on predictive information about products. Used products, unfortunately, exhibit high uncertainty, since products may experience very different conditions during their use stage. The indeterminate characteristics associated with used products often make a predetermined plan unrealistic. The disassembly process has to be decided dynamically, adaptive to each product's specific status. To deal with uncertainty in a dynamic decision-making process, this paper presents a fuzzy reasoning Petri net (FRPN) model to represent the related decision-making rules in the disassembly process. Using the proposed fuzzy reasoning algorithm based on the FRPN model, multicriterion disassembly rules can be evaluated in parallel to make the decision automatically and quickly. Instead of producing the disassembly sequences before disassembling a whole product, the proposed method makes intelligent decisions based on the dynamically updated status of the components in the product at each disassembly step. It is therefore adaptive to changes that arise during the process. Finally, an example is used to illustrate the application of the proposed methodology.
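A hedged sketch of a single fuzzy-reasoning step of the kind an FRPN performs: places hold truth degrees of propositions about a component's observed condition, each transition is a rule with a certainty factor, and firing takes the minimum over the input places times the certainty, propagated with a maximum into the output place. The propositions, rules, and certainty values are illustrative assumptions, not the paper's rule base.

```python
# One max-min fuzzy reasoning step over simple (inputs -> output) rules.
places = {                       # truth degrees observed for one used product
    "fasteners_corroded": 0.8,
    "housing_intact": 0.7,
    "part_reusable": 0.0,        # goal propositions start at 0
    "part_recyclable": 0.0,
}

# Each rule: (input places, output place, certainty factor).
rules = [
    (["fasteners_corroded"], "part_recyclable", 0.9),
    (["housing_intact"], "part_reusable", 0.8),
]

def fire(places, rules):
    updated = dict(places)
    for inputs, output, certainty in rules:
        degree = min(places[p] for p in inputs) * certainty
        updated[output] = max(updated[output], degree)
    return updated

print(fire(places, rules))
# -> recycling supported with degree 0.72, reuse with degree 0.56
```

Repeating such firing steps as the product's observed status is updated is what lets the plan adapt at each disassembly step rather than being fixed in advance.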

Through Natural Language Processing (NLP) techniques, information can be extracted from clinical narratives for a variety of applications (e.g., patient management). While the complex and nested output of NLP systems can be expressed in standard formats, such as the eXtensible Markup Language (XML), these representations may not be directly suitable for certain end-users or applications. The availability of a 'tabular' format that simplifies the content and structure of NLP output may facilitate the dissemination and use by users who are more familiar with common spreadsheet, database, or statistical tools. In this paper, we describe the knowledge-based design of a tabular representation for NLP output and development of a transformation program for the structured output of MedLEE, an NLP system at our institution. Through an evaluation, we found that the simplified tabular format is comparable to existing more complex NLP formats in effectiveness for identifying clinical conditions in narrative reports.
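The sketch below shows the general flavour of such an XML-to-tabular flattening. The element and attribute names are invented for illustration and are not the actual MedLEE output schema.

```python
# Flatten a nested XML report into one CSV row per extracted finding.
import csv
import io
import xml.etree.ElementTree as ET

xml_output = """
<report id="123">
  <finding concept="pneumonia" certainty="high" status="present"/>
  <finding concept="pleural effusion" certainty="moderate" status="absent"/>
</report>
"""

root = ET.fromstring(xml_output)
rows = [{"report_id": root.get("id"), **finding.attrib}
        for finding in root.iter("finding")]

buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=["report_id", "concept", "certainty", "status"])
writer.writeheader()
writer.writerows(rows)
print(buffer.getvalue())
```

The design trade-off is the one the abstract names: some nesting is lost, but the result loads directly into spreadsheets and statistical tools.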

A model for predicting the intelligibility of processed noisy speech is proposed. The speech-based envelope power spectrum model has a similar structure as the model of Ewert and Dau [(2000). J. Acoust. Soc. Am. 108, 1181-1196], developed to account for modulation detection and masking data. The model estimates the speech-to-noise envelope power ratio, SNR(env), at the output of a modulation filterbank and relates this metric to speech intelligibility using the concept of an ideal observer. Predictions were compared to data on the intelligibility of speech presented in stationary speech-shaped noise. The model was further tested in conditions with noisy speech subjected to reverberation and spectral subtraction. Good agreement between predictions and data was found in all cases. For spectral subtraction, an analysis of the model's internal representation of the stimuli revealed that the predicted decrease of intelligibility was caused by the estimated noise envelope power exceeding that of the speech. The classical concept of the speech transmission index fails in this condition. The results strongly suggest that the signal-to-noise ratio at the output of a modulation frequency selective process provides a key measure of speech intelligibility.

Future NASA earth science missions, including the Earth Observing System (EOS), will be generating vast amounts of data that must be processed and stored at various locations around the world. Here we present a stepwise-refinement of the intelligent database management (IDM) of the distributed active archive center (DAAC - one of seven regionally-located EOSDIS archive sites) architecture, to showcase the telecommunications issues involved. We develop this architecture into a general overall design. We show that the current evolution of protocols is sufficient to support IDM at Gbps rates over large distances. We also show that network design can accommodate a flexible data ingestion storage pipeline and a user extraction and visualization engine, without interference between the two.

This survey study of 176 participants from eight customer service organizations investigated how individual factors moderate the impact of emotional labor strategies on employee well-being. Hierarchical regression analyses indicated that gender and autonomy were significant moderators of the relationships between emotional labor strategies and the personal outcomes of emotional exhaustion, affective well-being, and job satisfaction. Females were more likely to experience negative consequences when engaging in surface acting. Autonomy served to alleviate negative outcomes for individuals who used emotional labor strategies often. Contrary to our hypotheses, emotional intelligence did not moderate the relationship between the emotional labor strategies and personal outcomes. Results demonstrated how the emotional labor process can influence employee well-being.

The intelligent processing equipment (IPE) research and development (R&D) programs of the Department of Commerce are carried out within the National Institute of Standards and Technology (NIST). This institute has had work in support of industrial productivity as part of its mission since its founding in 1901. With the advent of factory automation these efforts have increasingly turned to R&D in IPE. The Manufacturing Engineering Laboratory (MEL) of NIST devotes a major fraction of its efforts to this end while other elements within the organization, notably the Material Science and Engineering Laboratory, have smaller but significant programs. An inventory of all such programs at NIST and a representative selection of projects that at least demonstrate the scope of the efforts are presented.

Although emotion and cognition were considered to be separate aspects of the psyche in the past, researchers today have demonstrated the existence of an interplay between the two processes. Emotional intelligence (EI), or the ability to perceive, use, understand, and regulate emotions, is a relatively young concept that attempts to connect both emotion and cognition. While EI has been demonstrated to be positively related to well-being, mental and physical health, and non-aggressive behaviors, little is known about its underlying cognitive processes. The aim of the present study was to systematically review available evidence about the relationship between EI and cognitive processes as measured through "cool" (i.e., not emotionally laden) and "hot" (i.e., emotionally laden) laboratory tasks. We searched Scopus and Medline to find relevant articles in Spanish and English, and divided the studies according to two variables: cognitive processes (hot vs. cool) and EI instruments used (performance-based ability test, self-report ability test, and self-report mixed test). We identified 26 eligible studies. The results provide a fair amount of evidence that performance-based ability EI (but not self-report EI tests) is positively related to efficiency in hot cognitive tasks. EI, however, does not appear to be related to cool cognitive tasks, whether measured with self-report or performance-based ability instruments. These findings suggest that performance-based ability EI could improve individuals' emotional information processing abilities. PMID:27303277

The objective of the proposed work is to develop an intelligent distributed fiber optical sensor system for real-time monitoring of high temperature in a boiler furnace in power plants. Of particular interest is the estimation of spatial and temporal distributions of high temperatures within a boiler furnace, which will be essential in assessing and controlling the mechanisms that form and remove pollutants at the source, such as NOx. The basic approach in developing the proposed sensor system is threefold: (1) development of a high temperature distributed fiber optical sensor capable of measuring temperatures greater than 2000 °C with spatial resolution of less than 1 cm; (2) development of distributed parameter system (DPS) models to map the three-dimensional (3D) temperature distribution for the furnace; and (3) development of an intelligent monitoring system for real-time monitoring of the 3D boiler temperature distribution. Under Task 1, the efforts focused on developing an innovative high temperature distributed fiber optic sensor by fabricating in-fiber gratings in single crystal sapphire fibers. So far, our major accomplishments include successfully growing alumina cladding layers on single crystal sapphire fibers, fabricating in-fiber gratings in these fibers, and developing a high temperature distributed fiber optic sensor. Under Task 2, the emphasis has been on putting into place a computational capability for simulation of combustors. A PC workstation was acquired with dual Xeon processors and sufficient memory to support 3-D calculations. An existing license for Fluent software was expanded to include two PC processors, where the existing license was for a Unix workstation. Under Task 3, intelligent state estimation theory is being developed which will map the set of 1D (located judiciously within a 3D environment) measurement data into a 3D temperature profile. This theory presents a semigroup

Commonwealth Research Corporation (CRC) and Argonne National Laboratory (ANL) are collaborating on a DOE-sponsored Cooperative Research and Development Agreement (CRADA) project to perform feasibility studies on a novel approach to Artificial Intelligence (AI) based diagnostics for component faults in nuclear power plants. Investigations are being performed in the construction of a first-principles physics-based plant level process diagnostic expert system (ES) and the identification of component-level fault patterns through operating component characteristics using artificial neural networks (ANNs). The purpose of the proof-of-concept project is to develop a computer-based system using this AI approach to assist process plant operators during off-normal plant conditions. The proposed computer-based system will use thermal hydraulic (T-H) signals complemented by other non-T-H signals available in the data stream to provide the process operator with the component which most likely caused the observed process disturbance. To demonstrate the scale-up feasibility of the proposed diagnostic system it is being developed for use with the Chemical Volume Control System (CVCS) of a nuclear power plant. A full-scope operator training simulator representing the Commonwealth Edison Braidwood nuclear power plant is being used both as the source of development data and as the means to evaluate the advantages of the proposed diagnostic system. This is an ongoing multi-year project and this paper presents the results to date of the CRADA phase.

Investigated the Luria-Nebraska Intellectual Processes Scale (IPS) as a predictor of Wechsler Adult Intelligence Scale (WAIS) IQs among alcoholic inpatients. Strong correlations were found between IPS and WAIS Verbal IQ and Full Scale IQ; however, the correlation with Performance IQ was only -.41. (NRB)

This quantitative study investigates teachers' perceptions of how Emotional Intelligence (EI) was utilised by their school principals to manage mandated curriculum change processes in schools in the Johannesburg North district of Gauteng in South Africa. Research shows that EI consists of a range of fundamental skills that could enable school…

Measures of emotional intelligence, vocational exploration, and career decision-making self-efficacy (CDMSE) were completed by 288 college students. Emotional intelligence was positively related to CDMSE. Utilization of feelings and self-control factors were inversely related to vocational exploration and commitment. Gender was not a moderator of…

The present study examined relationships between emotional intelligence, measured by the Mayer-Salovey-Caruso Emotional Intelligence Test, and right hemisphere dominance for a free vision chimeric face test. A sample of 122 ethnically diverse college students participated and completed online versions of the forenamed tests. A hierarchical…

Education in this 21st century is concerned with developing intelligences. Problem solving in real-world contexts involves multiple ways of knowing and learning. Intelligence in the real world involves not only learning how to do things effectively but also more importantly the ability to deal with novelty and growing our capacity to adapt, select…

For a commercially viable recombinant intracellular protein production process, efficient cell lysis and protein release is a major bottleneck. The recovery of a recombinant protein, cholesterol oxidase (COD), was studied in a continuous bead milling process. A full factorial response surface methodology (RSM) design was employed and compared to artificial neural networks coupled with a genetic algorithm (ANN-GA). Significant process variables, cell slurry feed rate (A), bead load (B), cell load (C), and run time (D), were investigated and optimized for maximizing COD recovery. RSM predicted an optimum of feed rate of 310.73 mL/h, bead loading of 79.9% (v/v), cell loading OD600 nm of 74, and run time of 29.9 min with a recovery of ~3.2 g/L. ANN-GA predicted a maximum COD recovery of ~3.5 g/L at an optimum feed rate (mL/h): 258.08, bead loading (%, v/v): 80%, cell loading (OD600 nm): 73.99, and run time of 32 min. An overall 3.7-fold increase in productivity is obtained when compared to a batch process. Optimization and comparison of statistical vs. artificial intelligence techniques in a continuous bead milling process has been attempted for the first time in our study. We were able to successfully represent the complex non-linear multivariable dependence of enzyme recovery on bead milling parameters. The quadratic second-order response functions are not flexible enough to represent such complex non-linear dependence. ANNs, being compositions of multiple layers, are capable of representing the complex non-linear dependence of variables, in this case enzyme recovery as a function of bead milling parameters. Since a GA can optimize even discontinuous functions, the present study provides an example of using machine learning (ANN) in combination with evolutionary optimization (GA) to represent undefined biological functions, as is the case for common industrial processes involving biological moieties. PMID:27920762
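
As a concrete illustration of the ANN-GA half of this comparison, the sketch below fits a small neural-network surrogate for recovery as a function of the four milling variables and then searches it with a toy real-coded genetic algorithm; the training data, variable ranges, GA operators, and hyperparameters are placeholder assumptions, not the study's experimental design.

```python
# Sketch of the ANN-GA idea: an MLP surrogate of COD recovery as a function of
# feed rate, bead load, cell load and run time, searched by a simple GA.
# Training data, variable ranges and GA settings are illustrative placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
bounds = np.array([[100, 400],   # feed rate (mL/h)
                   [50, 85],     # bead load (% v/v)
                   [40, 90],     # cell load (OD600)
                   [10, 40]])    # run time (min)

# Placeholder experimental design and responses (would come from the bead mill runs).
X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(60, 4))
y = (3.0 - 1e-5 * (X[:, 0] - 260) ** 2 - 2e-3 * (X[:, 1] - 80) ** 2
     + 0.01 * X[:, 3] + rng.normal(scale=0.05, size=60))

ann = MLPRegressor(hidden_layer_sizes=(8, 8), max_iter=5000, random_state=0).fit(X, y)

def ga_maximize(model, bounds, pop=40, gens=60, mut=0.1):
    """Tiny real-coded GA: tournament selection, blend crossover, Gaussian mutation."""
    P = rng.uniform(bounds[:, 0], bounds[:, 1], size=(pop, len(bounds)))
    for _ in range(gens):
        fit = model.predict(P)
        idx = [max(rng.choice(pop, 2), key=lambda i: fit[i]) for _ in range(pop)]
        parents = P[idx]
        children = (parents + parents[::-1]) / 2                      # blend crossover
        children += rng.normal(scale=mut, size=children.shape) * (bounds[:, 1] - bounds[:, 0])
        P = np.clip(children, bounds[:, 0], bounds[:, 1])
    best = P[np.argmax(model.predict(P))]
    return best, float(model.predict(best[None])[0])

x_opt, y_opt = ga_maximize(ann, bounds)
print("predicted optimum settings:", np.round(x_opt, 2), "recovery ~", round(y_opt, 2), "g/L")
```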

In the Senior Candidate Case Writing Seminar, the final component of the Writing as Pedagogy Program at the Columbia Center for Psychoanalytic Training and Research, candidates write about one of their longer training cases, with attention to the ways they use clinical theory, particularly transference and countertransference, to deepen their understanding of psychoanalytic process and therapeutic action. Building on the previous four years of the writing program, this seminar teaches advanced candidates to recognize and integrate the lived experience of conducting an analysis, the micro- and macroprocesses, and the theories most relevant to an understanding of the analytic work. The seminar emphasizes the challenge of dealing with the power of the transference, unrecognized or unacknowledged countertransference, and the nature of therapeutic action. Pedagogical emphasis is placed on peer group discussions and group learning, and common problems in integrating theory and practice are described and illustrated.

A rule-based computer system was developed to perform clinical decision-making support within a medical information system, oncology practice, and clinical research. This rule-based system, which has been programmed using deterministic rules, possesses features of generalizability, modularity of structure, convenience in rule acquisition, explainability, and utility for patient care and teaching, features which have been identified as advantages of artificial intelligence (AI) rule-based systems. Formal rules are primarily represented as conditional statements; common conditions and actions are stored in system dictionaries so that they can be recalled at any time to form new decision rules. Important similarities and differences exist in the structure of this system and clinical computer systems utilizing artificial intelligence (AI) production rule techniques. The non-AI rule-based system possesses advantages in cost and ease of implementation. The degree to which significant medical decision problems can be solved by this technique remains uncertain as does whether the more complex AI methodologies will be required. 15 references.

We investigated possible pathways into mental illness via the combined effects of trait emotional intelligence (trait EI), mindfulness, and irrational beliefs. The sample comprised 121 psychiatric outpatients (64.5% males, mean age = 38.8 years) with a variety of formal clinical diagnoses. Psychopathology was operationalized by means of 3 distinct indicators from the Millon Clinical Multi-Axial Inventory (mild pathology, severe pathology, and clinical symptomatology). A structural equation model confirmed significant direct trait EI and mindfulness effects on irrational beliefs and psychopathology. Trait EI also had a significant indirect effect on psychopathology via mindfulness. Together, the 3 constructs accounted for 44% of the variance in psychopathology. A series of hierarchical regressions demonstrated that trait EI is a stronger predictor of psychopathology than mindfulness and irrational beliefs combined. We conclude that the identified pathways can provide the basis for the development of safe and effective responses to the ongoing mental health and overmedication crises.

Over the past decade, it has become increasingly clear that although IQ and technical skills are important, emotional intelligence is the Sine Qua Non of leadership. According to Goleman [Goleman, D. (1998). What makes a leader? "Harvard Business Review," 93-102] "effective leaders are alike in one crucial way: they all have a high degree of…

Our reflection on the elements of existing Image Processing systems (currently Image Processing, symbol interpretation level, control mode, level of extracted features) and the corresponding use of Artificial Intelligence leads us to the definition of the SARPI system. This system performs the extraction of intermediate-level features. In the present first step of implementation, we limit ourselves to line segments. They are associated with a descriptor including several parameters: position, angle, length, cross contrast, ... and the precision of all of these parameters. SARPI applies to single or multiple feature detection; it finds the requested feature(s) and produces a total or partial (as requested) description. SARPI takes as input the set of requested parameters and the available values of some feature parameters (typically a qualitative measure of contrast). Its main part is a control module that automatically generates an Image Processing sequence to solve the problem (extraction of the requested feature parameters). Rules divide the problem into elementary subproblems according to the kind of input parameters. They allow the selection of a set of elementary functions according to the requested feature parameters and the known parameters; in this way, if the known information is insufficient, the control module selects and executes elementary functions that look for the missing information. Each of these elementary functions is pre-associated with image processing procedures and heuristics that select the appropriate procedures according to the values of the input parameters. The parameters of the image processes are controlled automatically by the precision requested for the feature parameters. In particular, the sampling steps of the parameters ρ and θ of the Hough transform are calculated from the requested precision of the feature parameters. The selected image processing operations are applied on a region of the image that is calculated from the approximated position of the
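
The point that the ρ and θ sampling steps follow from the requested parameter precision can be made concrete with a small accumulator; in the sketch below the bin widths are set directly from an assumed angular and positional precision, and the toy edge image and function names are ours rather than SARPI's.

```python
# Sketch: the Hough accumulator's sampling steps are derived from the precision
# requested for the line-segment parameters. Precision values and the toy edge
# points are illustrative.
import numpy as np

def hough_lines(edge_points, img_shape, angle_precision_deg=1.0, position_precision_px=2.0):
    """Accumulate (rho, theta) votes with bin widths set by the requested precision."""
    h, w = img_shape
    d_theta = np.deg2rad(angle_precision_deg)      # theta bin width from angular precision
    d_rho = position_precision_px                  # rho bin width from positional precision
    thetas = np.arange(0.0, np.pi, d_theta)
    rho_max = np.hypot(h, w)
    rhos = np.arange(-rho_max, rho_max, d_rho)
    acc = np.zeros((len(rhos), len(thetas)), dtype=np.int32)
    cos_t, sin_t = np.cos(thetas), np.sin(thetas)
    for y, x in edge_points:
        rho = x * cos_t + y * sin_t                # rho for every theta at once
        r_idx = np.clip(((rho + rho_max) / d_rho).astype(int), 0, len(rhos) - 1)
        acc[r_idx, np.arange(len(thetas))] += 1
    best = np.unravel_index(acc.argmax(), acc.shape)
    return rhos[best[0]], np.rad2deg(thetas[best[1]]), acc[best]

# Toy example: points along the line y = x in a 100x100 image.
pts = [(i, i) for i in range(100)]
rho, theta_deg, votes = hough_lines(pts, (100, 100), angle_precision_deg=0.5, position_precision_px=1.0)
print(f"strongest line: rho={rho:.1f}px, theta={theta_deg:.1f}deg, votes={votes}")
```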

Today's imaging diagnosis needs to adapt modern techniques of quality engineering to maintain and improve its accuracy and reliability in the health care system. One of the main factors that influences the diagnostic accuracy of plain film X-ray in detecting pathology is the level of film exposure. If the level of film exposure is not adequate, a normal body structure may be interpreted as pathology and vice versa. This not only influences patient management but also has an impact on health care cost and the patient's quality of life. Therefore, providing an accurate and high quality image is the first step toward excellent patient management in any health care system. In this paper, we study these techniques and also present a fuzzy intelligent quality monitoring model, which can be used to keep variables from degrading the image quality. The variables derived from chemical activity, cleaning procedures, maintenance, and monitoring may not be sensed, measured, or calculated precisely due to uncertain situations. Therefore, a gamma-level fuzzy Bayesian model for quality monitoring of the imaging process is proposed. In order to apply the Bayesian concept, the fuzzy quality characteristics are assumed to be fuzzy random variables. Using the fuzzy quality characteristics, the newly developed model calculates the degradation risk for image processing. A numerical example is also presented to demonstrate the application of the model.

Holographic PIV (HPIV) is a promising 3D velocity field measurement technique providing the high spatial-temporal resolution needed for understanding complex and turbulent flows. An HPIV system, combining in-line recording and off-axis viewing (IROV) holography and the Heuristic Morphology Particle Pairing (HMPP) method, is being developed in this work. Unlike 2D PIV, HPIV instantaneously records a volume of particle images through holographic imaging. Its data processing involves special difficulties such as speckle noise, sparse pairs and large data sets. The HMPP algorithm is an adaptive parallel processing scheme applying artificial intelligence searching theory. Based on the similar morphology of a particle group at successive instants separated by a small interval, HMPP matches a group of particle images between double exposures and provides velocity vectors for individual particle pairs, providing much higher spatial resolution than conventional correlation algorithms and lower measurement error caused by large velocity gradients. Taking advantage of IROV and HMPP, the system being developed appears highly promising as a practical HPIV configuration.

In an effort to strengthen patient safety, leadership at the University of Kentucky HealthCare (UKHC) decided to replace its traditional approach to root cause analysis (RCA) with a process based on swarm intelligence, a concept borrowed from other industries. Under this process, when a problem or error is identified, staff quickly hold a swarm--a meeting in which all those involved in the incident or problem quickly evaluate why the issue occurred and identify potential solutions for implementation. A pillar of the swarm concept is a mandate that there be no punishments or finger-pointing during the swarms. The idea is to encourage staff to be forthcoming to achieve effective solutions. Typically, swarms last for one hour and result in action plans designed to correct problems or deficiencies within a specific period of time. The ED was one of the first areas where UKHC applied swarms. For example, hospital administrators note that the approach has been used to address issues involving patient flow, triage protocols, assessments, overcrowding, and boarding. After seven years, incident reporting at UKHC has increased by 52%, and the health system has experienced a 37% decrease in the observed-to-expected mortality ratio.

With the increased trend of automation in modern manufacturing industry, the human intervention in routine, repetitive and data-specific activities of manufacturing is greatly reduced. In this paper, an attempt has been made to reduce the human intervention in the selection of the optimal cutting tool and process parameters for metal cutting applications, using Artificial Intelligence techniques. Generally, the selection of an appropriate cutting tool and parameters in metal cutting is carried out by an experienced technician or cutting tool expert based on their knowledge or an extensive search of a huge cutting tool database. The present proposed approach replaces the existing practice of physically searching for tools in databooks and tool catalogues with an intelligent knowledge-based selection system. This system employs artificial intelligence based techniques such as artificial neural networks, fuzzy logic and genetic algorithms for decision making and optimization. This intelligence based optimal tool selection strategy was developed using Mathworks Matlab Version 7.11.0 and implemented. The cutting tool database was obtained from the tool catalogues of different tool manufacturers. This paper discusses in detail the methodology and strategies employed for selection of the appropriate cutting tool and optimization of process parameters based on multi-objective optimization criteria considering material removal rate, tool life and tool cost.

An eddy current sensor system has been developed to monitor densification during hot isostatic pressing (HIP) of Spraycast-X® superalloy components for aerospace applications. The sensor system was designed, implemented and demonstrated by MATSYS personnel at the Howmet Corporation HIP facility. The eddy current sensor was used to monitor densification of Spraycast-X® Rene 41 ring segments from 95 to 98 percent relative density to full density. The sensor data were verified and validated by metallographic examinations of HIPed specimens. The grain size of the Spraycast-X® Rene 41 was not affected by HIP at both 1,066 C (1,950 F) and 1,121 C (2,050 F). Tensile strengths and 0.2% creep rupture properties were not sensitive to changes in HIP processing conditions. However, tensile ductilities and low cycle fatigue properties showed a strong correlation to HIP time at 1,121 C/103 MPa (2,050 F/15 KSI). As hold time at maximum temperature and pressure was increased from 1 to 4 hours, tensile ductilities and low cycle fatigue lives increased. The sensor system can be integrated with an intelligent closed loop control system to monitor and control densification rate and shape distortion.

Steam distillation, as one of the important mechanisms, has a great role in oil recovery in thermal methods, and so it is important to simulate this process experimentally and theoretically. In this work, the simulation of steam distillation is performed on sixteen sets of crude oil data found in the literature. Artificial intelligence (AI) tools such as the artificial neural network (ANN) and the adaptive neuro-fuzzy inference system (ANFIS) are used in this study as effective methods to simulate the distillate recoveries of these sets of data. Thirteen sets of data were used to train the models and three sets were used to test the models. The developed models are highly compatible with respect to input oil properties and can predict the distillate yield with minimum entry. To show the performance of the proposed models, simulation of steam distillation is also done using a modified Peng-Robinson equation of state. Comparison between the calculated distillates by the ANFIS and neural network models and also the equation of state-based method indicates that the errors of the ANFIS model for the training and test data sets are lower than those of the other methods. PMID:24883365

The need for accurate modeling of the rainfall-runoff process has grown rapidly in the past decades. However, considering the highly stochastic nature of the process, many models are still being developed in order to define such a complex phenomenon. Recently, Artificial Intelligence (AI) techniques such as the Artificial Neural Network (ANN) and the Adaptive Neuro-Fuzzy Inference System (ANFIS) have been extensively used by hydrologists for rainfall-runoff modeling as well as for other fields of hydrology. In this paper, two hybrid AI-based models which are reliable in capturing the periodicity features of the process are introduced for watershed rainfall-runoff modeling. In the first model, the SARIMAX (Seasonal Auto Regressive Integrated Moving Average with exogenous input)-ANN model, an ANN is used to find the non-linear relationship among the residuals of the fitted linear SARIMAX model. In the second model, the wavelet-ANFIS model, the wavelet transform is linked to the ANFIS concept and the main time series of the two variables (rainfall and runoff) are decomposed into several multi-frequency time series by the wavelet transform. Afterwards, these time series are imposed as input data to the ANFIS to predict the runoff discharge one time step ahead. The results of applying the models to the rainfall-runoff modeling of two watersheds (located in Azerbaijan, Iran) show that, although the proposed models can predict both short- and long-term runoff discharges by considering seasonality effects, the second model is relatively more appropriate because it uses the multi-scale time series of rainfall and runoff data in the ANFIS input layer.
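
A minimal sketch of the first hybrid (linear SARIMAX plus an ANN on its residuals) is given below, using statsmodels and scikit-learn on a synthetic monthly rainfall-runoff series; the model orders, residual lags, and data are illustrative assumptions rather than the paper's configuration, and the wavelet-ANFIS variant is not shown.

```python
# Sketch of the SARIMAX-ANN hybrid: a linear seasonal SARIMAX model is fitted to
# runoff with rainfall as the exogenous input, and an ANN then models the
# nonlinear structure left in the residuals. Orders, lags and data are placeholders.
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)
n = 240                                         # e.g. 20 years of monthly data
t = np.arange(n)
rain = 50 + 30 * np.sin(2 * np.pi * t / 12) + rng.normal(scale=5, size=n)
runoff = 0.4 * rain + 0.01 * rain ** 1.2 + rng.normal(scale=3, size=n)

# Step 1: linear seasonal model with rainfall as exogenous input.
sarimax = SARIMAX(runoff, exog=rain, order=(1, 0, 1),
                  seasonal_order=(1, 0, 1, 12)).fit(disp=False)
residuals = runoff - sarimax.fittedvalues

# Step 2: ANN on lagged residuals to capture what the linear model missed.
lags = 3
X = np.column_stack([residuals[i:n - lags + i] for i in range(lags)])
y = residuals[lags:]
ann = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0).fit(X, y)

# Hybrid one-step-ahead fit = linear part + ANN correction.
hybrid = sarimax.fittedvalues[lags:] + ann.predict(X)
rmse = np.sqrt(np.mean((runoff[lags:] - hybrid) ** 2))
print("hybrid in-sample RMSE:", round(rmse, 2))
```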

The purpose of this study was to analyze selected cognitive theories in the areas of artificial intelligence (A.I.) and psychology to determine the role of emotions in the cognitive or intellectual processes. Understanding the relationship of emotions to processes of intelligence has implications for constructing theories of aesthetic response and A.I. systems in art. Psychological theories were examined that demonstrated the changing nature of the research in emotion related to cognition. The basic techniques in A.I. were reviewed and the A.I. research was analyzed to determine the process of cognition and the role of emotion. The A.I. research emphasized the digital, quantifiable character of the computer and associated cognitive models and programs. In conclusion, the cognitive-emotive research in psychology and the cognitive research in A.I. emphasized quantification methods over analog and qualitative characteristics required for a holistic explanation of cognition. Further A.I. research needs to examine the qualitative aspects of values, attitudes, and beliefs on influencing the creative thinking processes. Inclusion of research related to qualitative problem solving in art provides a more comprehensive base of study for examining the area of intelligence in computers.

New models in clinical management seek a clinical practice based on quality, efficacy and efficiency, avoiding variability and improvisation. In this paper we develop one of the most frequent clinical processes in our specialty, the process based on DRG 311, or transurethral procedures without complications. We describe its components: stabilization form, clinical trajectory, cost calculation, and finally the process flowchart.

Open source is a still unexploited chance for healthcare organizations and technology providers to answer to a growing demand for innovation and to join economical benefits with a new way of managing hospital information systems. This chapter will present the case of the web enterprise clinical portal developed in Italy by Niguarda Hospital in Milan with the support of Fondazione Politecnico di Milano, to enable a paperless environment for clinical and administrative activities in the ward. This represents also one rare case of open source technology and reuse in the healthcare sector, as the system's porting is now taking place at Besta Neurological Institute in Milan. This institute is customizing the portal to feed researchers with structured clinical data collected in its portal's patient records, so that they can be analyzed, e.g., through business intelligence tools. Both organizational and clinical advantages are investigated, from process monitoring, to semantic data structuring, to recognition of common patterns in care processes.

Current research on strategies of controlling product properties during processing is examined in reviews and reports. Problems discussed include predictive modeling, advanced sensing, and intelligent control. Particular attention is given to design and manufacturing of advanced materials and structures, intelligent control of carbon-carbon pyrolysis, computer simulation of crystal growth, modeling of phase change phenomena in boundary fitted coordinates, applications of optimization modeling techniques to materials processing, the effect of carbonization kinetics on in-process mechanical properties, nondestructive characterization and strength of solid-solid bond, and acoustic emission and ultrasonic sensing. Consideration is also given to collective learning systems for automatic control, a multicomponent knowledge base for spray casting process control, optimal control of microstructure during near-net shape processing, intelligent control of arc welding, and an intelligent control architecture for carbonization science.

Throughout the history of the artificial intelligence movement, researchers have strived to create computers that could simulate general human intelligence. This paper argues that workers in artificial intelligence have failed to achieve this goal because they adopted the wrong model of human behavior and intelligence, namely a cognitive essentialist model with origins in the traditional philosophies of natural intelligence. An analysis of the word “intelligence” suggests that it originally referred to behavior-environment relations and not to inferred internal structures and processes. It is concluded that if workers in artificial intelligence are to succeed in their general goal, then they must design machines that are adaptive, that is, that can learn. Thus, artificial intelligence researchers must discard their essentialist model of natural intelligence and adopt a selectionist model instead. Such a strategic change should lead them to the science of behavior analysis. PMID:22477051

Nowadays, there are molecular biology techniques providing information related to cervical cancer and its cause: the human Papillomavirus (HPV), including DNA microarrays identifying HPV subtypes, mRNA techniques such as nucleic acid based amplification or flow cytometry identifying E6/E7 oncogenes, and immunocytochemistry techniques such as overexpression of p16. Each one of these techniques has its own performance, limitations and advantages, thus a combinatorial approach via computational intelligence methods could exploit the benefits of each method and produce more accurate results. In this article we propose a clinical decision support system (CDSS), composed by artificial neural networks, intelligently combining the results of classic and ancillary techniques for diagnostic accuracy improvement. We evaluated this method on 740 cases with complete series of cytological assessment, molecular tests, and colposcopy examination. The CDSS demonstrated high sensitivity (89.4%), high specificity (97.1%), high positive predictive value (89.4%), and high negative predictive value (97.1%), for detecting cervical intraepithelial neoplasia grade 2 or worse (CIN2+). In comparison to the tests involved in this study and their combinations, the CDSS produced the most balanced results in terms of sensitivity, specificity, PPV, and NPV. The proposed system may reduce the referral rate for colposcopy and guide personalised management and therapeutic interventions. PMID:24812614
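
The combiner described here can be sketched as a small neural-network classifier over the individual test outcomes, evaluated with the same four diagnostic metrics; the synthetic data, feature coding, and network size below are illustrative assumptions, not the study's dataset or architecture.

```python
# Sketch of a neural-network combiner: binary outputs of several tests (e.g.
# cytology, HPV DNA typing, mRNA E6/E7, p16) feed an MLP that predicts CIN2+
# status. Data and thresholds are synthetic placeholders.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(7)
n = 740
X = rng.integers(0, 2, size=(n, 4)).astype(float)      # four test results, 0 = negative, 1 = positive
logits = -2.5 + 1.5 * X[:, 0] + 1.2 * X[:, 1] + 1.0 * X[:, 2] + 0.8 * X[:, 3]
y = rng.random(n) < 1 / (1 + np.exp(-logits))           # synthetic CIN2+ ground truth

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=3000, random_state=0).fit(X_tr, y_tr)

tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
print("sensitivity:", round(tp / (tp + fn), 3))
print("specificity:", round(tn / (tn + fp), 3))
print("PPV:", round(tp / (tp + fp), 3))
print("NPV:", round(tn / (tn + fn), 3))
```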

The aim of this study was to assess the relative importance of cochlear mechanical dysfunction, temporal processing deficits, and age on the ability of hearing-impaired listeners to understand speech in noisy backgrounds. Sixty-eight listeners took part in the study. They were provided with linear, frequency-specific amplification to compensate for their audiometric losses, and intelligibility was assessed for speech-shaped noise (SSN) and a time-reversed two-talker masker (R2TM). Behavioral estimates of cochlear gain loss and residual compression were available from a previous study and were used as indicators of cochlear mechanical dysfunction. Temporal processing abilities were assessed using frequency modulation detection thresholds. Age, audiometric thresholds, and the difference between audiometric threshold and cochlear gain loss were also included in the analyses. Stepwise multiple linear regression models were used to assess the relative importance of the various factors for intelligibility. Results showed that (a) cochlear gain loss was unrelated to intelligibility, (b) residual cochlear compression was related to intelligibility in SSN but not in a R2TM, (c) temporal processing was strongly related to intelligibility in a R2TM and much less so in SSN, and (d) age per se impaired intelligibility. In summary, all factors affected intelligibility, but their relative importance varied across maskers. PMID:27604779
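
A forward-selection sketch of the kind of stepwise multiple linear regression used here is shown below, with placeholder predictor names and synthetic data; the entry criterion (p < 0.05) and the variables are assumptions for illustration only.

```python
# Sketch of forward-stepwise multiple linear regression for ranking predictors of
# speech intelligibility (SRT). Predictor names and data are placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 68
df = pd.DataFrame({
    "gain_loss": rng.normal(size=n),
    "compression": rng.normal(size=n),
    "fm_threshold": rng.normal(size=n),     # temporal-processing proxy
    "age": rng.normal(size=n),
})
df["srt"] = 0.1 * df.gain_loss + 0.6 * df.fm_threshold + 0.3 * df.age + rng.normal(scale=0.5, size=n)

def forward_stepwise(df, target, alpha_in=0.05):
    """Add, at each step, the predictor with the smallest p-value below alpha_in."""
    remaining = [c for c in df.columns if c != target]
    selected = []
    while remaining:
        pvals = {}
        for cand in remaining:
            X = sm.add_constant(df[selected + [cand]])
            pvals[cand] = sm.OLS(df[target], X).fit().pvalues[cand]
        best = min(pvals, key=pvals.get)
        if pvals[best] > alpha_in:
            break
        selected.append(best)
        remaining.remove(best)
    return selected

print("selected predictors:", forward_stepwise(df, "srt"))
```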

The general operation of KATE, an artificial intelligence controller, is outlined. A shuttle environmental control system (ECS) demonstration system for KATE is explained. The knowledge base model for this system is derived. An experimental test procedure is given to verify parameters in the model.

Compares the performances on psychometric and Piagetian measures of intelligence at school entry of children of different ethnic groups living within a dominant Western-type culture. In general, few significant differences between the ethnic groups are found. (Author/AM)

the FISU Budget...The Evolution of the MDI Budget...high-tech equipment, economic, political, social, and military analysts, as well as recruited agents around the world. As it was announced in...Information Operations Groups (FIOG); and in Australia, it is the Defense Signal Directorate (DSD). In an effort to make national intelligence

This research examined if mothers' day-to-day praise of children's success in school plays a role in children's theory of intelligence and motivation. Participants were 120 children (mean age = 10.23 years) and their mothers who took part in a 2-wave study spanning 6 months. During the first wave, mothers completed a 10-day daily interview in…

The relation between general intelligence (psychometric "g") and temporal resolution capacity of the central nervous system was examined by assessing performance on eight different temporal tasks in a sample of 100 participants. Correlational and principal component analyses suggested a unitary timing mechanism, referred to as temporal "g".…

significant equipment refurbishing needed for installation of multiple separate ellipsometric systems, and development of customized software to control all of them simultaneously. The proposed optical monitoring system comprises AccuStrata’s fiber optics sensors installed inside the thin film deposition equipment, a hardware module of different components (beyond the scope of this project) and our software program with iterative predicting capability able to control material bandgap and surface roughness as films are deposited. Our miniature fiber optics monitoring sensors are installed inside the vacuum chamber compartments in very close proximity where the independent layers are deposited (an option patented by us in 2003). The optical monitoring system measures two of the most important parameters of the photovoltaic thin films during deposition on a moving solar panel - material bandgap and surface roughness. In this program each sensor array consists of two fiber optics sensors monitoring two independent areas of the panel under deposition. Based on the monitored parameters and their change in time and from position to position on the panel, the system is able to provide to the equipment operator immediate information about the thin films as they are deposited. This DoE Supply Chain program is considered the first step towards the development of intelligent optical control system capable of dynamically adjusting the manufacturing process “on-the-fly” in order to achieve better performance. The proposed system will improve the thin film solar cell manufacturing by improving the quality of the individual solar cells and will allow for the manufacturing of more consistent and uniform products resulting in higher solar conversion efficiency and manufacturing yield. It will have a significant impact on the multibillion-dollar thin film solar market. We estimate that the financial impact of these improvements if adopted by only 10% of the industry ($7.7 Billion) would

The overall objective of this project is to develop sensor-integrated 'intelligent' diamond wheels for grinding ceramics. Such wheels will be 'smart' enough to monitor and supervise both the wheel preparation and grinding processes without the need to instrument the machine tool. Intelligent wheels will utilize re-useable cores integrated with two types of sensors: acoustic emission (AE) and dynamic force transducers. Signals from the sensors will be transmitted from a rotating wheel to a receiver by telemetry. Intelligent wheels will be 'trained' to recognize distinct characteristics associated with truing, dressing and grinding.

A comprehensive evaluation of eight signal pre-processing strategies, including directional microphones, coherence filters, single-channel noise reduction, binaural beamformers, and their combinations, was undertaken with normal-hearing (NH) and hearing-impaired (HI) listeners. Speech reception thresholds (SRTs) were measured in three noise scenarios (multitalker babble, cafeteria noise, and single competing talker). Predictions of three common instrumental measures were compared with the general perceptual benefit caused by the algorithms. The individual SRTs measured without pre-processing and individual benefits were objectively estimated using the binaural speech intelligibility model. Ten listeners with NH and 12 HI listeners participated. The participants varied in age and pure-tone threshold levels. Although HI listeners required a better signal-to-noise ratio to obtain 50% intelligibility than listeners with NH, no differences in SRT benefit from the different algorithms were found between the two groups. With the exception of single-channel noise reduction, all algorithms showed an improvement in SRT of between 2.1 dB (in cafeteria noise) and 4.8 dB (in single competing talker condition). Model predictions with binaural speech intelligibility model explained 83% of the measured variance of the individual SRTs in the no pre-processing condition. Regarding the benefit from the algorithms, the instrumental measures were not able to predict the perceptual data in all tested noise conditions. The comparable benefit observed for both groups suggests a possible application of noise reduction schemes for listeners with different hearing status. Although the model can predict the individual SRTs without pre-processing, further development is necessary to predict the benefits obtained from the algorithms at an individual level.

The purpose of this exploratory qualitative study was to determine the reasoning processes used by paramedics to solve clinical problems. Existing research documents concern over the accuracy of paramedics' clinical decision-making, but no research was found that examines the cognitive processes by which paramedics make either faulty or accurate…

This dissertation has addressed the broad hypothesis as to whether building mathematical models is useful as a tool for translating physiological knowledge into clinical practice. In doing so it describes work on the INtelligent VENTilator project (INVENT), the goal of which is to build, evaluate and integrate into clinical practice a model-based decision support system for control of mechanical ventilation. The dissertation describes the mathematical models included in INVENT, i.e. a model of pulmonary gas exchange focusing on oxygen transport, and a model of the acid-base status of blood, interstitial fluid and tissues. These models have been validated, and applied in two other systems: ALPE, a system for measuring pulmonary gas exchange, and ARTY, a system for arterialisation of the acid-base and oxygen status of peripheral venous blood. The major contributions of this work are as follows. A mathematical model has been developed which can describe pulmonary gas exchange more accurately than current clinical techniques. This model is parsimonious in that it can describe pulmonary gas exchange from measurements easily available in the clinic, along with a readily automatable variation in F(I)O(2). This technique and model have been developed into a research and commercial tool (ALPE), and evaluated both in the clinical setting and when compared to the reference multiple inert gas elimination technique (MIGET). Mathematical models have been developed of the acid-base chemistry of blood, interstitial fluid and tissues, with these models formulated using a mass-action mass-balance approach. The model of blood has been validated against literature data describing the addition and removal of CO(2), strong acid or base, and haemoglobin; and the effects of oxygenation or deoxygenation. The model has also been validated in new studies, and shown to simulate accurately and precisely the mixing of blood samples at different PCO(2) and PO(2) levels. This model of acid

Despite the accelerating pace of scientific discovery, the current clinical research enterprise does not sufficiently address pressing clinical questions. Given the constraints on clinical trials, for a majority of clinical questions, the only relevant data available to aid in decision making are based on observation and experience. Our purpose here is 3-fold. First, we describe the classic context of medical research guided by Popper's scientific epistemology of "falsificationism." Second, we discuss challenges and shortcomings of randomized controlled trials and present the potential of observational studies based on big data. Third, we cover several obstacles related to the use of observational (retrospective) data in clinical studies. We conclude that randomized controlled trials are not at risk for extinction, but innovations in statistics, machine learning, and big data analytics may generate a completely new ecosystem for exploration and validation. PMID:27383622

and emoticons. In addition to its unstructured nature, the sheer volume of social media communications sent during events of interest is overwhelming...time information contributed by large communities of social media users can be researched and searched for sources of information relevant to...national security. There is increasing acceptance within the intelligence community of the potential value of social media, but understandably some

The major philosophical issues surrounding the concept of intelligence are reviewed with respect to the problems surrounding the process of defining and developing artificial intelligence (AI) in computers. Various current definitions and problems with these definitions are presented. (MP)

A number of research communities recognize Artificial Intelligence (AI) as a valid reference discipline. However, several papers have criticized AI's research methodologies. This paper attempts to clarify and improve the methods used in AI. Definitions are proposed for terms such as AI theory, principles, hypotheses, and observations. Next, a unified view of AI research methodology is proposed. This methodology contains a long term dimension based upon the scientific method and an individual project dimension. The individual project dimension identifies four strategies: Hypothetical/deductive, hermeneutical/inductive, case-based, and historical analysis. The strategies differ according to how prototyping is used in an experiment. 78 refs.

and treatment of detainees are quite explicit and form the basis for Chapters three and five in FM 27-10. Furthermore, according to Dworkin, the U.S...stated by Dworkin, "The war in Iraq was covered by the Geneva Conventions, and the United States has accepted that all Iraqi prisoners were either POWs...Intelligence Support to Military Operations, Joint Pub 2-01 (Washington, D.C.: U.S. Joint Chiefs of Staff, 7 October 2004), G-4. Anthony Dworkin

two or more) speakers. These readings were recorded on audiotape and then digitized at 10 kHz, 16 bits/sample. The a.. input test data was generated...signal. The result that the S1 output is intelligible is expected because exciting the desired speech envelope with only random noise is known to...1015, 1982. "Methods for the Calculation of the Articulation Index", ANSI S3.5, 1969. C.I. Berlin & M.R. McNeil, "Dichotic Listening", in Issues in...

This study aims to present the re-signification process of the meanings of nurses' clinical practice in primary care from the perspective of extended clinic and permanent education. An intervention research was carried out with the approval of an ethics committee. Nine nurses participated in reflection groups from September to December 2008 in Ribeirão Preto-SP-Brazil. The redefinition process of the meanings proposed by the institutional analysis was mapped. The results point out that the nurses perceive differences in clinical work, by acknowledging the sense of user-centered clinical practice; daily limits and tensions and the need for support from managers and the team to deal with users' problems and situations. They identify the necessity to open space in the schedule to do that. It was concluded that nurses' clinical practice is being consolidated, and that collective analysis processes permit learning and the reconstruction of practices.

Recent theories on how listeners maintain perceptual invariance despite variation in the speech signal allocate a prominent role to imitation mechanisms. Notably, these simulation accounts propose that motor mechanisms support perception of ambiguous or noisy signals. Indeed, imitation of ambiguous signals, e.g., accented speech, has been found to aid effective speech comprehension. Here, we explored the possibility that imitation in speech benefits perception by increasing activation in speech perception and production areas. Participants rated the intelligibility of sentences spoken in an unfamiliar accent of Dutch in a functional Magnetic Resonance Imaging experiment. Next, participants in one group repeated the sentences in their own accent, while a second group vocally imitated the accent. Finally, both groups rated the intelligibility of accented sentences in a post-test. The neuroimaging results showed an interaction between type of training and pre- and post-test sessions in left Inferior Frontal Gyrus, Supplementary Motor Area, and left Superior Temporal Sulcus. Although alternative explanations such as task engagement and fatigue need to be considered as well, the results suggest that imitation may aid effective speech comprehension by supporting sensorimotor integration. PMID:24109447

We describe the design of a reliable, user-friendly preprototype system for quantifying the tendon stretch reflexes in humans and large mammals. A hand-held, instrumented reflex gun, the impactor of which contains a single force sensor, interfaces with a computer. The resulting test system can deliver sequences of reproducible stimuli at graded intensities and adjustable durations to a muscle's tendon ("tendon taps"), measure the impacting force of each tap, and record the subsequent reflex muscle contraction from the same tendon -- all automatically. The parameters of the reflex muscle contraction include latency; mechanical threshold; and peak time, peak magnitude, and settling time. The results of clinical tests presented in this paper illustrate the system's potential usefulness in detecting neurologic dysfunction affecting the tendon stretch reflexes, in documenting the course of neurologic illnesses and their response to therapy, and in clinical and laboratory neurologic research.
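
The contraction parameters listed here (latency, peak time, peak magnitude, settling time) can be extracted from a recorded trace with a few lines of array processing; the sketch below uses a synthetic response and threshold fractions chosen by us, not the preprototype system's actual algorithm.

```python
# Sketch of extracting reflex-contraction parameters (latency, peak time, peak
# magnitude, settling time) from a response trace. Trace and thresholds are
# illustrative assumptions.
import numpy as np

def reflex_parameters(t, response, tap_time, onset_frac=0.05, settle_frac=0.10):
    """Latency, peak time/magnitude, and settling time relative to the tendon tap."""
    peak_idx = np.argmax(response)
    peak_mag = response[peak_idx]
    onset_idx = np.argmax(response > onset_frac * peak_mag)      # first crossing of onset threshold
    # Settling: last time the response is still above settle_frac of the peak.
    above = np.where(response > settle_frac * peak_mag)[0]
    settle_idx = above[-1] if above.size else peak_idx
    return {
        "latency_ms": 1000 * (t[onset_idx] - tap_time),
        "peak_time_ms": 1000 * (t[peak_idx] - tap_time),
        "peak_magnitude": peak_mag,
        "settling_time_ms": 1000 * (t[settle_idx] - tap_time),
    }

# Synthetic contraction: quiet baseline, then a damped response ~30 ms after the tap.
fs, tap_time = 2000, 0.050
t = np.arange(0, 0.5, 1 / fs)
resp = np.where(t > tap_time + 0.030,
                (t - tap_time - 0.030) * np.exp(-(t - tap_time - 0.030) / 0.04), 0.0)
print(reflex_parameters(t, resp, tap_time))
```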

Nowadays large databases of clinical process data exist in hospitals. However, these data are rarely used to their full extent. In order to perform queries on hospital processes, one must either choose from the predefined queries or develop queries using an MS Excel-type software system, which is not always a trivial task. In this paper we propose a new query language for analyzing clinical processes that is easily perceptible also by non-IT professionals. We develop this language based on a process modeling language which is also described in this paper. Prototypes of both languages have already been verified using real examples from hospitals.

Managing anemia in hemodialysis patients can be challenging because of competing therapeutic targets and individual variability. Because therapy recommendations provided by a decision support system can benefit both patients and doctors, we evaluated the impact of an artificial intelligence decision support system, the Anemia Control Model (ACM), on anemia outcomes. Based on patient profiles, the ACM was built to recommend suitable erythropoietic-stimulating agent doses. Our retrospective study consisted of a 12-month control phase (standard anemia care), followed by a 12-month observation phase (ACM-guided care) encompassing 752 patients undergoing hemodialysis therapy in 3 NephroCare clinics located in separate countries. The percentage of hemoglobin values on target, the median darbepoetin dose, and individual hemoglobin fluctuation (estimated from the intrapatient hemoglobin standard deviation) were deemed primary outcomes. In the observation phase, median darbepoetin consumption significantly decreased from 0.63 to 0.46 μg/kg/month, whereas on-target hemoglobin values significantly increased from 70.6% to 76.6%, reaching 83.2% when the ACM suggestions were implemented. Moreover, ACM introduction led to a significant decrease in hemoglobin fluctuation (intrapatient standard deviation decreased from 0.95 g/dl to 0.83 g/dl). Thus, ACM support helped improve anemia outcomes of hemodialysis patients, minimizing erythropoietic-stimulating agent use with the potential to reduce the cost of treatment.
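
The three primary outcomes named here can be computed from routine per-patient records in a few lines of pandas; the sketch below uses synthetic monthly data, placeholder column names, and an assumed 10-12 g/dl target range purely for illustration.

```python
# Sketch of the outcome calculations: percentage of hemoglobin values in the
# target range, median darbepoetin dose, and intrapatient hemoglobin standard
# deviation. Data, column names and target range are illustrative.
import numpy as np
import pandas as pd

rng = np.random.default_rng(11)
records = pd.DataFrame({
    "patient": np.repeat(np.arange(50), 12),                      # 50 patients, 12 monthly values
    "hb_g_dl": rng.normal(11.0, 0.9, size=600),
    "darbepoetin_ug_kg_month": rng.gamma(2.0, 0.25, size=600),
})

target_lo, target_hi = 10.0, 12.0                                 # assumed target range
on_target = records.hb_g_dl.between(target_lo, target_hi).mean() * 100
median_dose = records.darbepoetin_ug_kg_month.median()
hb_fluctuation = records.groupby("patient").hb_g_dl.std().median()

print(f"Hb on target: {on_target:.1f}%")
print(f"median darbepoetin dose: {median_dose:.2f} ug/kg/month")
print(f"median intrapatient Hb SD: {hb_fluctuation:.2f} g/dl")
```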

Process mining is an emerging technology in the context of Business Process Management with the goal to derive process models from observed system behavior. The global goals are: to detect previously unknown process structures, to implement consistent process controlling, which may involve computation of realistic cycle times and the frequency of occurrence of process pathways, or to quantify conformance to guidelines. We did a detailed hands-on evaluation and analysis of established process-mining approaches and assessed their abilities to cope with the challenges of clinical environments. None of the examined 7 approaches fulfilled all requirements, but 2 could be singled out which are to some degree suitable for clinical process mining.
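
None of the specific approaches evaluated in that study is reproduced here, but the most elementary process-mining step they share, deriving a directly-follows graph with transition frequencies from an event log, can be sketched in a few lines; the clinical-pathway log below is synthetic.

```python
# Sketch of the most basic process-mining step: a directly-follows graph with
# transition frequencies built from an event log of (case id, activity) rows.
from collections import Counter

event_log = [
    ("case1", "admission"), ("case1", "triage"), ("case1", "lab test"), ("case1", "discharge"),
    ("case2", "admission"), ("case2", "lab test"), ("case2", "triage"), ("case2", "discharge"),
    ("case3", "admission"), ("case3", "triage"), ("case3", "discharge"),
]

def directly_follows(log):
    """Count how often activity b directly follows activity a within the same case."""
    by_case = {}
    for case, activity in log:               # events are assumed already ordered in time
        by_case.setdefault(case, []).append(activity)
    dfg = Counter()
    for trace in by_case.values():
        dfg.update(zip(trace, trace[1:]))
    return dfg

for (a, b), count in sorted(directly_follows(event_log).items()):
    print(f"{a} -> {b}: {count}")
```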

The objective of the proposed work is to develop an intelligent distributed fiber optical sensor system for real-time monitoring of high temperature in a boiler furnace in power plants. Of particular interest is the estimation of spatial and temporal distributions of high temperatures within a boiler furnace, which will be essential in assessing and controlling the mechanisms that form and remove pollutants at the source, such as NOx. The basic approach in developing the proposed sensor system is threefold: (1) development of a high temperature distributed fiber optical sensor capable of measuring temperatures greater than 2000 °C with spatial resolution of less than 1 cm; (2) development of distributed parameter system (DPS) models to map the three-dimensional (3D) temperature distribution for the furnace; and (3) development of an intelligent monitoring system for real-time monitoring of the 3D boiler temperature distribution. Under Task 1, we have set up a dedicated high power, ultrafast laser system for fabricating in-fiber gratings in harsh environment optical fibers, successfully fabricated gratings in single crystal sapphire fibers by the high power laser system, and developed highly sensitive long period gratings (LPG) by electric arc. Under Task 2, relevant mathematical modeling studies of NOx formation in practical combustors have been completed. Studies show that in boiler systems with no swirl, the distributed temperature sensor may provide information sufficient to predict trends of NOx at the boiler exit. Under Task 3, we have investigated a mathematical approach to extrapolation of the temperature distribution within a power plant boiler facility, using a combination of a modified neural network architecture and semigroup theory. Given a set of empirical data with no analytic expression, we first developed an analytic description and then extended that model along a single axis.

The objective of the proposed work is to develop an intelligent distributed fiber optical sensor system for real-time monitoring of high temperature in a boiler furnace in power plants. Of particular interest is the estimation of spatial and temporal distributions of high temperatures within a boiler furnace, which will be essential in assessing and controlling the mechanisms that form and remove pollutants at the source, such as NOx. The basic approach in developing the proposed sensor system is threefold: (1) development of a high temperature distributed fiber optical sensor capable of measuring temperatures greater than 2000 °C with spatial resolution of less than 1 cm; (2) development of distributed parameter system (DPS) models to map the three-dimensional (3D) temperature distribution for the furnace; and (3) development of an intelligent monitoring system for real-time monitoring of the 3D boiler temperature distribution. Under Task 1, we set up a dedicated high power, ultrafast laser system for fabricating in-fiber gratings in harsh environment optical fibers, successfully fabricated gratings in single crystal sapphire fibers by the high power laser system, and developed highly sensitive long period gratings (LPG) by electric arc. Under Task 2, relevant mathematical modeling studies of NOx formation in practical combustors were completed. Studies show that in boiler systems with no swirl, the distributed temperature sensor may provide information sufficient to predict trends of NOx at the boiler exit. Under Task 3, we investigate a mathematical approach to extrapolation of the temperature distribution within a power plant boiler facility, using a combination of a modified neural network architecture and semigroup theory. The 3D temperature data are furnished by the Penn State Energy Institute using FLUENT. Given a set of empirical data with no analytic expression, we first develop an analytic description and then extend that model along a single axis. Extrapolation…

The objective of the proposed work is to develop an intelligent distributed fiber optical sensor system for real-time monitoring of high temperature in a boiler furnace in power plants. Of particular interest is the estimation of spatial and temporal distributions of high temperatures within a boiler furnace, which will be essential in assessing and controlling the mechanisms that form and remove pollutants at the source, such as NOx. The basic approach in developing the proposed sensor system is threefold: (1) development of a high temperature distributed fiber optical sensor capable of measuring temperatures greater than 2000 °C with spatial resolution of less than 1 cm; (2) development of distributed parameter system (DPS) models to map the three-dimensional (3D) temperature distribution for the furnace; and (3) development of an intelligent monitoring system for real-time monitoring of the 3D boiler temperature distribution. Under Task 1, the performance of in-fiber gratings fabricated in single crystal sapphire fibers was improved, the grating performance of single crystal sapphire fibers produced with new fabrication methods was tested, and the fabricated gratings were applied to high temperature sensing. Under Task 2, models obtained from 3-D modeling of the Demonstration Boiler were used to study relationships between temperature and NOx, as the multi-dimensionality of such systems is most comparable with real-life boiler systems. Studies show that in boiler systems with no swirl, the distributed temperature sensor may provide information sufficient to predict trends of NOx at the boiler exit. Under Task 3, we investigate a mathematical approach to extrapolation of the temperature distribution within a power plant boiler facility, using a combination of a modified neural network architecture and semigroup theory. The 3D temperature data are furnished by the Penn State Energy Institute using FLUENT. Given a set of empirical data with no analytic…

The last century saw the application of Boolean algebra toward the construction of computing machines, which work by applying logical transformations to information contained in their memory. The development of information theory and the generalization of Boolean algebra to Bayesian inference have enabled these computing machines, in the last quarter of the twentieth century, to be endowed with the ability to learn by making inferences from data. This revolution is just beginning as new computational techniques continue to make difficult problems more accessible. However, modern intelligent machines work by inferring knowledge using only their pre-programmed prior knowledge and the data provided. They lack the ability to ask questions, or request data that would aid their inferences. Recent advances in understanding the foundations of probability theory have revealed implications for areas other than logic. Of relevance to intelligent machines, we identified the algebra of questions as the free distributive algebra, which now allows us to work with questions in a way analogous to that which Boolean algebra enables us to work with logical statements. In this paper we describe this logic of inference and inquiry using the mathematics of partially ordered sets and the scaffolding of lattice theory, discuss the far-reaching implications of the methodology, and demonstrate its application with current examples in machine learning. Automation of both inference and inquiry promises to allow robots to perform science in the far reaches of our solar system and in other star systems by enabling them to not only make inferences from data, but also decide which question to ask, experiment to perform, or measurement to take given what they have learned and what they are designed to understand.
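
One concrete way to automate the "which question to ask" step is to score candidate queries by their expected reduction in uncertainty. The toy sketch below does this for a discrete hypothesis space using Shannon entropy; it illustrates the inquiry idea only and is not the lattice-theoretic formalism developed in the paper, and all probabilities are invented.

    import math

    def entropy(p):
        return -sum(x * math.log2(x) for x in p if x > 0)

    # Hypothetical posterior over three hypotheses and two candidate yes/no questions,
    # each described by P(yes | hypothesis).
    posterior = [0.5, 0.3, 0.2]
    questions = {
        "Q1": [0.9, 0.1, 0.1],   # discriminating question
        "Q2": [0.5, 0.5, 0.5],   # uninformative question
    }

    def expected_information_gain(p_yes_given_h):
        p_yes = sum(p * q for p, q in zip(posterior, p_yes_given_h))
        post_yes = [p * q / p_yes for p, q in zip(posterior, p_yes_given_h)]
        post_no = [p * (1 - q) / (1 - p_yes) for p, q in zip(posterior, p_yes_given_h)]
        expected = p_yes * entropy(post_yes) + (1 - p_yes) * entropy(post_no)
        return entropy(posterior) - expected

    best = max(questions, key=lambda q: expected_information_gain(questions[q]))
    print("most informative question:", best)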

NASA also seeks to advance American education by employing the technology utilization process to develop a computerized, artificial intelligence-based Intelligent Tutoring System (ITS) to help high school and college physics students. The tutoring system is designed for use with the lecture and laboratory portions of a typical physics instructional program. Its importance lies in its ability to observe continually as a student develops problem solutions and to intervene when appropriate with assistance specifically directed at the student's difficulty and tailored to his skill level and learning style. ITS originated as a project of the Johnson Space Center (JSC). It is being developed by JSC's Software Technology Branch in cooperation with Dr. R. Bowen Loftin at the University of Houston-Downtown. The program is jointly sponsored by NASA and ACOT (Apple Classrooms of Tomorrow). Other organizations providing support include the Texas Higher Education Coordinating Board, the National Research Council, Pennzoil Products Company, and the George R. Brown Foundation. The Physics I class of Clear Creek High School, League City, Texas is providing the classroom environment for test and evaluation of the system. The ITS is a spinoff of a system developed earlier to integrate artificial intelligence into training/tutoring systems for NASA astronauts, flight controllers, and engineers.

Dental ethics is often taught, viewed, and conducted as an intellectual enterprise, uninformed by noncognitive factors. Emotional intelligence (EQ) is defined and distinguished from the cognitive intelligence measured by the Intelligence Quotient (IQ). This essay recommends greater inclusion of emotional, noncognitive input in the ethical decision process in dental education and dental practice.

Income (activity) and expenditure (costs) form the basis of a modern hospital's 'business intelligence'. However, clinical engagement in business intelligence is patchy. This article describes the principles of business intelligence and outlines some recent developments using web-based applications.

Background Characterizing neuropsychological (NP) functioning of individuals at clinical high risk (CHR) for psychosis may be useful for prediction of psychosis and understanding functional outcome. The degree to which NP impairments are associated with general cognitive ability and/or later emergence of full psychosis in CHR samples requires study with well-matched controls. Methods We assessed NP functioning across eight cognitive domains in a sample of 73 CHR youth, 13 of whom developed psychotic-level symptoms after baseline assessment, and 34 healthy comparison (HC) subjects. Groups were matched on age, sex, ethnicity, handedness, subject and parent grade attainment, and median family income, and were comparable on WRAT-3 Reading, an estimate of premorbid IQ. Profile analysis was used to examine group differences and the role of IQ in profile shape. Results The CHR sample demonstrated a significant difference in overall magnitude of NP impairment but only a small and nearly significant difference in profile shape, primarily due to a large impairment in olfactory identification. Individuals who subsequently developed psychotic-level symptoms demonstrated large impairments in verbal IQ, verbal memory and olfactory identification comparable in magnitude to first episode samples. Conclusions CHR status may be associated with moderate generalized cognitive impairments marked by some degree of selective impairment in olfaction and verbal memory. Impairments were greatest in those who later developed psychotic symptoms. Future study of olfaction in CHR samples may enhance early detection and specification of neurodevelopmental mechanisms of risk. PMID:20692125

Background Lead is a heavy metal and an important environmental toxicant and nerve poison that can disrupt many functions of the nervous system. Lead poisoning is a medical condition caused by increased levels of lead in the body. Lead interferes with a variety of body processes and is toxic to many organs and tissues, including the central nervous system. It interferes with the development of the nervous system, and is therefore particularly toxic to children, causing potentially permanent neural and cognitive impairments. In this study, we investigated the relationship between lead poisoning and the intellectual and neurobehavioral capabilities of children. Methods The background characteristics of the research subjects were collected by questionnaire survey. Blood lead levels were detected by differential potentiometric stripping analysis (DPSA). Intelligence was assessed using the Gesell Developmental Scale. The Achenbach Child Behavior Checklist (CBCL) was used to evaluate each child’s behavior. Results Blood lead levels were significantly negatively correlated with the developmental quotients of adaptive behavior, gross motor performance, fine motor performance, language development, and individual social behavior (P

Quality assurance in radiotherapy (RT) has been an integral aspect of cooperative group clinical trials since 1970. In early clinical trials, data acquisition was nonuniform and inconsistent and computational models for radiation dose calculation varied significantly. Process improvements developed for data acquisition, credentialing, and data management have provided the necessary infrastructure for uniform data. With continued improvement in the technology and delivery of RT, evaluation processes for target definition, RT planning, and execution undergo constant review. As we move to multimodality image-based definitions of target volumes for protocols, future clinical trials will require near real-time image analysis and feedback to field investigators. The ability of quality assurance centers to meet these real-time challenges with robust electronic interaction platforms for imaging acquisition, review, archiving, and quantitative review of volumetric RT plans will be the primary challenge for future successful clinical trials.

Computerized Clinical Decision Support (CDS) aims to aid decision making of health care providers and the public by providing easily accessible health-related information at the point and time it is needed. Natural Language Processing (NLP) is instrumental in using free-text information to drive CDS, representing clinical knowledge and CDS interventions in standardized formats, and leveraging clinical narrative. The early innovative NLP research of clinical narrative was followed by a period of stable research conducted at the major clinical centers and a shift of mainstream interest to biomedical NLP. This review primarily focuses on the recently renewed interest in development of fundamental NLP methods and advances in the NLP systems for CDS. The current solutions to challenges posed by distinct sublanguages, intended user groups, and support goals are discussed. PMID:19683066

We propose a methodological framework for evaluating clinical cognitive activities in complex real-world environments that provides a guiding framework for characterizing the patterns of activities. This approach, which we refer to as a process-based approach, is particularly relevant to cognitive informatics (CI) research-an interdisciplinary domain utilizing cognitive approaches in the study of computing systems and applications-as it provides new ways for understanding human information processing, interactions, and behaviors. Using this approach involves the identification of a process of interest (e.g., a clinical workflow), and the contributing sequences of activities in that process (e.g., medication ordering). A variety of analytical approaches can then be used to characterize the inherent dependencies and relations within the contributing activities within the considered process. Using examples drawn from our own research and the extant research literature, we describe the theoretical foundations of the process-based approach, relevant practical and pragmatic considerations for using such an approach, and a generic framework for applying this approach for evaluation studies in clinical settings. We also discuss the potential for this approach in future evaluations of interactive clinical systems, given the need for new approaches for evaluation, and significant opportunities for automated, unobtrusive data collection.

In this article, we describe the depth of knowledge and skill nurses used in making decisions regarding the safe processes and practices of medication administration. Using grounded theory, we identified the essence of medication safety by nurses as the theme of clinical reasoning. Nurses used two medication safety processes within the clinical reasoning theme-maintaining medication safety and managing the environment-together with six categories of patient-focused medication safety practices in the first process and four categories of environmental-focused safety practices within the second process. These processes and practices present an emerging model of safe medication administration developed from the narratives of 50 medical-surgical nurses. This model provides researchers with the basis for the development of systemic policies for safer medication administration for patients. Health care professional educators might also find the results useful in developing curricula focused on patient safety as the foundation of quality care.

Multichannel spectroscopy with millihertz resolution constitutes an attractive strategy for a microwave search for extraterrestrial intelligence (SETI), assuming the transmission of a narrow-band radiofrequency beacon. Such resolution matches the properties of the interstellar medium, and the necessary receiver Doppler corrections provide a high degree of interference rejection. We have constructed a frequency-agile swept receiver with an 8,388,608-channel spectrum analyzer, on-line signal recognition, and multithreshold archiving. A search of 250 Sun-like stars at 1.4 and 2.8 GHz has been carried out with the Arecibo 305-m antenna, and a meridian transit search of the northern sky is in progress at the Harvard-Smithsonian 26-m antenna. Successive spectra of 400 kHz at 0.05 Hz resolution are searched for features characteristic of an intentional narrowband beacon transmission. These spectra are centered on guessable ("magic") frequencies (such as the 21-cm hydrogen hyperfine line), referenced successively to the local standard of rest, the galactic barycenter, and the cosmic blackbody rest frame.
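
The receiver Doppler correction mentioned above amounts to shifting a guessed "magic" rest frequency by v/c for the chosen reference frame and then locating it within the millihertz-resolution channels. A schematic first-order calculation is sketched below; the assumed line-of-sight velocity is illustrative only.

    C = 299_792_458.0          # speed of light, m/s
    F_HI = 1_420_405_751.77    # 21-cm hydrogen hyperfine line, Hz
    CHANNEL_WIDTH = 0.05       # Hz, matching the 0.05 Hz resolution spectra

    # Assumed line-of-sight velocity of the chosen rest frame toward the source (m/s).
    v_los = 18_000.0

    # First-order Doppler-corrected observing frequency and its offset, in channels,
    # relative to the uncorrected rest frequency.
    f_observed = F_HI * (1.0 - v_los / C)
    offset_channels = (F_HI - f_observed) / CHANNEL_WIDTH
    print(f"corrected frequency: {f_observed:.2f} Hz, offset: {offset_channels:.0f} channels")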

Clinicians can use the base rates of low scores in healthy people to reduce the likelihood of misdiagnosing cognitive impairment. In the present study, base rates were developed for the Wechsler Adult Intelligence Scale-Fourth Edition (WAIS-IV) and Wechsler Memory Scale-Fourth Edition (WMS-IV) using 900 healthy adults and validated on 28 patients with moderate or severe traumatic brain injuries (TBIs). Results indicated that healthy people obtain some low scores on the WAIS-IV/WMS-IV, with prevalence rates increasing with fewer years of education and lower predicted intelligence. When applying the base rates information to the clinical sample, the TBI patients were 13 times more likely to be identified as having a low cognitive profile compared with the controls. Using the base rates information is a psychometrically advanced method for establishing criteria to determine low cognitive abilities on the WAIS-IV/WMS-IV.

The Partners Healthcare Research Patient Data Registry (RPDR) is a centralized data repository that gathers clinical data from various hospital systems. The RPDR allows clinical investigators to obtain aggregate numbers of patients with user-defined characteristics such as diagnoses, procedures, medications, and laboratory values. They may then obtain patient identifiers and electronic medical records with prior IRB approval. Moreover, accurately identifying and efficiently populating worthwhile, quantifiable facts from physicians' reports into the RPDR is a significant process. As part of our ongoing e-Fact project, this work describes a new business process management technology that helps coordinate and simplify this procedure.

The toll of medical errors from communication failures is enormous during the perioperative period of cardiac surgical patients. As caregivers change shifts or surgical patients change location within the hospital, key information is lost or misconstrued. After a baseline cognitive study of information need and caregiver workflow, we implemented an advanced clinical decision support tool of intelligent agents, medical logic modules, and text generators called the "Inference Engine" to summarize an individual patient's raw medical data elements into procedural milestones, illness severity, and care therapies. The system generates two displays: 1) the continuum of care, multimedia abstract generation of intensive care data (MAGIC)-an expert system that would automatically generate a physician briefing of a cardiac patient's operative course in a multimodal format; and 2) the isolated point in time, "Inference Engine"-a system that provides a real-time, high-level, summarized depiction of a patient's clinical status. In our studies, system accuracy and efficacy were judged against clinician performance in the workplace. To test the automated physician briefing, MAGIC, the patient's intraoperative course was reviewed in the intensive care unit before patient arrival. It was then judged against the actual physician briefing and that given in a cohort of patients where the system was not used. To test the real-time representation of the patient's clinical status, system inferences were judged against clinician decisions. Changes in workflow and situational awareness were assessed by questionnaires and process evaluation. MAGIC provides 200% more information, twice the accuracy, and enhances situational awareness. This study demonstrates that the automation of clinical processes through AI methodologies yields positive results.

The overall objective of this project is to develop sensor-integrated "intelligent" diamond wheels for grinding of ceramics. Such wheels will be "smart" enough to monitor and supervise both the wheel preparation and grinding processes without the need to instrument the machine tool. Intelligent wheels will utilize reusable cores integrated with two types of sensors: acoustic emission (AE) and dynamic force transducers. Signals from the sensors will be transmitted from a rotating wheel to a receiver by telemetry. Wheels will be "trained" to recognize distinct characteristics associated with truing, dressing and grinding.

This is the third semi-annual report for the project. The overall objective of this project is to develop sensor-integrated intelligent diamond wheels for grinding of ceramics. Such wheels will be smart enough to monitor and supervise both the wheel preparation and grinding processes without the need to instrument the machine tool. Intelligent wheels will utilize reusable cores integrated with sensors to measure acoustic emission (AE) and grinding force. Signals from the sensors will be transmitted from a rotating wheel to a receiver by telemetry. Wheels will be trained to recognize distinct characteristics associated with truing, dressing and grinding. The technical progress is summarized in this report.

Breast cancer is the most common type of cancer among women. The key to treating breast cancer is early detection, because according to many pathological studies more than 75%–80% of all abnormalities are still benign at primary stages; so in recent years, many studies and extensive research have been devoted to early detection of breast cancer with higher precision and accuracy. Infra-red breast thermography is an imaging technique based on recording temperature distribution patterns of breast tissue. Compared with mammography, thermography is a more suitable technique because it is noninvasive, non-contact, passive, and free of ionizing radiation. In this paper, a fully automatic, high accuracy technique for classification of suspicious areas in thermogram images, with the aim of assisting physicians in early detection of breast cancer, is presented. The proposed algorithm consists of four main steps: pre-processing & segmentation, feature extraction, feature selection, and classification. In the first step, using a fully automatic operation, the region of interest (ROI) is determined and the image quality is improved. Using thresholding and edge detection techniques, the right and left breasts are separated from each other. Then the relative suspected areas are segmented and the image matrix is normalized to account for the uniqueness of each person's body temperature. At the feature extraction stage, 23 features, including statistical, morphological, frequency domain, histogram, and Gray Level Co-occurrence Matrix (GLCM) based features, are extracted from the segmented right and left breasts obtained in step 1. To achieve the best features, feature selection methods such as minimum Redundancy Maximum Relevance (mRMR), Sequential Forward Selection (SFS), Sequential Backward Selection (SBS), Sequential Floating Forward Selection (SFFS), Sequential Floating Backward Selection (SFBS), and Genetic Algorithm (GA) are used in step 3. Finally, to classify and TH labeling procedures…
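
To give a flavor of the feature-extraction step, the sketch below computes a gray-level co-occurrence matrix (GLCM) and two of its textural descriptors for a small hypothetical thermogram patch; the full pipeline in the paper uses 23 features plus feature selection and a trained classifier.

    import numpy as np

    def glcm(patch, levels=8, dx=1, dy=0):
        """Normalized co-occurrence matrix for one pixel offset."""
        m = np.zeros((levels, levels))
        h, w = patch.shape
        for i in range(h - dy):
            for j in range(w - dx):
                m[patch[i, j], patch[i + dy, j + dx]] += 1
        return m / m.sum()

    # Hypothetical 4x4 thermogram patch quantized to 8 gray levels.
    patch = np.array([[0, 1, 1, 2],
                      [1, 2, 3, 3],
                      [2, 3, 4, 5],
                      [3, 4, 5, 7]])

    p = glcm(patch)
    i, j = np.indices(p.shape)
    contrast = np.sum(p * (i - j) ** 2)
    homogeneity = np.sum(p / (1.0 + np.abs(i - j)))
    print(f"contrast: {contrast:.3f}, homogeneity: {homogeneity:.3f}")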

There is a growing need to semantically process and integrate clinical data from different sources for clinical research. This paper presents an approach to integrate EHRs from heterogeneous resources and generate integrated data in different data formats or semantics to support various clinical research applications. The proposed approach builds semantic data virtualization layers on top of data sources, which generate data in the requested semantics or formats on demand. This approach avoids upfront dumping to and synchronizing of the data with various representations. Data from different EHR systems are first mapped to RDF data with source semantics, and then converted to representations with harmonized domain semantics where domain ontologies and terminologies are used to improve reusability. It is also possible to further convert data to application semantics and store the converted results in clinical research databases, e.g. i2b2, OMOP, to support different clinical research settings. Semantic conversions between different representations are explicitly expressed using N3 rules and executed by an N3 Reasoner (EYE), which can also generate proofs of the conversion processes. The solution presented in this paper has been applied to real-world applications that process large scale EHR data.
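
As a toy illustration of the first mapping stage (EHR record to RDF with source semantics), the sketch below uses the rdflib library; the namespaces, predicates, and record fields are invented for the example and do not reflect the actual mappings or the N3 rules executed by EYE.

    from rdflib import Graph, Literal, Namespace, RDF, URIRef

    SRC = Namespace("http://example.org/source-ehr#")   # hypothetical source vocabulary

    # Hypothetical row exported from one EHR system.
    record = {"patient_id": "p-001", "sys_bp": 142, "unit": "mmHg"}

    g = Graph()
    obs = URIRef(f"http://example.org/obs/{record['patient_id']}-bp")
    g.add((obs, RDF.type, SRC.BloodPressureObservation))
    g.add((obs, SRC.patient, URIRef(f"http://example.org/patient/{record['patient_id']}")))
    g.add((obs, SRC.systolicValue, Literal(record["sys_bp"])))
    g.add((obs, SRC.unit, Literal(record["unit"])))

    # Downstream N3 rules (run by a reasoner such as EYE) would convert these
    # source-semantics triples into harmonized domain semantics.
    print(g.serialize(format="turtle"))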

The migration of imaging reports to electronic medical record systems holds great potential in terms of advancing radiology research and practice by leveraging the large volume of data continuously being updated, integrated, and shared. However, there are significant challenges as well, largely due to the heterogeneity of how these data are formatted. Indeed, although there is movement toward structured reporting in radiology (ie, hierarchically itemized reporting with use of standardized terminology), the majority of radiology reports remain unstructured and use free-form language. To effectively "mine" these large datasets for hypothesis testing, a robust strategy for extracting the necessary information is needed. Manual extraction of information is a time-consuming and often unmanageable task. "Intelligent" search engines that instead rely on natural language processing (NLP), a computer-based approach to analyzing free-form text or speech, can be used to automate this data mining task. The overall goal of NLP is to translate natural human language into a structured format (ie, a fixed collection of elements), each with a standardized set of choices for its value, that is easily manipulated by computer programs to (among other things) order into subcategories or query for the presence or absence of a finding. The authors review the fundamentals of NLP and describe various techniques that constitute NLP in radiology, along with some key applications.
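
A deliberately simple sketch of the translation step described above, turning one free-text sentence into a fixed collection of elements; real radiology NLP systems use far richer linguistic processing, and the patterns and value sets here are assumptions for illustration only.

    import re

    report = "There is a 1.2 cm nodule in the right upper lobe. No pleural effusion."

    structured = {
        "nodule_present": bool(re.search(r"\bnodule\b", report, re.I)),
        "nodule_size_cm": None,
        "laterality": None,
        "pleural_effusion": not bool(re.search(r"\bno pleural effusion\b", report, re.I)),
    }

    # Pull out the size and side when stated in the narrative.
    size = re.search(r"(\d+(?:\.\d+)?)\s*cm\s+nodule", report, re.I)
    if size:
        structured["nodule_size_cm"] = float(size.group(1))

    side = re.search(r"\b(right|left)\b", report, re.I)
    if side:
        structured["laterality"] = side.group(1).lower()

    print(structured)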

A common complaint of older listeners is that they can hear speech, yet cannot understand it, especially when listening to speech in a background noise. When target and competing speech signals are concurrently presented, a difference in the fundamental frequency (ΔF0) between competing speech signals, which determines the pitch of voice, can be an important and commonly occurring cue to facilitate the separation of the target message from the interfering message, consequently improving intelligibility of the target message. To address the question of whether the older listeners have reduced ability to use ΔF0 and how the age-related deficits in the processing of ΔF0 are theoretically explained, this paper is divided into three parts. The first part of this article summarizes how the speech-communication difficulties that older listeners have are theoretically explained. In the second part, literatures on the perceptual benefits from ΔF0 and the age-related deficits on the use of ΔF0 are reviewed. As a final part, three theoretical models explaining the general processing of ΔF0 are compared to discuss which better explains the age-related deficits in the processing of ΔF0. PMID:24653895

Clinical reasoning is a highly complex system with multiple interdependent mental activities. Gaining a better understanding of those cognitive processes has two practical implications: for physicians, being able to analyse their own reasoning method may prove helpful in a diagnostic dead end; for medical teachers, identifying the problem-solving strategies used by medical students may foster appropriate individual feedback aimed at improving their clinical reasoning skills. On the basis of a detailed literature review, the main diagnostic strategies and their related patterns of mental processes are described and illustrated with a concrete example, going from the patient's complaint to the chosen solution. Inductive, abductive, and deductive diagnostic approaches are detailed. Different strategies for collecting data (exhaustive or oriented) and for problem-building are described. The place of problem-solving strategies such as pattern recognition, the scheme-inductive process, the use of clinical scripts, syndrome grouping, and mental hypothesis testing is considered. This work aims at breaking down the mental activities at play within clinical reasoning, recalling that expert reasoning is characterised by the ability to use and structure the whole of these activities in a coherent system, using combined strategies in order to guarantee better diagnostic accuracy.

There has been vast and growing amount of healthcare data especially with the rapid adoption of electronic health records (EHRs) as a result of the HITECH act of 2009. It is estimated that around 80% of the clinical information resides in the unstructured narrative of an EHR. Recently, natural language processing (NLP) techniques have offered…

Panic attacks are psychopathological phenomena with a strong emotional activation that often induces subsequent anticipatory anxiety and phobic avoidance. Impairment in emotional processing in patients with Panic Disorder (PD) has been hypothesized. Emotional Intelligence (EI) involves the individual abilities to perceive, understand and manage emotions in order to cope with changes in internal and external environment. We examined EI in 42 patients with PD with Agoraphobia compared to 49 healthy controls and investigated if clinical severity of Agoraphobia is related to EI performance. We assessed EI by Mayer-Salovey-Caruso Emotional Intelligence Test and Agoraphobia by Mobility Inventory for Agoraphobia. Patients with PD and Agoraphobia showed lower Strategic EI ability than healthy controls, in both Understanding and Managing emotion abilities, and a general propensity to attribute negative emotional valence to different stimuli. These preliminary results suggest that impaired mechanisms of understanding and integrating emotions may be involved in the phenomenology of PD. These features might be the target of psychological interventions in PD. On the contrary, Emotional Intelligence did not appear to affect the clinical severity of Agoraphobia.

The term "lean production," also known as "Lean," describes a process of operations management pioneered at the Toyota Motor Company that contributed significantly to the success of the company. Although developed by Toyota, the Lean process has been implemented at many other organizations, including those in health care, and should be considered by dental schools in evaluating their clinical operations. Lean combines engineering principles with operations management and improvement tools to optimize business and operating processes. One of the core concepts is relentless elimination of waste (non-value-added components of a process). Another key concept is utilization of individuals closest to the actual work to analyze and improve the process. When the medical center of the University of Kentucky adopted the Lean process for improving clinical operations, members of the College of Dentistry trained in the process applied the techniques to improve inefficient operations at the Walk-In Dental Clinic. The purpose of this project was to reduce patients' average in-the-door-to-out-the-door time from over four hours to three hours within 90 days. Achievement of this goal was realized by streamlining patient flow and strategically relocating key phases of the process. This initiative resulted in patient benefits such as shortening average in-the-door-to-out-the-door time by over an hour, improving satisfaction by 21%, and reducing negative comments by 24%, as well as providing opportunity to implement the electronic health record, improving teamwork, and enhancing educational experiences for students. These benefits were achieved while maintaining high-quality patient care with zero adverse outcomes during and two years following the process improvement project.

The Clinical Pharmacogenetics Implementation Consortium (CPIC) publishes genotype-based drug guidelines to help clinicians understand how available genetic test results could be used to optimize drug therapy. CPIC has focused initially on well-known examples of pharmacogenomic associations that have been implemented in selected clinical settings, publishing nine to date. Each CPIC guideline adheres to a standardized format and includes a standard system for grading levels of evidence linking genotypes to phenotypes and assigning a level of strength to each prescribing recommendation. CPIC guidelines contain the necessary information to help clinicians translate patient-specific diplotypes for each gene into clinical phenotypes or drug dosing groups. This paper reviews the development process of the CPIC guidelines and compares this process to the Institute of Medicine's Standards for Developing Trustworthy Clinical Practice Guidelines.

This issue of "Information Technology Quarterly" is devoted to the theme of "Artificial Intelligence." It contains two major articles: (1) Artificial Intelligence and Law" (D. Peter O'Neill and George D. Wood); (2) "Artificial Intelligence: A Long and Winding Road" (John J. Simon, Jr.). In addition, it contains two sidebars: (1) "Calculating and…

Purpose: Seeks to explore the notion of organisational intelligence as a simple extension of the notion of the idea of collective intelligence. Design/methodology/approach: Discusses organisational intelligence using previous research, which includes the Purpose, Properties and Practice model of Dealtry, and the Viable Systems model. Findings: The…

Overview of the artificial intelligence (AI) field provides a definition; discusses past research and areas of future research; describes the design, functions, and capabilities of expert systems and the "Turing Test" for machine intelligence; and lists additional sources for information on artificial intelligence. Languages of AI are…

Environmental Impact Assessment (EIA) is a decision-making process that often involves public participation in the scoping and reviewing stage. Although the importance of engaging the public in the EIA process has long been recognized, it is often considered ineffective due to factors such as time, budget, resource, technical and procedural…

We examined Wechsler Adult Intelligence Scale-Fourth Edition (WAIS-IV) General Ability Index (GAI) and Full Scale Intelligence Quotient (FSIQ) discrepancies in 100 epilepsy patients; 44% had a significant GAI > FSIQ discrepancy. GAI-FSIQ discrepancies were correlated with the number of antiepileptic drugs taken and duration of epilepsy. Individual antiepileptic drugs differentially interfere with the expression of underlying intellectual ability in this group. FSIQ may significantly underestimate levels of general intellectual ability in people with epilepsy. Inaccurate representations of FSIQ due to selective impairments in working memory and reduced processing speed obscure the contextual interpretation of performance on other neuropsychological tests, and subtle localizing and lateralizing signs may be missed as a result.

We analyzed the graphical representations that are used by various guideline-modeling methods to express process information embodied in clinical guidelines and protocols. From this analysis, we distilled four modeling formalisms and the processes they typically model: (1) flowcharts for capturing problem-solving processes, (2) disease-state maps that link decision points in managing patient problems over time, (3) plans that specify sequences of activities that contribute toward a goal, (4) workflow specifications that model care processes in an organization. We characterized the four approaches and showed that each captures some aspect of what a guideline may specify. We believe that a general guideline-modeling system must provide explicit representation for each type of process.
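
As a minimal illustration of the first formalism (flowcharts for problem-solving processes), a guideline fragment can be encoded as decision and action nodes; the node names, test, and branching criteria below are invented for the example and are not drawn from any particular guideline.

    # Hypothetical flowchart fragment for a guideline: each node is either an
    # action or a decision whose branches name the next node.
    flowchart = {
        "start": {"type": "action", "do": "measure blood pressure", "next": "check_bp"},
        "check_bp": {"type": "decision", "test": lambda ctx: ctx["sbp"] >= 140,
                     "yes": "prescribe", "no": "reassure"},
        "prescribe": {"type": "action", "do": "start antihypertensive", "next": None},
        "reassure": {"type": "action", "do": "routine follow-up", "next": None},
    }

    def run(flowchart, ctx, node="start"):
        while node is not None:
            step = flowchart[node]
            if step["type"] == "action":
                print("do:", step["do"])
                node = step["next"]
            else:
                node = step["yes"] if step["test"](ctx) else step["no"]

    run(flowchart, {"sbp": 152})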

The creation of tools supporting the automation of the standardization and continuous control of healthcare processes can become a significant aid for clinical experts and healthcare systems willing to reduce variability in clinical practice. Reducing the complexity of designing and deploying standard Clinical Pathways can enhance the possibilities for effective usage of computer-assisted guidance systems for professionals and assure the quality of the provided care. Several technologies have been used in the past to try to support these activities, but they have not been able to generate the disruptive change required to foster the general adoption of standardization in this domain, due to the high volume of work, resources, and knowledge required to create practical protocols that can be used in practice. This chapter proposes the use of the PALIA algorithm, based on activity-based process mining techniques, as a new technology to infer the actual processes from real execution logs, to be used in the design and quality control of healthcare processes.

Blast furnace operators expect to get sinter with homogenous and regular properties (chemical and mechanical), necessary to ensure regular blast furnace operation. Blends for sintering also include several iron by-products and other wastes that are obtained in different processes inside the steelworks. Due to their source, the availability of such materials is not always consistent, but their total production should be consumed in the sintering process, to both save money and recycle wastes. The main scope of this paper is to obtain the least expensive iron ore blend for the sintering process, which will provide suitable chemical and mechanical features for the homogeneous and regular operation of the blast furnace. The systematic use of statistical tools was employed to analyze historical data, including linear and partial correlations applied to the data and fuzzy clustering based on the Sugeno Fuzzy Inference System to establish relationships among the available variables.
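
The least-cost blend objective described above can also be written as a small linear program; the ores, prices, and chemical constraints below are hypothetical, and the paper itself relies on correlation analysis and Sugeno-type fuzzy clustering rather than on this formulation.

    from scipy.optimize import linprog

    # Hypothetical ores: cost per tonne and iron / silica content (mass fraction).
    cost = [90.0, 70.0, 55.0]
    fe = [0.65, 0.60, 0.55]
    sio2 = [0.03, 0.05, 0.08]

    # Minimize blend cost subject to: fractions sum to 1, Fe >= 0.60, SiO2 <= 0.05.
    result = linprog(
        c=cost,
        A_ub=[[-f for f in fe], sio2],
        b_ub=[-0.60, 0.05],
        A_eq=[[1.0, 1.0, 1.0]],
        b_eq=[1.0],
        bounds=[(0.0, 1.0)] * 3,
    )
    print(result.x, result.fun)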

The Air Force has numerous on-going manufacturing and integration development programs (machine tools, composites, metals, assembly, and electronics) which are instrumental in improving productivity in the aerospace industry, but more importantly, have identified strategies and technologies required for the integration of advanced processing equipment. An introduction to four current Air Force Manufacturing Technology Directorate (ManTech) manufacturing areas is provided. Research is being carried out in the following areas: (1) machining initiatives for aerospace subcontractors which provide for advanced technology and innovative manufacturing strategies to increase the capabilities of small shops; (2) innovative approaches to advance machine tool products and manufacturing processes; (3) innovative approaches to advance sensors for process control in machine tools; and (4) efforts currently underway to develop, with the support of industry, the Next Generation Workstation/Machine Controller (Low-End Controller Task).

The capabilities of flight control systems can be enhanced by designing them to emulate functions of natural intelligence. Intelligent control functions fall in three categories. Declarative actions involve decision-making, providing models for system monitoring, goal planning, and system/scenario identification. Procedural actions concern skilled behavior and have parallels in guidance, navigation, and adaptation. Reflexive actions are spontaneous, inner-loop responses for control and estimation. Intelligent flight control systems learn knowledge of the aircraft and its mission and adapt to changes in the flight environment. Cognitive models form an efficient basis for integrating 'outer-loop/inner-loop' control functions and for developing robust parallel-processing algorithms.

This book presents the papers given at a conference on artificial intelligence and robot vision. Topics considered at the conference included pattern recognition, image processing for intelligent robotics, three-dimensional vision (depth and motion), vision modeling and shape estimation, spatial reasoning, the symbolic processing of visual information, robotic sensors and applications, intelligent control architectures for robot systems, robot languages and programming, human-machine interfaces, robotics applications, and architectures of robotics.

Speed of information processing, as measured by inspection time (IT), is a robust predictor of intellectual functioning. However, among individuals with autism and low IQ scores, IT has been reported to be discrepantly fast, and equal to that of high IQ typically developing children (Scheuffgen et al. in "Dev Psychopathol" 12: 83-90, 2000). The…

Modified 9Cr-1Mo ferritic steel is used as a structural material for steam generator components of power plants. Generally, tungsten inert gas (TIG) welding is preferred for welding of these steels, in which the depth of penetration achievable during autogenous welding is limited. Therefore, activated flux TIG (A-TIG) welding, a novel welding technique, has been developed in-house to increase the depth of penetration. In modified 9Cr-1Mo steel joints produced by the A-TIG welding process, weld bead width, depth of penetration, and heat-affected zone (HAZ) width play an important role in determining the mechanical properties as well as the performance of the weld joints during service. To obtain the desired weld bead geometry and HAZ width, it becomes important to set the welding process parameters. In this work, an adaptive neuro-fuzzy inference system (ANFIS) is used to develop independent models correlating welding process parameters such as current, voltage, and torch speed with weld bead shape parameters such as depth of penetration, bead width, and HAZ width. Then a genetic algorithm is employed to determine the optimum A-TIG welding process parameters to obtain the desired weld bead shape parameters and HAZ width.
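
The optimization step can be sketched as a small genetic algorithm searching current, voltage, and torch speed against a surrogate model of the weld bead geometry; the surrogate function, parameter bounds, and target values below are invented stand-ins for the trained ANFIS models and the real process window.

    import random

    # Hypothetical surrogate for the ANFIS models: predicts (penetration, bead width,
    # HAZ width) in mm from (current A, voltage V, torch speed mm/min).
    def surrogate(current, voltage, speed):
        heat = current * voltage / (10.0 * speed)
        return 1.5 * heat, 2.2 * heat + 1.0, 0.8 * heat + 0.5

    target = (4.0, 7.0, 2.5)          # desired bead shape parameters (assumed)
    bounds = [(140, 220), (10, 14), (80, 140)]

    def fitness(ind):
        pred = surrogate(*ind)
        return -sum((p - t) ** 2 for p, t in zip(pred, target))

    def random_individual():
        return [random.uniform(lo, hi) for lo, hi in bounds]

    population = [random_individual() for _ in range(30)]
    for _ in range(50):
        population.sort(key=fitness, reverse=True)
        parents = population[:10]                                # selection
        children = []
        while len(children) < 20:
            a, b = random.sample(parents, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]          # crossover
            k = random.randrange(3)                              # mutation
            lo, hi = bounds[k]
            child[k] = min(hi, max(lo, child[k] + random.gauss(0, (hi - lo) * 0.05)))
            children.append(child)
        population = parents + children

    best = max(population, key=fitness)
    print("current, voltage, speed:", [round(v, 1) for v in best])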

Research suggests that select processing speed measures can also serve as embedded validity indicators (EVIs). The present study examined the diagnostic utility of Wechsler Adult Intelligence Scale-Fourth Edition (WAIS-IV) subtests as EVIs in a mixed clinical sample of 205 patients medically referred for neuropsychological assessment (53.3% female, mean age = 45.1). Classification accuracy was calculated against 3 composite measures of performance validity as criterion variables. A PSI ≤79 produced a good combination of sensitivity (.23-.56) and specificity (.92-.98). A Coding scaled score ≤5 resulted in good specificity (.94-1.00), but low and variable sensitivity (.04-.28). A Symbol Search scaled score ≤6 achieved a good balance between sensitivity (.38-.64) and specificity (.88-.93). A Coding-Symbol Search scaled score difference ≥5 produced adequate specificity (.89-.91) but consistently low sensitivity (.08-.12). A 2-tailed cutoff on the Coding/Symbol Search raw score ratio (≤1.41 or ≥3.57) produced acceptable specificity (.87-.93), but low sensitivity (.15-.24). Failing ≥2 of these EVIs produced variable specificity (.81-.93) and sensitivity (.31-.59). Failing ≥3 of these EVIs stabilized specificity (.89-.94) at a small cost to sensitivity (.23-.53). Results suggest that processing speed based EVIs have the potential to provide a cost-effective and expedient method for evaluating the validity of cognitive data. Given their generally low and variable sensitivity, however, they should not be used in isolation to determine the credibility of a given response set. They also produced unacceptably high rates of false positive errors in patients with moderate-to-severe head injury. Combining evidence from multiple EVIs has the potential to improve overall classification accuracy.

There is an increasing need to provide end-users with seamless and secure access to healthcare information acquired from a diverse range of sources. This might include local and remote hospital sites equipped by different vendors and practicing varied acquisition protocols, and also heterogeneous external sources such as the Internet cloud. In such scenarios, image post-processing tools such as CAD (computer-aided diagnosis), which were hitherto developed using a smaller set of images, may not always work optimally on newer sets of images having entirely different characteristics. In this paper, we propose a framework that assesses the quality of a given input image and automatically applies an appropriate pre-processing method in such a manner that the image characteristics are normalized regardless of its source. We focus mainly on medical images, and the objective of the said preprocessing method is to standardize the performance of various image processing and workflow applications like CAD to perform in a consistent manner. First, our system consists of an assessment step wherein an image is evaluated based on criteria such as noise, image sharpness, etc. Depending on the measured characteristic, we then apply an appropriate normalization technique, thus giving way to our overall pre-processing framework. A systematic evaluation of the proposed scheme is carried out on a large set of CT images acquired from various vendors, including images reconstructed with next generation iterative methods. Results demonstrate that the images are normalized and thus suitable for an existing LungCAD prototype.
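
A skeletal version of the assess-then-normalize idea: estimate sharpness and noise with simple measures, then apply a corresponding correction. The thresholds, filters, and synthetic image below are placeholders for the paper's actual assessment criteria and normalization techniques.

    import numpy as np
    from scipy import ndimage

    def assess(image):
        sharpness = ndimage.laplace(image).var()                     # crude focus measure
        noise = (image - ndimage.gaussian_filter(image, 1.0)).std()  # high-frequency residual
        return sharpness, noise

    def normalize(image, noise_limit=5.0, sharpness_floor=50.0):
        sharpness, noise = assess(image)
        if noise > noise_limit:                                      # assumed threshold
            image = ndimage.gaussian_filter(image, 1.0)              # denoise
        if sharpness < sharpness_floor:                              # assumed threshold
            blurred = ndimage.gaussian_filter(image, 2.0)
            image = image + 0.5 * (image - blurred)                  # unsharp masking
        return image

    # Hypothetical noisy CT-like slice.
    rng = np.random.default_rng(0)
    slice_ = rng.normal(loc=100.0, scale=8.0, size=(64, 64))
    print(assess(slice_), "->", assess(normalize(slice_)))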

Imagine a situation in which you had to design a physical agent that could collect information from its environment, then store and process that information to help it respond appropriately to novel situations. What kinds of information should it attend to? How should the information be represented so as to allow efficient use and re-use? What kinds of constraints and trade-offs would there be? There are no unique answers. In this paper, we discuss some of the ways in which the need to be able to address problems of varying kinds and complexity can be met by different information processing systems. We also discuss different ways in which relevant information can be obtained, and how different kinds of information can be processed and used, by both biological organisms and artificial agents. We analyse several constraints and design features, and show how they relate both to biological organisms, and to lessons that can be learned from building artificial systems. Our standpoint overlaps with Karmiloff-Smith (1992) in that we assume that a collection of mechanisms geared to learning and developing in biological environments are available in forms that constrain, but do not determine, what can or will be learnt by individuals.

An individual's self-reported abilities to attend to, understand, and reinterpret emotional situations or events have been associated with anxiety and depression, but it is unclear how these abilities affect the processing of emotional stimuli, especially in individuals with these symptoms. The present study recorded event-related brain potentials while individuals reporting features of anxiety and depression completed an emotion-word Stroop task. Results indicated that anxious apprehension, anxious arousal, and depression were associated with self-reported emotion abilities, consistent with prior literature. In addition, lower anxious apprehension and greater reported emotional clarity were related to slower processing of negative stimuli indexed by event-related potentials (ERPs). Higher anxious arousal and reported attention to emotion were associated with ERP evidence of early attention to all stimuli regardless of emotional content. Reduced later engagement with stimuli was also associated with anxious arousal and with clarity of emotions. Depression was not differentially associated with any emotion processing stage indexed by ERPs. Research in this area may lead to the development of therapies that focus on minimization of anxiety to foster successful emotion regulation.

The trends and roles of VSAT services in the year 2010 time frame are examined based on an overall network and service model for that period. An estimate of the VSAT traffic is then made and the service and general network requirements are identified. In order to accommodate these traffic needs, four satellite VSAT architectures based on the use of fixed or scanning multibeam antennas in conjunction with IF switching or onboard regeneration and baseband processing are suggested. The performance of each of these architectures is assessed and the key enabling technologies are identified.

Recent research has explored the role of metacognitive beliefs and processes in clinical anxiety in youth. The aim of this study was to examine the relationship between metacognitions and anxiety in 7- to 12-year-old children with and without clinical anxiety disorders. A secondary aim of the study was to investigate the psychometric properties of the recently developed Metacognitions Questionnaire for Children (MCQ-C). The sample consisted of 83 children (60.2% female; Oceanian 71.1%), comprising 49 children with anxiety disorders and 34 nonclinical children. All children completed self-report measures of anxiety, emotional difficulties, and metacognitions. A subsample of 7- to 8-year-old participants was used to explore whether young children could wholly comprehend all items on the MCQ-C. Positive and negative metacognitive beliefs and cognitive monitoring were significantly correlated with anxiety and emotional difficulties. Clinical children endorsed significantly more negative and more positive metacognitive beliefs than nonclinical children. Each subscale of the MCQ-C had poor internal consistency. Support for the criterion and convergent validity of the MCQ-C was found. The results suggest that certain metacognitions play a role in clinical anxiety in children but that psychometrically and developmentally validated measures of these concepts in younger individuals are needed.

Locating assignable causes from the abnormal patterns of a control chart is a widely used technique for manufacturing quality control. If there are uncertainties about the occurrence degree of abnormal patterns, the diagnosis process cannot be carried out. Considering four common abnormal control chart patterns, this paper proposes a point-by-point recognition method based on characteristic numbers to quantify the occurrence degree of abnormal patterns under uncertain conditions, and a fuzzy inference system based on fuzzy logic to calculate the contribution degree of assignable causes given fuzzy abnormal patterns. Application case results show that the proposed approach can give a ranked list of causes under fuzzy control chart abnormal patterns and support elimination of the abnormality. PMID:28058046
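
A toy version of the fuzzy reasoning step: triangular membership functions express the occurrence degree of each abnormal pattern, and a rule base maps those degrees to a ranked list of assignable causes. The patterns, rules, and numbers are illustrative only and are not taken from the paper.

    def triangular(x, a, b, c):
        """Triangular membership function on [a, c] peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    # Occurrence degree (0..1) of four common abnormal patterns, e.g. obtained from
    # a point-by-point characteristic-number analysis of the control chart.
    patterns = {"trend": 0.7, "shift": 0.2, "cycle": 0.5, "stratification": 0.1}

    # Fuzzify each occurrence degree into a membership in "pattern clearly present".
    firing = {p: triangular(deg, 0.2, 0.8, 1.4) for p, deg in patterns.items()}

    # Hypothetical rule base: each assignable cause is supported by some patterns
    # with a rule weight; contribution degree = max of (weight * firing level).
    rules = {
        "tool wear": [("trend", 1.0)],
        "new operator": [("shift", 0.8), ("stratification", 0.4)],
        "voltage fluctuation": [("cycle", 0.9)],
    }
    contribution = {
        cause: max(weight * firing[p] for p, weight in supports)
        for cause, supports in rules.items()
    }

    for cause, degree in sorted(contribution.items(), key=lambda kv: -kv[1]):
        print(f"{cause}: {degree:.2f}")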

OBJECTIVE: Develop a representation of clinical observations and actions and a method of processing free-text patient documents to facilitate applications such as quality assurance. DESIGN: The Linguistic String Project (LSP) system of New York University utilizes syntactic analysis, augmented by a sublanguage grammar and an information structure that are specific to the clinical narrative, to map free-text documents into a database for querying. MEASUREMENTS: Information precision (I-P) and information recall (I-R) were measured for queries for the presence of 13 asthma-health-care quality assurance criteria in a database generated from 59 discharge letters. RESULTS: I-P, using counts of major errors only, was 95.7% for the 28-letter training set and 98.6% for the 31-letter test set. I-R, using counts of major omissions only, was 93.9% for the training set and 92.5% for the test set. PMID:7719796

Prior to this project, training information for the employees of the National Center for Critical Processing and Storage (NCCIPS) was stored in an array of unrelated spreadsheets and SharePoint lists that had to be manually updated. By developing a content management system through a web application platform named SharePoint, this training system is now highly automated and provides a much less intensive method of storing training data and scheduling training courses. This system was developed by using SharePoint Designer and laying out the data structure for the interaction between different lists of data about the employees. The automation of data population inside of the lists was accomplished by implementing SharePoint workflows which essentially lay out the logic for how data is connected and calculated between certain lists. The resulting training system is constructed from a combination of five lists of data with a single list acting as the user-friendly interface. This interface is populated with the courses required for each employee and includes past and future information about course requirements. The employees of NCCIPS now have the ability to view, log, and schedule their training information and courses with much more ease. This system will relieve a significant amount of manual input and serve as a powerful informational resource for the employees of NCCIPS in the future.

reactive gas in a given reaction mixture. We have developed a tube-in-tube reactor device consisting of a pair of concentric capillaries in which pressurized gas permeates through an inner Teflon AF-2400 tube and reacts with dissolved substrate within a liquid phase that flows within a second gas impermeable tube. This Account examines our efforts toward the development of a simple, unified methodology for the processing of gaseous reagents in flow by way of development of a tube-in-tube reactor device and applications to key C-C, C-N, and C-O bond forming and hydrogenation reactions. We further describe the application to multistep reactions using solid-supported reagents and extend the technology to processes utilizing multiple gas reagents. A key feature of our work is the development of computer-aided imaging techniques to allow automated in-line monitoring of gas concentration and stoichiometry in real time. We anticipate that this Account will illustrate the convenience and benefits of membrane tube-in-tube reactor technology to improve and concomitantly broaden the scope of gas/liquid/solid reactions in organic synthesis.

An introductory discussion of the related concepts of intelligence and consciousness suggests criteria to be met in the modeling of intelligence and the development of intelligent materials. Methods for the modeling of actual structure and activity of the animal cortex have been found, based on present knowledge of the ionic and cellular constitution of the nervous system. These have led to the development of a realistic neural network model, which has been used to study the formation of memory and the process of learning. An account is given of experiments with simple materials which exhibit almost all properties of biological synapses and suggest the possibility of a new type of computer architecture to implement an advanced type of artificial intelligence.

Microprocessor support hardware, software, and cross assemblers relating to the Motorola 6800 and 6809 processor systems were developed. Printer controller and intelligent CRT development are discussed. The user's manual; design specifications for the MC6809 version of the intelligent printer controller card; a 132-character by 64-line intelligent CRT display system using a Motorola 6809 MPU; and a one-line assembler and disassembler are provided.

Introduction: Optimal intelligence is a vital necessity in day-to-day life, especially in children, who have to build up their lives in an apt manner. Yashtimadhu (Glycyrrhiza glabra Linn.) is a time-tested classical drug indicated in Ayurveda for the promotion of mental health, which may also help children attain optimal intelligence. Aim: To evaluate the role of Yashtimadhu (Glycyrrhiza glabra Linn.) granules in enhancement of Medha (intelligence quotient [IQ]). Materials and Methods: The study was conducted on healthy school-going children aged 14–16 years. A total of 94 children were registered and divided into two groups. Yashtimadhu granules were administered in Group A and wheat flour granules in Group B; the duration of treatment was 12 weeks, with a follow-up of an additional 12 weeks. Objective parameters included assessment of functional aspects of Buddhi (the psychological faculty for reasoning and logic) along with assessment of IQ, quality-of-life parameters, and general health condition. Results: Yashtimadhu granules showed statistically highly significant results in improving functional aspects of Buddhi, IQ, several aspects of quality-of-life parameters, and health. The number needed to treat (NNT) with Yashtimadhu granules for children achieving an IQ score of 90 and above was 3.38, suggesting that one in every 3.38 patients achieved this target; for children achieving an IQ score of 110 and above, the NNT was 6.66. Conclusion: Yashtimadhu granules were safe throughout the course of the study and showed significant efficacy in improving Medha (IQ). PMID:26730140
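
For reference, under the usual definition (a textbook relation, not additional data from the trial), the number needed to treat is the reciprocal of the absolute risk reduction between the two arms:

    \[
    \mathrm{NNT} = \frac{1}{\mathrm{ARR}} = \frac{1}{p_{\text{treatment}} - p_{\text{control}}},
    \]

so an NNT of 3.38 corresponds to an absolute difference of roughly 1/3.38 ≈ 0.30, i.e. about 30 percentage points more children reaching the IQ-90 threshold in the Yashtimadhu group than in the control group.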

Introduction Clinical governance as an approach to improving the quality and safety of clinical care has been run in all Iranian hospitals since 2009. This study aimed to provide a comprehensive overview of the processes and challenges faced in implementing clinical governance (CG) in acute-care hospitals in Iran. Methods We conducted an in-depth, qualitative, multi-case study using semi-structured interviews with a range of key stakeholders and review of relevant documents. This study was conducted in 2011–2012 in six governmental hospitals affiliated with Tehran University of Medical Sciences. The data were analyzed using framework analysis. Results The interviewees, predominantly senior managers and nurses, expressed generally positive attitudes towards the benefits of CG. Four out of the six hospitals had a formal strategic plan to implement and execute CG. The emergent barriers to the implementation of CG included insufficient resources, the absence of clear supporting structures, a lack of supportive cultures, and inadequate support from senior management. The main facilitating factors were the reverse of the barriers noted above in addition to developing good relationships with key stakeholders, raising the awareness of CG among staff, and well-designed incentives. Conclusions There is a positive sense towards CG, but its successful implementation in Iran will require raising the awareness of CG among staff and key stakeholders and the successful collaboration of internal staff and external agencies. PMID:26952249

The first step in redesigning the health care delivery process for ambulatory care begins with the patient and the business processes that support the patient. Patient-related business processes include patient access, service documentation, billing, follow-up, collection, and payment. Access is the portal to the clinical delivery and care management process. Service documentation, charge capture, and payment and collection are supporting processes to care delivery. Realigned provider networks now demand realigned patient business services to provide their members/customers/patients with improved service delivery at less cost. Purchaser mandates for cost containment, health maintenance, and enhanced quality of care have created an environment in which every aspect of the delivery system, especially ambulatory care, is being judged. Business processes supporting the outpatient are therefore being reexamined for better efficiency and customer satisfaction. Many health care systems have made major investments in their ambulatory care environment, but have pursued traditional supporting business practices--such as multiple access points, lack of integrated patient appointment scheduling and registration, and multiple patient bills. These are areas that are appropriate for redesign efforts--all with the customer's needs and convenience in mind. Similarly, setting unrealistic expectations, underestimating the effort required, and ignoring the human elements of a patient-focused business service redesign effort can sabotage the very sound reasons for executing such an endeavor. Pitfalls can be avoided if a structured methodology, coupled with a change management process, is employed. Deloitte & Touche Consulting Group has been involved in several major efforts, all within ambulatory care settings, to assist with the redesign of business practices so that the patient, rather than the institution providing the care, is considered the driver.

Howard Gardner's theory of Multiple Intelligences has had a huge influence on school education. But its credentials lack justification, as the first section of this paper shows via a detailed philosophical analysis of how the intelligences are identified. If we want to make sense of the theory, we need to turn from a philosophical to a historical…

In 2002 Helena Kraemer and colleagues published an important article on the analysis of clinical trials in mental health, which advocated a planned focus on mechanisms to investigate the processes behind treatment effects. Kraemer et al. considered not only new approaches to mediation analysis, but also a theoretical approach to factors, both pre-treatment and during treatment, that might moderate this mediation. Trials should not just be about whether a treatment 'worked', but how it worked; with the results informing modification of the intervention for the next trial by discarding aspects that were not effective and reinforcing aspects that were - an iterative procedure towards greater effectiveness. Can we enjoy similar ambitions for complex interventions within mental health? It is not so long ago when the received wisdom within the clinical and much of the research community was that it was simply impossible in practice to mount randomised controlled trials relevant to the kind of psychosocial interventions we use in child and adolescent mental health (CAMHS). How different the situation is now, with burgeoning interest in a systematic evidence base for psychological treatment and the possibilities for unexpected advances (as well as unexpected harms). Nevertheless it is probably still fair to say that the systematic use of process and mechanism study within trials in our field is the exception rather than the rule. What are the possibilities and implications for our field?

Introduction: The rapid growth of the Internet has modified the boundaries of information acquisition (tracking) in environmental scanning. Despite the numerous advantages of this new medium, information overload is an enormous problem for Internet scanners. In order to help them, intelligent agents (i.e., autonomous, automated software agents…

To explore the possible neural foundations of individual differences in intelligence test scores, we examined the associations between Raven's Matrices scores and two tasks that were administered in a functional magnetic resonance imaging (fMRI) setting. The two tasks were an n-back working memory (N = 37) task and inspection time (N = 47). The…

Recent advances in information technology have made feasible the development of intelligent computer assisted instruction study units. Analysis and design of such systems require the active involvement of a development team consisting of domain experts, educators, and knowledge engineers. Science and technology teaching methods are frequently…

Electronic clinical documentation can be useful for activities such as public health surveillance, quality improvement, and research, but existing methods of de-identification may not provide sufficient protection of patient data. The general-purpose natural language processor MedLEE retains medical concepts while excluding the remaining text, so, in addition to processing text into structured data, it may be able to provide a secondary benefit of de-identification. Without modifying the system, the authors tested the ability of MedLEE to remove protected health information (PHI) by comparing 100 outpatient clinical notes with the corresponding XML-tagged output. Of 809 instances of PHI, 26 (3.2%) were detected in the output as a result of processing and identification errors. However, PHI in the output was highly transformed, much of it appearing as normalized terms for medical concepts, potentially making re-identification more difficult. The MedLEE processor may be a good enhancement to other de-identification systems, both removing PHI and providing coded data from clinical text.

Examines the functional characteristics of intelligent computer assisted instruction (ICAI) and discusses the requirements of a multidisciplinary cooperative effort of its development. A typical ICAI model is presented and intelligent features of ICAI systems are described, including modeling the student's learning process, qualitative decision…

"Intelligent Extruder", described in this report, is a software system and associated support services for monitoring and control of compounding extruders to improve material quality and reduce waste and energy use, with minimal addition of new sensors or changes to the factory-floor system components. Emphasis is on process improvements to the mixing, melting, and de-volatilization of base resins, fillers, pigments, fire retardants, and other additives in the "finishing" stage of high-value-added engineering polymer materials. While GE Plastics materials were used for experimental studies throughout the program, the concepts and principles are broadly applicable to other manufacturers' materials. The project involved a joint collaboration among GE Global Research, GE Industrial Systems, and Coperion Werner & Pfleiderer USA, a major manufacturer of compounding equipment. The scope of the program included development of algorithms for monitoring process material viscosity without rheological sensors or generating waste streams, a novel scheme for rapid detection of process upsets, and an adaptive feedback control system to compensate for process upsets where at-line adjustments are feasible. Software algorithms were implemented and tested on a laboratory-scale extruder (50 lb/hr) at GE Global Research, and data from a production-scale system (2000 lb/hr) at GE Plastics were used to validate the monitoring and detection software. Although not evaluated experimentally, a new concept for extruder process monitoring through estimation of high-frequency drive torque without strain gauges is developed and demonstrated in simulation. A plan to commercialize the software system is outlined, but commercialization has not been completed.
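
The report summary does not give the detection algorithm itself; as a generic illustration of rapid process-upset detection on an extruder signal such as drive torque, the sketch below applies an exponentially weighted moving average (EWMA) with a control limit. The smoothing constant, limit width, baseline window, and signal are assumptions for illustration, not the Intelligent Extruder algorithm.

    # Generic EWMA-based upset detector for a process signal (e.g., drive torque).
    # Parameters are illustrative; the first 20 samples are taken as the baseline.
    def ewma_upsets(samples, lam=0.2, k=3.0):
        mu0 = sum(samples[:20]) / 20                      # baseline mean
        var = sum((x - mu0) ** 2 for x in samples[:20]) / 19
        sigma0 = var ** 0.5                               # baseline std. dev.
        # Steady-state EWMA control limit half-width.
        limit = k * sigma0 * (lam / (2 - lam)) ** 0.5
        z, alarms = mu0, []
        for i, x in enumerate(samples):
            z = lam * x + (1 - lam) * z                   # EWMA update
            if abs(z - mu0) > limit:
                alarms.append(i)                          # flag a suspected upset
        return alarms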

The overall objective of this project is to develop sensor-integrated "intelligent" diamond wheels for grinding of ceramics. Such wheels will be "smart" enough to monitor and supervise both the wheel preparation and grinding processes without the need to instrument the machine tool. Intelligent wheels will utilize re-useable cores integrated with sensors to measure acoustic emission (AE) and grinding force. Signals from the sensors will be transmitted from a rotating wheel to a receiver by telemetry. Wheels will be "trained" to recognize distinct characteristics associated with truing, dressing and grinding. This overall project is divided into six tasks as follows: (1) development of miniaturized sensors and data transmission system; (2) wheel design and sensor configuration; (3) calibration of the sensor-integrated wheel; (4) training of the intelligent wheel; (5) grinding tests; and (6) prototype demonstration. The technical progress is summarized in this report according to the tasks. All activity during this period has been concerned with the first two interrelated tasks, which need to be completed before undertaking the remaining tasks.

The Intelligent Potroom Operation project focuses on maximizing the performance of an aluminum smelter by innovating components for an intelligent manufacturing system. The Intelligent Potroom Advisor (IPA) monitors process data to identify reduction cells exhibiting behaviors that require immediate attention. It then advises operational personnel on heuristic-based actions to bring the cell back to an optimal operating state, in order to reduce the duration and frequency of substandard reduction cell performance referred to as "Off-Peak Modes" (OPMs). Techniques developed to identify cells exhibiting OPMs include the use of a finite element model-based cell state estimator for defining the cell's current operating state via advanced cell noise analyses. In addition, rule induction was employed to identify statistically significant complex behaviors that occur prior to OPMs. The design, concepts, and formalisms developed in this project were used as the basis for an intelligent manufacturing system design. Future research will incorporate an adaptive component to automate continuous process improvement, a technology platform with the potential to improve process performance in many of the other Industries of the Future applications as well.

The three papers in this volume concerning artificial intelligence and language comprehension were commissioned by the National Institute of Education to further the understanding of the cognitive processes that enable people to comprehend what they read. The first paper, "Artificial Intelligence and Language Comprehension," by Terry Winograd,…

Defines artificial intelligence and reviews current research in natural language processing, expert systems, and robotics and sensory systems. Discussion covers current commercial applications of artificial intelligence and projections of uses and limitations in library technical and public services, e.g., in cataloging and online information and…

Based upon Gardner's theory of multiple intelligences, this book guides elementary school teachers through the process of using classroom learning centers and projects by providing choices for students. The guide is divided into two sections, providing the theoretical background and information on how to develop multiple intelligences learning…

... Process for Clinical Research Training and Medical Education at the Clinical Center and Its Impact on... project, contact: Robert M. Lembo, MD, Deputy Director, Office of Clinical Research Training and Medical... days of the date of this publication. Proposed Collection: Application Process for Clinical...

... for OMB Review; 30-Day Comment Request: Application Process for Clinical Research Training and Medical... contact: Robert M. Lembo, MD, Deputy Director, Office of Clinical Research Training and Medical Education... Process for Clinical Research Training and Medical Education at the Clinical Center and its Impact...

Responses to items from an intelligence test may be fast or slow. The research issue dealt with in this paper is whether the intelligence involved in fast correct responses differs in nature from the intelligence involved in slow correct responses. There are two questions related to this issue: 1. Are the processes involved different? 2. Are the…

Perfectionism is a risk and maintaining factor for eating disorders, anxiety disorders and depression. The objective of this paper is to review the four bodies of evidence supporting the notion that perfectionism is a transdiagnostic process. First, a review of the literature was conducted that demonstrates the elevation of perfectionism across numerous anxiety disorders, depression, and eating disorders compared to healthy controls. Data is presented that shows perfectionism increases vulnerability for eating disorders, and that it maintains obsessive-compulsive disorder, social anxiety and depression as it predicts treatment outcome in these disorders. Second, evidence is examined showing that elevated perfectionism is associated with co-occurrence of psychopathology. Third, the different conceptualisations of perfectionism are reviewed, including a cognitive-behavioural conceptualisation of clinical perfectionism that can be utilised to understand this transdiagnostic process. Fourth, evidence that treatment of perfectionism results in reductions in anxiety, depression and eating pathology is reviewed. Finally, the importance of clinicians considering the routine assessment and treatment of perfectionism is outlined.

The maturing of technologies in computer capabilities, particularly direct digital signals, has provided an exciting variety of new communication and facility control opportunities. These include telecommunications, energy management systems, security systems, office automation systems, local area networks, and video conferencing. New applications are developing continuously. The so-called "intelligent" or "smart" building concept evolves from the development of this advanced technology in building environments. Automation has had a dramatic effect on facility planning. For decades, communications were limited to the telephone, the typewritten message, and copy machines. The office itself and its functions had been essentially unchanged for decades. Office automation systems began to surface during the energy crisis and, although their newer technology was timely, they were, for the most part, designed separately from other new building systems. For example, most mainframe computer systems were originally stand-alone, as were word processing installations. In the last five years, the advances in distributive systems, networking, and personal computer capabilities have provided opportunities to make such dramatic improvements in productivity that the Selectric typewriter has gone from being the most advanced piece of office equipment to nearly total obsolescence.

The book covers the principles of AI, the main areas of application, as well as considering some of the social implications. The applications chapters have a common format structured as follows: definition of the topic; approach with conventional computing techniques; why 'intelligence' would provide a better approach; and how AI techniques would be used and the limitations. The contents discussed are: Principles of artificial intelligence; AI programming environments; LISP, list processing and pattern-making; AI programming with POP-11; Computer processing of natural language; Speech synthesis and recognition; Computer vision; Artificial intelligence and robotics; The anatomy of expert systems - Forsyth; Machine learning; Memory models of man and machine; Artificial intelligence and cognitive psychology; Breaking out of the chinese room; Social implications of artificial intelligence; and Index.

Takagi-Sugeno (T-S) fuzzy neural networks (FNNs) can be used to handle complex, fuzzy, uncertain clinical pathway (CP) variances. However, there are many drawbacks, such as slow training rate, propensity to become trapped in a local minimum and poor ability to perform a global search. In order to improve overall performance of variance handling by T-S FNNs, a new CP variance handling method is proposed in this study. It is based on random cooperative decomposing particle swarm optimization with double mutation mechanism (RCDPSO_DM) for T-S FNNs. Moreover, the proposed integrated learning algorithm, combining the RCDPSO_DM algorithm with a Kalman filtering algorithm, is applied to optimize antecedent and consequent parameters of constructed T-S FNNs. Then, a multi-swarm cooperative immigrating particle swarm algorithm ensemble method is used for intelligent ensemble T-S FNNs with RCDPSO_DM optimization to further improve stability and accuracy of CP variance handling. Finally, two case studies on liver and kidney poisoning variances in osteosarcoma preoperative chemotherapy are used to validate the proposed method. The result demonstrates that intelligent ensemble T-S FNNs based on the RCDPSO_DM achieves superior performances, in terms of stability, efficiency, precision and generalizability, over PSO ensemble of all T-S FNNs with RCDPSO_DM optimization, single T-S FNNs with RCDPSO_DM optimization, standard T-S FNNs, standard Mamdani FNNs and T-S FNNs based on other algorithms (cooperative particle swarm optimization and particle swarm optimization) for CP variance handling. Therefore, it makes CP variance handling more effective.
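
As a baseline point of reference (the abstract does not spell out the optimizer's internals), a minimal global-best particle swarm optimization loop, i.e. the plain PSO that RCDPSO_DM extends with cooperative decomposition and double mutation, can be sketched as follows; the objective function, swarm size, and coefficients are illustrative assumptions.

    import random

    # Minimal global-best PSO sketch; in the paper's setting the decision vector
    # would hold the antecedent/consequent parameters of a T-S fuzzy neural network.
    def pso(objective, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
        pos = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(n_particles)]
        vel = [[0.0] * dim for _ in range(n_particles)]
        pbest = [p[:] for p in pos]
        pbest_val = [objective(p) for p in pos]
        g = min(range(n_particles), key=lambda i: pbest_val[i])
        gbest, gbest_val = pbest[g][:], pbest_val[g]
        for _ in range(iters):
            for i in range(n_particles):
                for d in range(dim):
                    r1, r2 = random.random(), random.random()
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * r1 * (pbest[i][d] - pos[i][d])
                                 + c2 * r2 * (gbest[d] - pos[i][d]))
                    pos[i][d] += vel[i][d]
                val = objective(pos[i])
                if val < pbest_val[i]:                # update personal best
                    pbest[i], pbest_val[i] = pos[i][:], val
                    if val < gbest_val:               # update global best
                        gbest, gbest_val = pos[i][:], val
        return gbest, gbest_val

    # Usage example: minimize a simple quadratic in 5 dimensions.
    best, best_val = pso(lambda x: sum(v * v for v in x), dim=5)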

The problem of designing 'intelligent machines' to operate in uncertain environments with minimum supervision or interaction with a human operator is examined. The structure of an 'intelligent machine' is defined to be the structure of a Hierarchically Intelligent Control System, composed of three levels hierarchically ordered according to the principle of 'increasing precision with decreasing intelligence', namely: the organizational level, performing general information processing tasks in association with a long-term memory; the coordination level, dealing with specific information processing tasks with a short-term memory; and the control level, which performs the execution of various tasks through hardware using feedback control methods. The behavior of such a machine may be managed by controls with special considerations, and its 'intelligence' is directly related to the derivation of a compatible measure that associates the intelligence of the higher levels with the concept of entropy. Entropy serves as a sufficient analytic measure that unifies the treatment of all levels of an 'intelligent machine': the design task becomes the mathematical problem of finding the sequence of internal decisions and controls, for a system structured in order of intelligence and inverse order of precision, that minimizes the total entropy. A case study on the automatic maintenance of a nuclear plant illustrates the proposed approach.
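
Schematically (this is a paraphrase of the entropy idea above, not the paper's exact notation), the design problem can be written as minimizing the sum of the entropies contributed by the three levels:

    \[
    \min_{u}\; H_{\text{total}}(u) \;=\; H_{\text{organization}}(u) + H_{\text{coordination}}(u) + H_{\text{execution}}(u),
    \]

where $u$ denotes the sequence of internal decisions and controls and each term measures the uncertainty remaining at the corresponding level.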

Emotional abilities were measured with a performance test of emotional intelligence (The Mayer-Salovey-Caruso Emotional Intelligence Test; Mayer, Salovey, & Caruso, 2002) in patients diagnosed with major depressive disorder, substance abuse disorder, or borderline personality disorder (BPD), and a nonclinical control group. Findings showed that all clinical groups differed from controls with respect to their overall emotional intelligence score, which dovetails with previous findings from self-report measures. Specifically, we found that the ability to understand emotional information and the ability to regulate emotions best distinguished the groups. Findings showed that patients with substance abuse disorder and BPD patients were most impaired.

Identifying disease-modifying treatment effects in earlier stages of Alzheimer's disease (AD), when changes are subtle, will require improved trial design and more sensitive analytical methods. We applied hierarchical Bayesian analysis with cognitive processing (HBCP) models to the Alzheimer's Disease Assessment Scale-Cognitive subscale (ADAS-Cog) and MCI (mild cognitive impairment) Screen word-list memory task data from 14 AD patients from Myriad Pharmaceuticals' phase III clinical trial of Flurizan (a γ-secretase modulator) versus placebo. The original analysis of 1649 patients found no treatment group differences. HBCP analysis and the original ADAS-Cog analysis were performed on the small sample. HBCP analysis detected impaired memory storage during delayed recall, whereas the original ADAS-Cog analytical method did not. The HBCP model identified a harmful treatment effect in a small sample, which has been independently confirmed by results for other γ-secretase inhibitors. The original analytical method applied to the ADAS-Cog data did not detect this harmful treatment effect in either the full or the small sample. These findings suggest that HBCP models can detect treatment effects more sensitively than analytical methods currently required by the Food and Drug Administration, and can do so using small patient samples.
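
The HBCP models themselves are not specified in the abstract; as a toy illustration of the hierarchical-Bayesian idea of pooling sparse per-patient data, the sketch below shrinks each patient's delayed-recall success rate toward a group-level Beta prior. All counts and the prior strength are invented, and this is not the authors' HBCP model.

    # Toy hierarchical (empirical-Bayes) shrinkage of per-patient recall rates.
    recalled = [3, 5, 1, 6, 2]       # words recalled at delayed recall, per patient
    presented = [10, 10, 10, 10, 10]

    # Group-level Beta prior centred on the pooled recall rate.
    pooled = sum(recalled) / sum(presented)
    prior_strength = 8.0             # pseudo-observations; an assumption
    alpha0, beta0 = pooled * prior_strength, (1 - pooled) * prior_strength

    for r, n in zip(recalled, presented):
        raw = r / n
        posterior_mean = (alpha0 + r) / (alpha0 + beta0 + n)   # conjugate update
        print(f"raw={raw:.2f}  shrunk={posterior_mean:.2f}")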

Artificial intelligence (AI) is a computer based science which aims to simulate human brain faculties using a computational system. A brief history of this new science goes from the creation of the first artificial neuron in 1943 to the first artificial neural network application to genetic algorithms. The potential for a similar technology in medicine has immediately been identified by scientists and researchers. The possibility to store and process all medical knowledge has made this technology very attractive to assist or even surpass clinicians in reaching a diagnosis. Applications of AI in medicine include devices applied to clinical diagnosis in neurology and cardiopulmonary diseases, as well as the use of expert or knowledge-based systems in routine clinical use for diagnosis, therapeutic management and for prognostic evaluation. Biological applications include genome sequencing or DNA gene expression microarrays, modeling gene networks, analysis and clustering of gene expression data, pattern recognition in DNA and proteins, protein structure prediction. In the field of hematology the first devices based on AI have been applied to the routine laboratory data management. New tools concern the differential diagnosis in specific diseases such as anemias, thalassemias and leukemias, based on neural networks trained with data from peripheral blood analysis. A revolution in cancer diagnosis, including the diagnosis of hematological malignancies, has been the introduction of the first microarray based and bioinformatic approach for molecular diagnosis: a systematic approach based on the monitoring of simultaneous expression of thousands of genes using DNA microarray, independently of previous biological knowledge, analysed using AI devices. Using gene profiling, the traditional diagnostic pathways move from clinical to molecular based diagnostic systems.

The potential application of artificial intelligence (AI) to computer-assisted language learning (CALL) is explored. Two areas of AI that hold particular interest to those who deal with language meaning--knowledge representation and expert systems, and natural-language processing--are described and examples of each are presented. AI contribution…

This paper provides a brief historical introduction to the new field of artificial intelligence and describes some applications to psychiatry. It focuses on two successful programs: a model of paranoid processes and an expert system for the pharmacological management of depressive disorders. Finally, it reviews evidence in favor of computerized psychotherapy and offers speculations on the future development of research in this area.

Intelligent systems have emerged in our biosphere in different contexts and achieving different levels of complexity. The requirement of communication in a social context has in all cases been a determinant. The human brain, probably co-evolving with language, is an exceedingly successful example. Similarly, social insects' complex collective decisions emerge from information exchanges between many agents. The difference is that such processing is obtained with limited individual cognitive power. Computational models and embodied versions using non-living systems, particularly involving robot swarms, have been used to explore the potential of collective intelligence. Here we suggest a novel approach to the problem grounded in the genetic engineering of unicellular systems, which can be modified in order to interact, store memories, or adapt to external stimuli in collective ways. What we label Synthetic Swarm Intelligence defines a parallel approach to the evolution of computation and swarm intelligence and allows the exploration of potential embodied scenarios for decision making at the microscale. Here, we consider several relevant examples of collective intelligence and their synthetic organism counterparts.

In order for information to be stored and processed in a computer, it must be reduced to data and organized and systematized in accordance with the rules and principles of formal logic. Reducing manifold reality to data for use by the computer results in loss of information because an arbitrary screening of data eliminates that gathered by the…

In order to assess the possible clinical value of measuring the regional amplitude of ultrasound reflected from intracardiac structures, two-dimensional echocardiographic images from 20 normal subjects and 70 patients with heart disease were processed by modulation of both colour and intensity to represent grey scale. Maximum echo intensity was consistently recorded from the pericardial interface behind the posterior left ventricular wall, and this was taken as 100 per cent. In the normal heart, the central fibrous body reflected at 64 +/- 5 per cent, and the mitral and aortic valves at 35 +/- 5 per cent and 36 +/- 8 per cent, respectively. Normal septal myocardium gave a value of 33 +/- 8 per cent and the posterior wall 23 +/- 6 per cent. Consistent increases were recorded from prosthetic mitral valves in 20 patients, and also from the anterior cusp of the mitral valve (54 +/- 11%) in 20 patients with rheumatic heart disease. In all of 15 patients with left ventricular involvement caused by coronary artery disease, septal echo amplitude was increased, to 71 +/- 11 per cent, and localised increases were also noted in the posterior wall, subendocardially, or in one or both papillary muscles. Similar focal changes were noted in five, and increases in septal density in 10, of 15 patients with left ventricular hypertrophy, and in 12 of those with mitral valve disease. Thus, measurement of regional echo amplitude is possible without degradation of the quality of standard two-dimensional cardiac images. Abnormalities are particularly common in left ventricular disease, where their distribution corresponds to that described for fibrous tissue. PMID:6455136

Background Accurately monitoring and collecting drug adherence data can allow for better understanding and interpretation of the outcomes of clinical trials. Most clinical trials use a combination of pill counts and self-reported data to measure drug adherence, despite the drawbacks of relying on these types of indirect measures. It is assumed that doses are taken, but the exact timing of these events is often incomplete and imprecise. Objective The objective of this pilot study was to evaluate the use of a novel artificial intelligence (AI) platform (AiCure) on mobile devices for measuring medication adherence, compared with modified directly observed therapy (mDOT) in a substudy of a Phase 2 trial of the α7 nicotinic receptor agonist (ABT-126) in subjects with schizophrenia. Methods AI platform generated adherence measures were compared with adherence inferred from drug concentration measurements. Results The mean cumulative pharmacokinetic adherence over 24 weeks was 89.7% (standard deviation [SD] 24.92) for subjects receiving ABT-126 who were monitored using the AI platform, compared with 71.9% (SD 39.81) for subjects receiving ABT-126 who were monitored by mDOT. The difference was 17.9% (95% CI -2 to 37.7; P=.08). Conclusions Using drug levels, this substudy demonstrates the potential of AI platforms to increase adherence, rapidly detect nonadherence, and predict future nonadherence. Subjects monitored using the AI platform demonstrated a percentage change in adherence of 25% over the mDOT group. Subjects were able to use the technology successfully for up to 6 months in an ambulatory setting with early termination rates that are comparable to subjects outside of the substudy. Trial Registration ClinicalTrials.gov NCT01655680 https://clinicaltrials.gov/ct2/show/NCT01655680?term=NCT01655680 PMID:28223265

Emotional Intelligence (EI) relates to one's ability to recognize and understand emotional information and then to use it for planning and self-management. Given evidence of abnormalities of emotional processing in impulsively aggressive individuals, we hypothesized that EI would be reduced in subjects with Intermittent Explosive Disorder (IED: n = 43) compared with healthy (n = 44) and psychiatric (n = 44) controls. The Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT) was used to assess both Experiential EI and Strategic EI. Strategic, but not Experiential, EI was lower in IED compared with control subjects. These differences were not accounted for by demographic characteristics, cognitive intelligence, or the presence of clinical syndromes or personality disorder. In contrast, the relationship between IED and Strategic EI was fully accounted for by a dimension of hostile cognition defined by hostile attribution and hostile automatic thoughts. Interventions targeted at improving Strategic EI and reducing hostile cognition will be key to reducing aggressive behavior in individuals with IED. PMID:25477263

Under a Small Business Innovation Research contract from Marshall Space Flight Center, Ultrafast, Inc. developed the world's first, high-temperature resistant, "intelligent" fastener. NASA needed a critical-fastening appraisal and validation of spacecraft segments that are coupled together in space. The intelligent-bolt technology deletes the self-defeating procedure of having to untighten the fastener, and thus upset the joint, during inspection and maintenance. The Ultrafast solution yielded an innovation that is likely to revolutionize manufacturing assembly, particularly the automobile industry. Other areas of application range from aircraft, computers and fork-lifts to offshore platforms, buildings, and bridges.

systems, biophysics of information processing, cognitive science, and traditional artificial intelligence. The objective behind this objective was to...information processing, cognitive science, and traditional artificial intelligence. The objective behind this objective was to provide a vehicle for reviewing...Another departure from 'classical' neurodynamics must be sought in the strong coupling between the micro and macroscopic scales. No other physical mechanism

An objective of the Reagents and Resources component of NCI's Clinical Proteomic Technologies for Cancer Initiative is to generate highly characterized monoclonal antibodies to human proteins associated with cancer.

The concept of plant intelligence, as proposed by Anthony Trewavas, has raised considerable discussion. However, plant intelligence remains loosely defined; often it is either perceived as practically synonymous to Darwinian fitness, or reduced to a mere decorative metaphor. A more strict view can be taken, emphasizing necessary prerequisites such as memory and learning, which requires clarifying the definition of memory itself. To qualify as memories, traces of past events have to be not only stored, but also actively accessed. We propose a criterion for eliminating false candidates of possible plant intelligence phenomena in this stricter sense: an “intelligent” behavior must involve a component that can be approximated by a plausible algorithmic model involving recourse to stored information about past states of the individual or its environment. Re-evaluation of previously presented examples of plant intelligence shows that only some of them pass our test. “You were hurt?” Kumiko said, looking at the scar. Sally looked down. “Yeah.” “Why didn't you have it removed?” “Sometimes it's good to remember.” “Being hurt?” “Being stupid.”—(W. Gibson: Mona Lisa Overdrive) PMID:19816094

To make an academic study of matters inherently secret and potentially explosive seems a tall task. But a growing number of scholars are drawn to understanding spycraft. The interdisciplinary field of intelligence studies is mushrooming, as scholars trained in history, international studies, and political science examine such subjects as the…

Research has shown that differences among ordinary people in intelligence and personality depend equally on individual genetic variability and on differences in the environments that siblings experience within the same family, not differences in the neighborhood, school, and community environments. As of yet, there are no adequate theories to…

Speech intelligibility (SI) is important for different fields of research, engineering and diagnostics in order to quantify very different phenomena like the quality of recordings, communication and playback devices, the reverberation of auditoria, characteristics of hearing impairment, benefit using hearing aids or combinations of these things.

This paper describes the current status and the results of our process management system for defining and reconstructing clinical care processes, which makes it possible to compare, analyze, and evaluate clinical processes and, further, to identify high-cost tasks or stays. The system is founded on IHE, which guarantees standardized interfaces and interoperability between clinical information systems. At the heart of the system is BPMN, a modeling notation and specification language that allows the definition and execution of clinical processes. The system provides functionality to define healthcare-information-system-independent clinical core processes and to execute the processes in a workflow engine. Furthermore, the reconstruction of clinical processes is done by evaluating an IHE audit log database, which records patient movements within a health care facility. The main goal of the system is to assist hospital operators and clinical process managers in detecting discrepancies between defined and actual clinical processes, as well as in identifying the main causes of high medical costs. Beyond that, the system can potentially contribute to reconstructing and improving clinical processes and to enhancing cost control and patient care quality.
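
As a minimal illustration of the reconstruction idea described above (with an invented record layout, not the IHE audit-log schema), the sketch below groups audit events by patient and orders them by timestamp to recover each patient's movement sequence through the facility.

    from collections import defaultdict

    # Hypothetical audit-log records: (patient_id, timestamp, department).
    events = [
        ("p1", "2013-05-02T08:15", "admission"),
        ("p1", "2013-05-02T09:40", "radiology"),
        ("p2", "2013-05-02T08:30", "admission"),
        ("p1", "2013-05-02T11:05", "ward"),
        ("p2", "2013-05-02T10:10", "surgery"),
    ]

    def reconstruct_paths(events):
        """Return each patient's chronologically ordered movement sequence."""
        by_patient = defaultdict(list)
        for patient, ts, dept in events:
            by_patient[patient].append((ts, dept))
        return {p: [dept for _, dept in sorted(evs)] for p, evs in by_patient.items()}

    print(reconstruct_paths(events))
    # {'p1': ['admission', 'radiology', 'ward'], 'p2': ['admission', 'surgery']}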

We propose a new measure of intelligence for general reinforcement learning agents, based on the notion that an agent's environment can change at any step of execution of the agent. That is, an agent is considered to be interacting with its environment in real-time. In this sense, the resulting intelligence measure is more general than the universal intelligence measure (Legg and Hutter, 2007) and the anytime universal intelligence test (Hernández-Orallo and Dowe, 2010). A major advantage of the measure is that an agent's computational complexity is factored into the measure in a natural manner. We show that there exist agents with intelligence arbitrarily close to the theoretical maximum, and that the intelligence of agents depends on their parallel processing capability. We thus believe that the measure can provide a better evaluation of agents and guidance for building practical agents with high intelligence.
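
For context (this is the published Legg-Hutter definition that the proposed real-time measure generalizes, not something new in this paper), the universal intelligence of an agent is an expectation over computable environments weighted by their Kolmogorov complexity:

    \[
    \Upsilon(\pi) \;=\; \sum_{\mu \in E} 2^{-K(\mu)}\, V_{\mu}^{\pi},
    \]

where $\pi$ is the agent, $E$ the set of computable environments, $K(\mu)$ the Kolmogorov complexity of environment $\mu$, and $V_{\mu}^{\pi}$ the expected total reward of $\pi$ in $\mu$.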

Clinical advancement programs are not evaluated often for effectiveness and participant satisfaction. The advancement committee at this community hospital made the commitment to evaluate participant satisfaction periodically. Revisions were made in the program based on the evaluation findings and implemented in 2002. This re-evaluation was conducted to determine participant satisfaction.

The Clinical Supervision Program for Evaluation, implemented by the Department of Special Education in Quincy (Massachusetts) Public Schools, uses a management-by-objectives approach to measure teaching performance. Discussed are its historical development, philosophy, and procedural sequence. The roles of supervisors and examples of staff member…

Intelligent Agents are being applied in a wide range of processes and everyday applications. Their development is not new, but in recent years they have received increased attention and design effort, for example as learning and mentoring tools. In this work we discuss the definition of an intelligent agent; how agents are applied; what they look like; recent implementations of agents; agents as support in the learning process, more precisely intelligent tutors; their status in Latin American countries; and future developments and trends that will permit better communication between people and agents. We also present an Intelligent Tutor applied as a tool for improving high-school students' skills and reasoning in the first five topics of the Mechanics curriculum.

increases with the demands of near real time accurate intelligence for operational decision-making. Given this environment, intelligence-sharing...operating system providing actionable near-real-time intelligence to commanders for coalition synchronization and the requirement to protect national...real time accurate intelligence for operational decision-making. Given this environment, intelligence-sharing requirements across an ad hoc coalition

Yu et al. (2016) demonstrated that algorithms designed to find efficient routes in standard mazes can be integrated with the natural processes controlling rat navigation and spatial choices, and they pointed out the promise of such "cyborg intelligence" for biorobotic applications. Here, we briefly describe Yu et al.'s work, explore its relevance to the study of comparative cognition, and indicate how work involving cyborg intelligence would benefit from interdisciplinary collaboration between behavioral scientists and engineers.

eventually produce solutions. By contrast, human beings and other intelligent animals continuously adapt to the demands and opportunities presented by a...such as monitoring critically ill medical patients or controlling a manufacturing process. Following the model set by human intelligence, we define...signs probabilistically, using a belief network, as well as from first principles, using explicit models of system structure and function. Concurrent

It seems natural to think that the same prudential and ethical reasons for mutual respect and tolerance that one has vis-à-vis other human persons would hold toward newly encountered paradigmatic but nonhuman biological persons. One also tends to think that they would have similar reasons for treating us humans as creatures that count morally in our own right. This line of thought transcends biological boundaries, namely with regard to artificially (super)intelligent persons, but is this a safe assumption? The issue concerns ultimate moral significance: the significance possessed by human persons, persons from other planets, and hypothetical nonorganic persons in the form of artificial intelligence (AI). This article investigates why our possible relations to AI persons could be more complicated than they first might appear, given that they might possess a radically different nature from ours, to the point that civilized or peaceful coexistence in a determinate geographical space could be impossible to achieve.

Issues related to the use of outcome and process data from the treatment of antisocial children to predict future childhood adjustment were examined through a study of 69 children. Data supported the hypothesis that measures of processes thought to produce changes in child behavior would serve to predict future adjustment. (SLD)

Many attempts have been made to correlate degrees of both animal and human intelligence with brain properties. With respect to mammals, a much-discussed trait concerns absolute and relative brain size, either uncorrected or corrected for body size. However, the correlation of both with degrees of intelligence yields large inconsistencies, because although they are regarded as the most intelligent mammals, monkeys and apes, including humans, have neither the absolutely nor the relatively largest brains. The best fit between brain traits and degrees of intelligence among mammals is reached by a combination of the number of cortical neurons, neuron packing density, interneuronal distance and axonal conduction velocity—factors that determine general information processing capacity (IPC), as reflected by general intelligence. The highest IPC is found in humans, followed by the great apes, Old World and New World monkeys. The IPC of cetaceans and elephants is much lower because of a thin cortex, low neuron packing density and low axonal conduction velocity. By contrast, corvid and psittacid birds have very small and densely packed pallial neurons and relatively many neurons, which, despite very small brain volumes, might explain their high intelligence. The evolution of a syntactical and grammatical language in humans most probably has served as an additional intelligence amplifier, which may have happened in songbirds and psittacids in a convergent manner. PMID:26598734

This paper presents a model that helps Clinical Engineering deal with Risk Management in the Healthcare Technological Process. The healthcare technological setting is complex and supported by three basic entities: infrastructure (IS), healthcare technology (HT), and human resources (HR). An Enterprise Architecture framework, MODAF (Ministry of Defence Architecture Framework), was used to model this process for risk management. Thus, a new model was created to contribute to risk management in the HT process from the Clinical Engineering viewpoint. This architecture model can support and improve the decision-making process of Clinical Engineering for Risk Management in the Healthcare Technological Process.

reasoning, rote memory, and the like.” Standardized tests also fall short in terms of assessing other important aspects of intelligence such as creativity...ipEngine made by BrightStar Engineering. Then in 2004 we further evolved to a Compulab 686 CORE with 128 megabytes of memory running at 266 MHz...driver runs in a continuous loop, timing sonar echoes on each pass, storing the resulting range values in memory for on-demand access by other

Tremendous progress in genetics and genomics has led to a wide range of healthcare providers, genetic tests, and more patients who can benefit from these developments. To guarantee and improve the quality of genetic testing, a unified European registration for individuals qualified in biomedicine was realized. A Europe-wide recognition of the profession 'European registered Clinical Laboratory Geneticist (ErCLG)', based on a syllabus of core competences, was therefore established, which allows for harmonization in professional education. The 'European Board of Medical Genetics division - Clinical Laboratory Geneticist' has now, for 3 years, provided the possibility to register as an ErCLG. Applicants may be from all European countries and, since this year, also from outside Europe. Five subtitles reflect the exact specialty of each ErCLG, who can re-register every 5 years. Previously unavailable statistics based on ~300 holders of an ErCLG title from 19 countries provide interesting insights into the professionals working in human genetics. It could be substantiated that there are around twice as many females as males and that a PhD title has been achieved by 80% of registered ErCLGs. Most ErCLGs are still trained as generalists (66%), followed by ErCLGs with a focus on molecular genetics (23%); the remainder concentrate on clinical (6%), tumor (4%), or biochemical genetics (1%). In conclusion, besides MDs and genetic counselors/nurses, an EU-wide recognition system for Clinical Laboratory Geneticists has been established, which strengthens the status of specialists working in human genetic diagnostics in Europe and worldwide. European Journal of Human Genetics advance online publication, 8 March 2017; doi:10.1038/ejhg.2017.25.

A diagnostic test is carried out to establish the presence of health or illness; in the latter case it can also grade severity. Because of its importance in clinical decisions, the diagnostic test is evaluated with mathematical strategies. We estimate sensitivity and specificity once we know whether the disease is present or not, but in practice we act in the reverse direction: given a positive or negative test result, we estimate the presence of the disease, and therefore we use the positive and negative predictive values. Mathematical strategies allow us to quantify the observation, but judgment is required to determine quality, making use of a minimum set of features: a) selection under the same criteria for cases and controls; b) inclusion of the full spectrum of disease severity (from mild to the most serious, ensuring that all levels have a sufficient number of subjects); c) interpretation of both the gold standard and the new diagnostic tool must be blind and conducted by experts; d) the interpretation of results should show what their application is in everyday clinical practice; e) reproducibility must be checked. Do not forget that we usually treat only one patient at a time, which obliges us to have full knowledge of the performance of the diagnostic test and to consider all clinical aspects for its proper implementation.
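
As a standard reminder of how the predictive values follow from sensitivity, specificity, and prevalence (textbook relations via Bayes' theorem, not results specific to this paper):

    \[
    \mathrm{PPV} = \frac{\mathrm{Se}\cdot p}{\mathrm{Se}\cdot p + (1-\mathrm{Sp})(1-p)},
    \qquad
    \mathrm{NPV} = \frac{\mathrm{Sp}\,(1-p)}{\mathrm{Sp}\,(1-p) + (1-\mathrm{Se})\,p},
    \]

where $\mathrm{Se}$ is sensitivity, $\mathrm{Sp}$ specificity, and $p$ the pre-test prevalence of disease; this is why the same test yields very different predictive values in low- and high-prevalence settings.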

Background We describe an initial clinical assessment of a new, commercially available, computer-aided diagnosis (CAD) system using artificial intelligence (AI) for thyroid ultrasound, and evaluate its performance in the diagnosis of malignant thyroid nodules and categorization of nodule characteristics. Methods This prospective study protocol was reviewed and approved by the institutional review board. Patients with thyroid nodules with a decisive diagnosis, whether benign or malignant on the basis of cytopathologic or US results, were consecutively enrolled from November 2015 to February 2016. An experienced radiologist reviewed the ultrasound image characteristics of the thyroid nodules, while another radiologist assessed the same thyroid nodules using the CAD system, providing ultrasound characteristics and a diagnosis of whether the nodules were benign or malignant. We compared the diagnostic performance and the agreement of US characteristics between the experienced radiologist and the CAD system. Results In total, 102 thyroid nodules from 89 patients were included; 59 (57.8%) were benign and 43 (42.2%) were malignant. The CAD system showed a similar sensitivity to the experienced radiologist (sensitivity: 90.7% versus 88.4%, P>0.99), but a lower specificity and a lower area under the receiver operating characteristic (AUROC) curve (specificity: 74.6% versus 94.9%, P=0.002; AUROC: 0.83 versus 0.92, P=0.021). Classifications of the ultrasound characteristics (composition, orientation, echogenicity, and spongiform) between the radiologist and the CAD system were in substantial agreement (kappa=0.659, 0.740, 0.733, and 0.658, respectively), while margin definition showed fair agreement (kappa=0.239). Conclusion The sensitivity of the CAD system using AI for malignant thyroid nodules was as good as that of the experienced radiologist, while specificity and accuracy were lower than those of the experienced radiologist. The CAD system showed an acceptable agreement with the

This paper proposes the development of intelligent sensors as part of an integrated systems approach; i.e., one treats the sensors as a complete system with its own sensing hardware (the traditional sensor), A/D converters, processing and storage capabilities, software drivers, self-assessment algorithms, communication protocols, and evolutionary methodologies that allow them to get better with time. Under a project being undertaken at NASA's Stennis Space Center, an integrated framework is being developed for the intelligent monitoring of smart elements. These smart elements can be sensors, actuators, or other devices. The immediate application is the monitoring of the rocket test stands, but the technology should be generally applicable to the Integrated Systems Health Monitoring (ISHM) vision. This paper outlines progress made in the development of intelligent sensors by describing the work done to date on Physical Intelligent Sensors (PIS). The PIS discussed here consists of a thermocouple used to read temperature in analog form, which is then converted into digital values. A microprocessor collects the sensor readings and runs numerous embedded event detection routines on the collected data; if any event is detected, it is reported, stored, and sent to a remote system through an Ethernet connection. Hence the output of the PIS is data coupled with a confidence factor in the reliability of the data, which leads to information on the health of the sensor at all times. All protocols are consistent with IEEE 1451.X standards. This work lays the foundation for the next generation of smart devices that have embedded intelligence for distributed decision-making capabilities.
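
A minimal sketch of the read-detect-report loop described above, assuming a hypothetical read_thermocouple() source and a simple jump detector; the threshold, confidence heuristic, and TCP endpoint are invented for illustration and are not the PIS implementation or an IEEE 1451 interface.

    import json
    import random
    import socket

    def read_thermocouple():
        # Stand-in for the digitized thermocouple reading (degrees C).
        return 20.0 + random.gauss(0.0, 0.2)

    def run_sensor(host="127.0.0.1", port=5000, n_samples=100, jump_limit=2.0):
        """Read, run a crude event check, and report data plus a confidence factor.

        Assumes a receiver is listening on host:port.
        """
        prev = read_thermocouple()
        with socket.create_connection((host, port)) as sock:
            for _ in range(n_samples):
                value = read_thermocouple()
                event = abs(value - prev) > jump_limit          # crude spike detection
                confidence = 0.5 if event else 0.99             # invented heuristic
                packet = {"temp_c": value, "event": event, "confidence": confidence}
                sock.sendall((json.dumps(packet) + "\n").encode())
                prev = value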

Discusses a research project that uses artificial intelligence techniques to help teach programming. Describes the principles and implementation of the LISP Intelligent Tutoring System (LISPITS). Explains how the artificial intelligence technique was developed and possible future research. (MVL)

Good Evening, my name is Greg Jerman and for nearly a quarter century I have been performing failure analysis on NASA's aerospace hardware. During that time I had the distinct privilege of keeping the Space Shuttle flying for two thirds of its history. I have analyzed a wide variety of failed hardware from simple electrical cables to cryogenic fuel tanks to high temperature turbine blades. During this time I have found that for all the time we spend intelligently designing things, we need to be equally intelligent about understanding why things fail. The NASA Flight Director for Apollo 13, Gene Kranz, is best known for the expression "Failure is not an option." However, NASA history is filled with failures both large and small, so it might be more accurate to say failure is inevitable. It is how we react and learn from our failures that makes the difference.

Screening is the only proven approach to reduce the mortality of breast cancer, but significant numbers of breast cancers remain undetected even when all quality assurance guidelines are implemented. With the increasing adoption of digital mammography systems, image processing may be a key factor in the imaging chain. Although to our knowledge statistically significant effects of manufacturer-recommended image processing algorithms have not previously been demonstrated, the subjective experience of our radiologists, that the apparent image quality can vary considerably between different algorithms, motivated this study. This article addresses the impact of five such algorithms on the detection of clusters of microcalcifications. A database of unprocessed (raw) images of 200 normal digital mammograms, acquired with the Siemens Novation DR, was collected retrospectively. Realistic simulated microcalcification clusters were inserted in half of the unprocessed images. All unprocessed images were subsequently processed with five manufacturer-recommended image processing algorithms (Agfa Musica 1, IMS Raffaello Mammo 1.2, Sectra Mamea AB Sigmoid, Siemens OPVIEW v2, and Siemens OPVIEW v1). Four breast imaging radiologists were asked to locate and score the clusters in each image on a five-point rating scale. The free-response data were analyzed by the jackknife free-response receiver operating characteristic (JAFROC) method and, for comparison, also with the receiver operating characteristic (ROC) method. JAFROC analysis revealed highly significant differences between the image processing algorithms (F = 8.51, p < 0.0001), suggesting that image processing strongly impacts the detectability of clusters. Siemens OPVIEW v2 and Siemens OPVIEW v1 yielded the highest and lowest performances, respectively. ROC analysis of the data also revealed significant differences between the processing algorithms, but at lower significance (F = 3.47, p = 0.0305) than JAFROC. Both statistical analysis methods indicate that the choice of image processing algorithm has a significant impact on the detection of simulated microcalcification clusters.
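
JAFROC analysis additionally accounts for lesion localization and is not reproduced here; as a simplified, hypothetical illustration of the companion ROC comparison, per-image reader ratings for two notional processing algorithms could be summarized as follows (synthetic data only):

```python
# Simplified illustration only: a conventional ROC AUC computed from per-image
# ratings for two hypothetical processing algorithms; JAFROC is not reproduced.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
has_cluster = np.repeat([0, 1], 100)                           # 100 normal, 100 with simulated clusters
ratings_a = rng.normal(loc=2 + 1.5 * has_cluster, scale=1.0)   # hypothetical reader scores, algorithm A
ratings_b = rng.normal(loc=2 + 1.0 * has_cluster, scale=1.0)   # hypothetical reader scores, algorithm B

print("AUC, algorithm A:", round(roc_auc_score(has_cluster, ratings_a), 3))
print("AUC, algorithm B:", round(roc_auc_score(has_cluster, ratings_b), 3))
```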

It has been proposed that the well-established relationship between working memory (WM) and fluid intelligence (gf) is mediated by executive mechanisms underlying interference control. The latter relies upon the integrity of a frontoparietal brain network, whose activity is modulated by general cognition. With regard to the chronology of this activation, only a few EEG studies have investigated the topic, and none of them examined the regional interaction or the effects of individual differences in gf. The current investigation sought to extend previous research by characterizing the EEG markers (temporal activation and regional coupling) of interference control and the effects of individual variation in gf. To this end, we recorded the EEG activity of 33 participants while they performed verbal and spatial versions of a 3-back WM task. In a separate session, participants were administered a test of fluid intelligence. Interference-inducing trials were associated with an increased negativity in the frontal scalp region occurring in two separate time windows, probably reflecting two different stages of the underlying cognitive process. In addition, we found that the scalp distribution of such activity differed among individuals, with the strongest activation of the left and right frontolateral sites being related to high gf levels. Finally, high- and low-gf participants showed different patterns in the modulation of regional connectivity (electrode coherence in the 4.5-7.5 Hz range) according to changes in attention load among types of trials. Our findings suggest that high-gf participants may rely upon effective engagement and modulation of attention resources to face interference.
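
The regional-coupling measure mentioned above (electrode coherence in the theta band) can be sketched as follows; the signals, sampling rate, and electrode labels are synthetic and purely illustrative:

```python
# Sketch: magnitude-squared coherence between two electrode signals,
# averaged over the 4.5-7.5 Hz band.  All signals here are synthetic.
import numpy as np
from scipy.signal import coherence

fs = 250                                         # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(2)
theta = np.sin(2 * np.pi * 6 * t)                # shared 6 Hz component
x = theta + 0.5 * rng.standard_normal(t.size)    # "frontal" electrode
y = theta + 0.5 * rng.standard_normal(t.size)    # "parietal" electrode

f, cxy = coherence(x, y, fs=fs, nperseg=fs * 2)
band = (f >= 4.5) & (f <= 7.5)
print("mean theta-band coherence:", round(cxy[band].mean(), 3))
```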

A knowledge-based system to assist process engineers in evaluating the processability and moldability of poly-isocyanurate (PIR) formulations for the thermal protection system of the Space Shuttle external tank (ET) is discussed. The Reaction Injection Molding Process Development Advisor (RIM-PDA) is a coupled system which takes advantage of both symbolic and numeric processing techniques. This system will aid the process engineer in identifying a startup set of mold schedules and in refining the mold schedules to remedy specific process problems diagnosed by the system.

Objectives We examine interactions among 3 factors that affect patient waits and use of overtime in outpatient clinics: clinic congestion, patient punctuality and physician processing rates. We hypothesise that the first 2 factors affect physician processing rates, and that this adaptive physician behaviour serves to reduce waiting times and the use of overtime. Setting 2 urban academic clinics and an affiliated suburban clinic in metropolitan Baltimore, Maryland, USA. Participants Appointment times, patient arrival times, start of service and physician processing times were collected for 105 visits at a low-volume suburban clinic (clinic 1), 264 visits at a medium-volume academic clinic (clinic 2) and 22,266 visits at a high-volume academic clinic (clinic 3) over 3 distinct spans of time. Intervention Data from the first clinic were previously used to document an intervention to influence patient punctuality. This included a policy that tardy patients were rescheduled. Primary and secondary outcome measures Clinicians' processing times were gathered, conditioned on whether the patient or clinician was tardy, to test the first hypothesis. Probability distributions of patient unpunctuality were developed preintervention and postintervention for the clinic in which the intervention took place, and these data were used to seed a discrete-event simulation. Results Average physician processing times differ conditioned on tardiness at clinic 1 with p=0.03, at clinic 2 with p=10⁻⁵ and at clinic 3 with p=10⁻⁷. Within the simulation, the adaptive physician behaviour degrades system performance by increasing waiting times, the probability of overtime and the average amount of overtime used. Each of these changes is significant at the p<0.01 level. Conclusions Processing times differed for patients in different states in all 3 settings studied. When present, this can be verified using data commonly collected. Ignoring these behaviours leads to faulty conclusions about the efficacy of efforts to improve clinic performance.
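
A toy version of the discrete-event simulation logic described above might look like the sketch below; the schedule, unpunctuality, and service-time distributions are hypothetical, not those estimated in the study:

```python
# Toy discrete-event clinic simulation: patients have scheduled appointments,
# arrive with some unpunctuality, and the physician's processing time depends
# on whether someone is running late.  All parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(3)
n = 20
appointments = np.arange(n) * 15.0                    # every 15 minutes
arrivals = appointments + rng.normal(0, 10, n)        # patient unpunctuality (min)
clock, waits = 0.0, []

for appt, arr in zip(appointments, arrivals):
    start = max(clock, arr, appt)                     # service cannot start before arrival
    tardy = arr > appt or clock > appt                # patient or clinic is running late
    mean_service = 12.0 if tardy else 16.0            # adaptive (faster) processing when tardy
    service = rng.exponential(mean_service)
    waits.append(start - arr)
    clock = start + service

overtime = max(0.0, clock - appointments[-1] - 15.0)  # time past the nominal end of session
print(f"mean wait: {np.mean(waits):.1f} min, overtime: {overtime:.1f} min")
```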

The Automation Technology Branch of NASA Langley Research Center is developing a research capability in the field of artificial intelligence, particularly as applicable in teleoperator/robotics development for remote space operations. As a testbed for experimentation in these areas, a system concept has been developed and is being implemented. This system, termed DAISIE (Distributed Artificially Intelligent System for Interacting with the Environment), interfaces the key processes of perception, reasoning, and manipulation by linking hardware sensors and manipulators to a modular artificial intelligence (AI) software system in a hierarchical control structure. Verification experiments have been performed: one experiment used a blocksworld database and planner embedded in the DAISIE system to intelligently manipulate a simple physical environment; the other experiment implemented a joint-space collision avoidance algorithm. Continued system development is planned.

A growing body of literature investigates heterosexual donor conception, and there is now also a small body of work which investigates the experiences of single women and lesbian couples. Both of these focus on a clinical setting. Women, notably single women and lesbian couples, also undertake non-clinical donor conception, and insufficient consideration has been paid to these self-arranged reproductive practices and how they may compare with the clinical ones. Seeking to fill this gap, this paper explores women's experiences of accessing donor sperm inside and outside reproductive health clinics by drawing on a qualitative interview study with 25 lesbian couples in England and Wales with experience of jointly pursuing donor conception. The paper explores the differences embedded in the two conception routes with regard to donor recruitment, access to donor sperm over time, space and the management of sperm as a bodily fluid. Utilising the framework of 'ontological choreography' developed by Thompson (2005), as well as Douglas's (1966) work around bodies, dirt and disgust, the paper argues that the clinic functions as a containment for legal as well as practical and bodily dimensions of donor conception, and this in turn shapes practices and perceptions of self-arranged conception.

Reviews the development of artificial intelligence systems and the mechanisms used, including knowledge representation, programing languages, and problem processing systems. Eleven books and 6 journals are listed as sources of information on artificial intelligence. (23 references) (CLB)

The Wechsler intelligence scale for children--fourth edition (WISC-IV) Integrated contains the WISC-IV core and supplemental subtests along with process approach subtests designed to facilitate a process-oriented approach to score interpretation. The purpose of this study was to examine the extent to which WISC-IV Integrated subtests measure the constructs they are purported to measure. In addition to examining the measurement and scoring model provided in the manual, this study also tested hypotheses regarding Cattell-Horn-Carroll abilities that might be measured along with other substantive questions regarding the factor structure of the WISC-IV Integrated and the nature of abilities measured by process approach subtests. Results provide insight regarding the constructs measured by these subtests. Many subtests appear to be good to excellent measures of psychometric g (i.e., the general factor presumed to cause the positive correlation of mental tasks). Other abilities measured by subtests are described. For some subtests, the majority of variance is not accounted for by theoretical constructs included in the scoring model. Modifications made to remove demands such as memory recall and verbal expression were found to reduce construct-irrelevant variance. The WISC-IV Integrated subtests appear to measure similar constructs across ages 6-16, although strict factorial invariance was not supported.

The Primary Care Data Quality programme (PCDQ) is a quality-improvement programme which processes routinely collected general practice computer data. Patient data collected from a wide range of different brands of clinical computer systems are aggregated, processed, and fed back to practices in an educational context to improve the quality of care. Process modelling is a well-established approach used to gain understanding, perform systematic appraisal, and identify areas of improvement of a business process. Unified Modelling Language (UML) is a general-purpose modelling technique used for this purpose. We used UML to appraise the PCDQ process to see if the efficiency and predictability of the process could be improved. Activity analysis and thinking-aloud sessions were used to collect data to generate UML diagrams. The UML model highlighted the sequential nature of the current process as a barrier to efficiency gains. It also identified the uneven distribution of process controls, lack of symmetric communication channels, critical dependencies among processing stages, and failure to implement all the lessons learned in the piloting phase. It also suggested that improved structured reporting at each stage - especially from the pilot phase - together with parallel processing of data and correctly positioned process controls should improve the efficiency and predictability of research projects. Process modelling provided a rational basis for the critical appraisal of a clinical data processing system; its potential may be underutilized within health care.

Studies have shown that IQ alone does not contribute to the professional success of medical professionals. Professionals who are trained to be clinically competent, but have inadequate social skills for practice have proved to be less successful in their profession. Emotional intelligence (EI), which has already proved to be a key attribute for…

Geospatial Intelligence Analysts are currently faced with an enormous volume of imagery, only a fraction of which can be processed or reviewed in a timely operational manner. Computer-based target detection efforts have failed to yield the speed, flexibility and accuracy of the human visual system. Rather than focus solely on artificial systems, we hypothesize that the human visual system is still the best target detection apparatus currently in use, and with the addition of neuroscience-based measurement capabilities it can surpass the throughput of the unaided human severalfold. Using electroencephalography (EEG), Thorpe et al. described a fast signal in the brain associated with the early detection of targets in static imagery using a Rapid Serial Visual Presentation (RSVP) paradigm. This finding suggests that it may be possible to extract target detection signals from complex imagery in real time utilizing non-invasive neurophysiological assessment tools. To transform this phenomenon into a capability for defense applications, the Defense Advanced Research Projects Agency (DARPA) is currently sponsoring an effort titled Neurotechnology for Intelligence Analysts (NIA). The vision of the NIA program is to revolutionize the way that analysts handle intelligence imagery, increasing both the throughput of imagery to the analyst and the overall accuracy of the assessments. Successful development of a neurobiologically based image triage system will enable image analysts to train more effectively and process imagery with greater speed and precision.

The paper is focused on the security issues of sensors provided with processors and software and used for high-risk applications. Common IT related threats may cause serious consequences for sensor system users. To improve their robustness, sensor systems should be developed in a restricted way that would provide them with assurance. One assurance creation methodology is Common Criteria (ISO/IEC 15408) used for IT products and systems. The paper begins with a primer on the Common Criteria, and then a general security model of the intelligent sensor as an IT product is discussed. The paper presents how the security problem of the intelligent sensor is defined and solved. The contribution of the paper is to provide Common Criteria (CC) related security design patterns and to improve the effectiveness of the sensor development process. PMID:22315571

This paper surveys important aspects of Web Intelligence (WI) in the context of Artificial Intelligence in Education (AIED) research. WI explores the fundamental roles as well as practical impacts of Artificial Intelligence (AI) and advanced Information Technology (IT) on the next generation of Web-related products, systems, services, and…

The PV Manufacturing R&D (PVMR&D) Project conducts cost-shared research and development programs with U.S. PV industry partners. There are currently two active industry partnership activities. "In-line Diagnostics and Intelligent Processing", launched in 2002, supports development of new in-line diagnostics and monitoring with real-time feedback for optimal process control and increased yield in the fabrication of PV modules, systems, and other system components. "Yield, Durability and Reliability", launched in late 2004, supports enhancement of PV module, system component, and complete system reliability in high-volume manufacturing. A second key undertaking of the PVMR&D Project is the collection and analysis of module production cost-capacity metrics for the U.S. PV industry. In the period from 1992 through 2005, the average module manufacturing cost in 2005 dollars fell 54% (5.7% annualized) to $2.74/Wp, and capacity increased 18.6-fold (25% annualized) to 253 MW/yr. An experience curve analysis gives progress ratios of 87% and 81%, respectively, for U.S. silicon and thin-film module production.
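
The annualized figures quoted above follow from simple compound-rate arithmetic, and the progress ratios come from the standard experience-curve relation cost = a·Q^b with PR = 2^b. The sketch below checks the quoted rates and illustrates the progress-ratio formula with hypothetical cumulative-production numbers:

```python
# Back-of-the-envelope check of the quoted cost/capacity figures, plus the
# experience-curve progress ratio PR = 2**b where cost = a * Q**b.
import math

years = 2005 - 1992
cost_ratio = 0.46                                    # cost fell 54%, i.e. to 46% of the 1992 level
annual_cost_decline = 1 - cost_ratio ** (1 / years)
annual_capacity_growth = 18.6 ** (1 / years) - 1
print(f"annualized cost decline ~{annual_cost_decline:.1%}")        # ~5.8%/yr, consistent with 5.7%
print(f"annualized capacity growth ~{annual_capacity_growth:.1%}")  # ~25%/yr

def progress_ratio(cost_then, cost_now, cum_then, cum_now):
    b = math.log(cost_now / cost_then) / math.log(cum_now / cum_then)
    return 2 ** b

# Hypothetical cumulative-production figures, for illustration only.
print(f"progress ratio ~{progress_ratio(1.0, 0.46, 100, 5000):.0%}")
```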

A survey is made of intelligence research in the 125 years of The American Journal of Psychology. There are some major articles of note on intelligence, especially Spearman's (1904a) article that discovered general cognitive ability (g). There are some themes within intelligence on which articles appeared over the years, such as processing speed, age, and group differences. Intelligence has not been a major theme of the journal, nor has a differential approach to psychology more generally. There are periods of time--especially the 1970s--during which almost no articles appeared on intelligence. The key articles and themes on intelligence differences are discussed in detail.

Intelligent tutoring systems are any computer systems encompassing interactive applications with some intelligence that support and facilitate the teaching-learning process. The intelligence of these systems is the ability to adapt to each student throughout his/her learning process. This paper presents an intelligent tutoring system, called…

Background: There will always be a place for stuttering treatments designed to eliminate or reduce stuttered speech. When those treatments are required, direct speech measures of treatment process and outcome are needed in clinical practice. Aims: Based on the contents of published clinical trials of such treatments, three "core" measures of…

A self-teaching image processing and voice recognition based system is developed to educate visually impaired children, chiefly in their primary education. The system comprises a computer, a vision camera, an ear speaker and a microphone. The camera, attached to the computer system, is mounted on the ceiling opposite (at the required angle) the desk on which the book is placed. Sample images and voices in the form of instructions and commands for English and Urdu alphabets, numeric digits, operators and shapes are already stored in the database. A blind child first reads the embossed character (object) with the help of fingers and then speaks the answer, the name of the character, shape etc. into the microphone. With the voice command of the blind child received by the microphone, an image is taken by the camera and processed by a MATLAB® program developed with the help of the Image Acquisition and Image Processing toolboxes, which generates a response or the required set of instructions for the child via the ear speaker, resulting in the self-education of a visually impaired child. A speech recognition program is also developed in MATLAB® with the help of the Data Acquisition and Signal Processing toolboxes, which records and processes the commands of the blind child.
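
A rough Python analogue of the matching step described above might pair a transcribed spoken answer with a template-matched character, as sketched below; the original system is implemented in MATLAB, and the templates, captured image, and transcription here are entirely synthetic:

```python
# Sketch: compare a captured image of an embossed character against stored
# templates by normalized correlation, then check it against the spoken answer.
import numpy as np

def normalized_correlation(a, b):
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float((a * b).mean())

rng = np.random.default_rng(4)
templates = {"A": rng.random((16, 16)), "B": rng.random((16, 16))}   # stored reference images
captured = templates["A"] + 0.1 * rng.standard_normal((16, 16))      # noisy image of the embossed "A"
spoken_answer = "A"                                                  # transcription of the child's reply

best = max(templates, key=lambda k: normalized_correlation(captured, templates[k]))
feedback = "correct" if best == spoken_answer else f"try again, this looks like {best}"
print(f"recognized character: {best} -> {feedback}")
```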

This book examines the fundamental concepts and various techniques involved in real-world applications of artificial intelligence, such as computer representation of problems, searching and inference mechanisms, intelligent languages, and computer representation and utilization of knowledge. It explores applications in theorem-proving, games, robotics, natural language processing, knowledge engineering, and more.

The effects of incidental learning were assessed in 2 experiments involving 201 seventh graders in Poland through an experimental paradigm based on the levels of processing theory. Data suggest that an important aspect of intelligence is "opportunistic" learning (learning in advance). Intelligent people take cognitive advantage of…

The hypothesis that performance on implicit learning tasks is unrelated to psychometric intelligence was examined in a sample of 605 German pupils. Performance in artificial grammar learning, process control, and serial learning did not correlate with various measures of intelligence when participants were given standard implicit instructions.…

This article explores the tensions between Howard Gardner's theory of multiple intelligences and current educational policies emphasizing standardized and predictable outcomes. The article situates Gardner's theory within the historical interests among psychometricians in identifying those core processes that constitute human intelligence.…

Clinicians can use the base rates of low scores in healthy people to reduce the likelihood of misdiagnosing cognitive impairment. In the present study, base rates were developed for the Wechsler Adult Intelligence Scale-Fourth Edition (WAIS-IV) and Wechsler Memory Scale-Fourth Edition (WMS-IV) using 900 healthy adults and validated on 28 patients…

Deficits in social cognition, including emotional processing, are hallmarks of schizophrenia, and antipsychotic agents seem to be ineffective in improving these symptoms. However, oxytocin does seem to have beneficial effects on social cognition. The aim of this study was to examine the effects of four months of treatment with intranasal oxytocin, in 31 patients with schizophrenia, on distinct aspects of social cognition. This was assessed using standardized and experimental tests in a randomized, double-blind, placebo-controlled, cross-over trial. All patients underwent clinical and experimental assessment before treatment, four months after treatment and at the end of treatment. Social cognition abilities were assessed with the Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT) and the Reading the Mind in the Eyes task (RMET). Furthermore, an Emotional Priming Paradigm (EPP) was developed to examine the effects of oxytocin on implicit perceptual sensitivity to affective information and explicit facial affect recognition. We found that oxytocin improved performance on the MSCEIT compared to placebo in Branch 3-Understanding Emotion (p-value=0.004; Cohen's d=1.12). In the EPP task, we observed a significant reduction of reaction times for facial affect recognition (p-value=0.021; Cohen's d=0.88). No effects were found for implicit priming or for theory of mind abilities. Further study is required in order to highlight the potential for possible integration of oxytocin with antipsychotic agents, as well as to evaluate psycho-social treatment as a multi-dimensional approach to increase explicit emotional processing abilities and compensate for social cognition deficits related to schizophrenia.

Following the Japanese announcement that they intend to devise, make, and market, in the 1990s, computers incorporating a level of intelligence, a vast amount of energy and expense has been directed at the field of Artificial Intelligence. Workers in this discipline have, for the past 25 years, tried to reproduce human behavior on computers, and this book presents their achievements and the problems. Subjects include: computer vision, speech processing, robotics, natural language processing, expert systems and machine learning. The book also attempts to show the general principles behind the various applications and finally attempts to show their implications for other human endeavors such as philosophy, psychology, and the development of modern society.

Background The intelligence of individuals with Autism Spectrum Disorder (ASD) varies considerably. The pattern of cognitive deficits associated with ASD may differ depending on intelligence. We aimed to study the absolute and relative severity of cognitive deficits in participants with ASD in relation to IQ. Methods A total of 274 children (M age = 12.1, 68.6% boys) participated: 30 ASD and 22 controls in the below average Intelligence Quotient (IQ) group (IQ<85), 57 ASD and 54 controls in the average IQ group (IQ 85-115). Matching for age, sex, Full Scale IQ (FSIQ), Verbal IQ (VIQ), Performance IQ (PIQ) and VIQ-PIQ difference was performed. Speed and accuracy of social cognition, executive functioning, visual pattern recognition and basic processing speed were examined per domain and as a composite score. Results The composite score revealed a trend-significant IQ by ASD interaction (significant when excluding the average IQ group). In absolute terms, participants with below average IQs performed poorest (regardless of diagnosis). However, in relative terms, above average intelligent participants with ASD showed the most substantial cognitive problems (particularly for social cognition, visual pattern recognition and verbal working memory), since this group differed significantly from the IQ-matched control group (p < .001), whereas this was not the case for below-average intelligence participants with ASD (p = .57). Conclusions In relative terms, cognitive deficits appear somewhat more severe in individuals with ASD and above average IQs compared to the below average IQ patients with ASD. Even though high IQ ASD individuals enjoy a certain protection from their higher IQ, they clearly demonstrate cognitive impairments that may be targeted in clinical assessment and treatment. Conversely, even though in absolute terms ASD patients with below average IQs were clearly more impaired than ASD patients with above average IQs, their deficits relative to IQ-matched controls were less pronounced.

A computer model (CONDUCT) has been developed that simulates corps and subordinate command, control, communications, and intelligence (C3I) functions, with particular emphasis on the integration of the new generation of intelligence, surveillance, and target-acquisition systems within the developing 1982 and 1986 force structures. CONDUCT is an event-by-event simulation model written in GPSS-V (General Purpose Simulation System), representing the combat and combat support command/staff elements and communications nodes/nets for the operations and intelligence functions within a type corps. Maneuver and engineer units are represented to platoon level, artillery units to battery level, and target-acquisition and Combat Electronics Warfare Intelligence (CEWI) units to sensor team level. Major command posts and operations centers are subdivided into their primary functional areas. Also given are results from the initial 16-hour combat simulation.

This study examined the relationship between sensory processing difficulties, parental stress, and behavioral problems in a clinical sample of young children with developmental and behavioral difficulties. We hypothesized that a high rate of sensory processing difficulties would be found, that there would be a high rate of comorbidity between sensory processing difficulties and behavioral problems, and that children’s sensory processing difficulties and parental stress would be highly correlated. Parents of 59 children ages two to five who attended an out-patient clinic in a low income, urban community completed the Child Behavior Checklist, Parental Stress Inventory-Short Form and the Short Sensory Profile. Children in this clinical population showed a high prevalence (55.9%) of sensory processing difficulties, a significantly higher rate than previously reported. Sensory processing deficits were correlated with behavioral difficulties and parental stress levels-suggesting that as sensory processing difficulties increase, so do behavioral difficulties and parental stress. Parents of children with sensory processing deficits had significantly higher levels of parental stress than parents of children without sensory deficits. Parenting stress levels were also clinically elevated for the cohort of children in which sensory processing difficulties and behavioral concerns co-existed. These findings suggest that treatment outcomes might improve and parental stress could be reduced if mental health clinicians were trained to identify and address sensory problems. This could result in more children being screened and treated for sensory processing difficulties and an eventual reduction in the rates of parental stress. PMID:24443636

It has long been under discussion how the teaching and learning environment should be arranged, how individuals achieve learning, and how teachers can effectively contribute to this process. Accordingly, a considerable number of theories and models have been proposed. Gardner (1983) caused a remarkable shift in the perception of learning theory as…

AU/ACSC/9694/2008-09, Air Command and Staff College, Air University, April 2009. BIRDS OF A FEATHER: Moving Towards a Joint Acquisition Process to Support the...

Recent advances in three research traditions are summarized: trait theories of intelligence, information-processing theories of intelligence, and general theories of thinking. Work on fluid and crystallized abilities by J. Horn and R. Snow, mental speed, spatial visualization, cognitive psychology, artificial intelligence, and the construct of…

This book is concerned with reinventing the learning process from a multiple intelligences perspective and urges explicitly teaching students about multiple intelligences to further their metacognitive understanding. The multiple-intelligence-based curriculum is intended to interface with the regular academic curriculum. An introductory chapter…

This study reports the profiling of volatile compounds generated during microwave-assisted chemical pre-treatment of sorghum leaves. Compounds including acetic acid (0-186.26 ng/g SL), furfural (0-240.80 ng/g SL), 5-hydroxymethylfurfural (HMF) (0-19.20 ng/g SL) and phenol (0-7.76 ng/g SL) were detected. The reducing sugar production was optimized. An intelligent model based on Artificial Neural Networks (ANNs) was developed and validated to predict a profile of 21 volatile compounds under novel pre-treatment conditions. This model gave R² values of up to 0.93. Knowledge extraction revealed that furfural and phenol exhibited high sensitivity to acid and alkali concentration and the S:L ratio, while phenol showed high sensitivity to microwave duration and intensity. Furthermore, furfural production was mainly dependent on acid concentration and fit a dosage-response relationship model with a 2.5% HCl threshold. Significant non-linearities were observed between pre-treatment conditions and the profile of various compounds. This tool reduces analytical costs through virtual analytical instrumentation, improving process economics.
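
As a sketch of the kind of multi-output ANN model described above (the data, inputs, and network size below are synthetic and illustrative, not the study's measurements), one could fit a small feed-forward regressor as follows:

```python
# Illustrative multi-output regression: pre-treatment conditions -> profile of
# 21 volatile compounds.  Data are synthetic; the architecture is a guess.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
X = rng.random((200, 5))                            # e.g. acid %, alkali %, S:L ratio, time, power
W = rng.random((5, 21))
y = X @ W + 0.05 * rng.standard_normal((200, 21))   # 21 synthetic compound responses

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=3000, random_state=0)
model.fit(X_tr, y_tr)
print("R^2 on held-out data:", round(model.score(X_te, y_te), 2))
```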

Auditory processing disorders may have detrimental consequences on a child's life, if undiagnosed and untreated. We review causes of auditory processing disorders in order to raise clinical awareness. Auditory processing disorders may present against a background of neurological disease or developmental disorders, as well as in isolation. Clinicians need to be aware of potential causes and implications of auditory processing disorders. PMID:11668093

The general principles of artificial intelligence are reviewed and speculations are made concerning how knowledge based systems can accelerate the process of acquiring new knowledge in aerodynamics, how computational fluid dynamics may use expert systems, and how expert systems may speed the design and development process. In addition, the anatomy of an idealized expert system called AERODYNAMICIST is discussed. Resource requirements for using artificial intelligence in computational fluid dynamics and aerodynamics are examined. Three main conclusions are presented. First, there are two related aspects of computational aerodynamics: reasoning and calculating. Second, a substantial portion of reasoning can be achieved with artificial intelligence. It offers the opportunity of using computers as reasoning machines to set the stage for efficient calculating. Third, expert systems are likely to be new assets of institutions involved in aeronautics for various tasks of computational aerodynamics.

such as sunspots, solar flares, coronal holes, and the resulting processes within the ionosphere, these effects can occur at all locations on earth... intense during solar maximum. The effect of disrupting a Global Positioning System signal is even more intermittent and more short-lived than that... though the implications of the space weather effects may differ. Also, the Department of Defense Office of the Space Architect recently began a study

Students who view intelligence as malleable tend to be more academically motivated and perform at higher levels than students who view it as a fixed trait. We examined the beliefs of students from Malawi because the culture and schooling process in this country is very different from some other areas of the world in which students' views of intelligence have already been studied. Our research questions were: (1) How do Malawian students define intelligence? (2) To what extent do Malawian students view intelligence as malleable? (3) Are Malawian students' definitions of intelligence and beliefs about the malleability of intelligence similar to those of students in more developed countries? We conducted a mixed methods study and surveyed 136 students attending a secondary school in Malawi using a 39-item questionnaire. Students responded to questions about their intelligence beliefs on open- and closed-ended items. Our results showed that Malawian students believe that an intelligent student exhibits a variety of behaviors, including studying, working hard, reading, performing well on exams and in class, answering and asking questions, paying attention, and demonstrating good behavior. Most students believe that intelligence is malleable and provided responses that indicated that students can become more intelligent through effort. When compared to the findings of other studies, the present results suggest that the Malawian students who remain in secondary school have definitions of intelligence and beliefs about the malleability of intelligence that are similar to those of students in more developed countries, such as the US and Germany. In fact, it appears that Malawian secondary students have even higher malleable beliefs than American and German students. Finally, some of the measures that have been found to produce scores that are reliable and valid in other populations do not produce scores that are as reliable when used with Malawian students.

Transitioning to digital imaging operations in a department of radiology is often difficult for many radiologists, but it is a change that many have made effectively. Transitioning to digital operations in a clinic setting is even more difficult for the referring physician operating a business in the clinic. This paper will discuss our experience with transitioning several off-site clinics to digital imaging operations. We will discuss the process followed to identify the physical equipment required to support clinic operations in a digital imaging environment, the process followed to help the physicians adjust their work patterns to allow them to practice in a digital imaging environment, and the benefits and pitfalls of implementing digital imaging in an off-site clinic. Four off-site clinic locations will be evaluated: (1) a cancer clinic located immediately adjacent to the main hospital that relies heavily on CT and MRI images in its practice; (2) a small clinic located about 60 miles from the main hospital that acquires x-ray images on site; (3) a larger clinic located about 20 miles from the main hospital that acquires x-ray, MRI and CT images on site; and (4) a sports medicine clinic located about 2 miles from the main hospital that acquires x-ray images on site. Each of these clinics has a very different patient clientele and therefore operates differently in nearly all aspects of its daily operations. The physicians' need for and use of film and digital images varies significantly between the sites, and therefore each site has presented different challenges to our implementation process. As we explain the decisions that were made for each of these sites and reveal the methods that were used to help the physicians make the transition, readers should be able to draw information that will be helpful to them as they make their own transition to a digital operation.

This paper presents in detail the component of the Complex Business Process for Stream Processing framework that is responsible for integrating complex business processes to enable knowledge-driven Clinical Decision Support System (CDSS) recommendations. CDSSs aid the clinician in supporting the care of patients by providing accurate data analysis and evidence-based recommendations. However, the incorporation of a dynamic knowledge-management system that supports the definition and enactment of complex business processes and real-time data streams has not been researched. In this paper we discuss the process web service as an innovative method of providing contextual information to a real-time data stream processing CDSS.

The research and development of AI are discussed. Papers are presented on an expert system for chemical process control, an ocean surveillance information fusion expert system, a distributed intelligence system and aircraft pilotage, a procedure for speeding innovation by transferring scientific knowledge more quickly, and syntax programming, expert systems, and real-time fault diagnosis. Consideration is given to an expert system for modeling NASA flight control room usage, simulating aphasia, a method for single neuron recognition of letters, numbers, faces, and certain types of concepts, integrating AI and control system approach, testing an expert system for manufacturing, and the human memory.

This paper reports on a qualitative descriptive study that explored student midwives' experiences of the Objective Structured Clinical Examination (OSCE) assessment process for obstetric emergencies within a university setting. The development of fundamental clinical skills is an important component in preparing students to meet the responsibilities of a midwife. There is an international concern that the transfer of midwifery education into universities may impact on the development of midwifery clinical skills. OSCEs have the potential to promote integration and consolidation of skills prior to clinical placement. Students (n=36) from two midwifery programmes (BSc and Higher Diploma) participated in four focus groups, and Burnard's (2006) framework was used for data analysis. Three main themes emerged following analysis: preparation for the OSCE assessment, the OSCE process, and learning through simulating practice. Preparation for the OSCEs, which included lectures, demonstrations, and practice OSCEs facilitated by lecturers and by the students themselves, was considered central to the process. Learning via OSCEs was perceived to be more effective in comparison to other forms of assessment and prepared students for clinical practice. Positive aspects of the process and areas for improvement were identified. Using OSCEs increased the depth of learning for the students, with the steps taken in preparation for the OSCEs proving to be a valuable learning tool. This study adds to the evidence on the use of OSCEs in midwifery education.

Clinical supervision should be a proactive and considered endeavor, not a reactive one. To that end, supervisors should choose supervision processes that are driven by theory, best available research, and clinical experience. These processes should be aimed at helping trainees develop as clinicians. We highlight 3 supervision processes we believe should be used at each supervision meeting: agenda setting, encouraging trainee problem-solving, and formative feedback. Although these are primarily cognitive-behavioral skills, they can be helpful in combination with other supervision models. We provide example dialogue from supervision exchanges, and discuss theoretical and research support for these processes. Using these processes not only encourages trainee development but also models for them how to use the same processes and approaches with clients.

Discusses the need for education programs for competitive intelligence professionals. Highlights include definitions of intelligence functions, focusing on business intelligence; information utilization by decision makers; information sources; competencies for intelligence professionals; and the development of formal education programs. (38…

Purpose is an internal compass that integrates engagement in activities that affect others, self-awareness of one's reasons, and the intention to continue these activities. We argue that purpose represents giftedness in intrapersonal intelligence, which processes information related to self, identity, self-regulation, and one's place in the world.…

Many researchers are attempting to develop automated instructional development systems to guide subject matter experts through the lengthy and difficult process of courseware development. Because the targeted users often lack instructional design expertise, a great deal of emphasis has been placed on the use of artificial intelligence (AI) to…

The Haptic Visual Discrimination Test of tactual-visual information processing was administered to 39 first-graders, along with standard intelligence, academic potential, and spatial integration tests. Results revealed consistently significant associations between the importance of parieto-occipital areas for organizing sensory data as well as for…

Artificial intelligence (AI) is the field of scientific inquiry concerned with designing machine systems that can simulate human mental processes. The field draws upon theoretical constructs from a wide variety of disciplines, including mathematics, psychology, linguistics, neurophysiology, computer science, and electronic engineering. Some of the…

Multiple Intelligence Based Teaching (MIBT) applies the multiple intelligence theory in the process of teaching and learning. MIBT explores and develops the intelligence of the students. Also, it teaches the content in a multiple way to the students. The objective of the present study is to find out the effectiveness of multiple intelligence based…

Clinical practice guidelines (CPGs), representing the current best-practice guidelines and recommendations for care, are supported by systematic review and evidence-based research. CPGs provide an effective and efficient approach to caring for patients and improving quality of care. Recently, the National Health Insurance Administration and National Institutes of Health developed CPGs for major diseases in Taiwan. This paper introduces the process that was used to develop one of these CPGs, the Taiwan Chronic Kidney Disease Clinical Guidelines, which was published in 2015. Further, we introduce the general development of published nursing guidelines in Taiwan. These CPGs are expected to initiate various renal-care guidelines and to promote the quality of renal care in the country.

What role, if any, should emotions play in clinical reasoning and decision making? Traditionally, emotions have been excluded from clinical reasoning and decision making, but with recent advances in cognitive neuropsychology they are now considered an important component of them. Today, cognition is thought to be a set of complex processes relying on multiple types of intelligences. The role of mathematical logic (hypothetico-deductive thinking) or verbal linguistic intelligence in cognition, for example, is well documented and accepted; however, the role of emotional intelligence has received less attention, especially because its nature and function are not well understood. In this paper, I argue for the inclusion of emotions in clinical reasoning and decision making. To that end, developments in contemporary cognitive neuropsychology are initially examined and analyzed, followed by a review of the medical literature discussing the role of emotions in clinical practice. Next, a published clinical case is reconstructed and used to illustrate the role that the recognition and regulation of emotions played during a series of clinical consultations, which resulted in a positive medical outcome. The paper's main thesis is that emotions, particularly in terms of emotional intelligence as a practical form of intelligence, afford clinical practitioners a robust cognitive resource for providing quality medical care.

This book on the genetic and environmental influences on intelligence comprises the following papers: "The Structure of Intelligence in Relation to the Nature-Nurture Controversy," R. B. Cattell; "Theory of Intelligence," L. G. Humphreys; "Using Measured Intelligence Intelligently," P. R. Merrifield; "Intelligence: Definition, Theory, and…

Although clinical practice guidelines (CPGs) have been suggested as a means of encapsulating best practice in evidence-based medical treatment, their usage in clinical environments has been disappointing. Criticisms of guideline representations have been that they are predominantly narrative and are difficult to incorporate into clinical information systems. This paper analyses the use of UML process modelling techniques for guideline representation and proposes the automated generation of executable guidelines using XMI. This hybrid UML-XMI approach provides flexible authoring of guideline decision and control structures whilst integrating appropriate data flow. It also uses an open XMI standard interface to allow the use of authoring tools and process control systems from multiple vendors. The paper first surveys CPG modelling formalisms, followed by a brief introduction to process modelling in UML. Furthermore, the modelling of CPGs in UML is presented, leading to a case study of encoding a diabetes mellitus CPG using UML.

The Intelligent Virtual Station (IVS) is enabling the integration of design, training, and operations capabilities into an intelligent virtual station for the International Space Station (ISS). A viewgraph of the IVS Remote Server is presented.

The volume and the complexity of clinical and administrative information make Information and Communication Technologies (ICTs) essential for running and innovating healthcare. This paper describes a project aimed to design, develop and implement a set of organizational models, acknowledged procedures and ICT tools (Mobile & Wireless solutions and Automatic Identification and Data Capture technologies) to improve actual support, safety, reliability and traceability of a specific therapy management (stem cells). The value of the project is to design a solution based on mobile and identification technology in tight collaboration with physicians and the actors involved in the process, to ensure usability and effectiveness in process management.

Discusses the three areas of research and application of artificial intelligence: (1) robotics, (2) natural language processing, and (3) knowledge-based or expert systems. Focuses on what expert systems can do, especially in the area of training. (JOW)

Intelligence has evolved many times independently among vertebrates. Primates, elephants and cetaceans are assumed to be more intelligent than 'lower' mammals, the great apes and humans more than monkeys, and humans more than the great apes. Brain properties assumed to be relevant for intelligence are the (absolute or relative) size of the brain, cortex, prefrontal cortex and degree of encephalization. However, factors that correlate better with intelligence are the number of cortical neurons and conduction velocity, as the basis for information-processing capacity. Humans have more cortical neurons than other mammals, although only marginally more than whales and elephants. The outstanding intelligence of humans appears to result from a combination and enhancement of properties found in non-human primates, such as theory of mind, imitation and language, rather than from 'unique' properties.

Background Forming one’s identity is thought to be the key developmental task of adolescence, but profound changes in personality traits also occur in this period. The negotiation of complex social settings, the creation of an integrated identity, and career choice are major tasks of adolescence. The adolescent, having to make choices for his or her future, has not only to consider his or her own aspirations and interests but also to possess a capacity for exploration and commitment; in fact, career commitments can be considered as a fit between the study or career that is chosen and personal values, skills, and preferences. Methods The objective of the study reported here was to investigate the role of identity on profile of interests; the relation between identity and decisional style; the correlation between identity, aptitudes, interests, and school performance; and the predictive variables to school success. The research involved 417 Italian students who live in Enna, a small city located in Sicily, Italy, aged 16–19 years (197 males and 220 females) in the fourth year (mean =17.2, standard deviation =0.52) and the fifth year (mean =18.2, standard deviation =0.64) of senior secondary school. The research lasted for one school year; the general group of participants consisted of 470 students, and although all participants agreed to be part of the research, there was a dropout rate of 11.28%. They completed the Ego Identity Process Questionnaire to measure their identity development, the Intelligence Structure Test to investigate aptitudes, the Self-Directed Search to value interests, and General Decision Making Style questionnaire to describe their individual decisional style. Results The data showed that high-school performance was positively associated with rational decision-making style and identity diffusion predicted the use of avoidant style. Interests were related to identity exploration; the differentiation of preferences was related to identity

Intelligent data management is the concept of interfacing a user to a database management system with a value-added service that allows a full range of data management operations at a high level of abstraction using human written language. The development of such a system will be based on expert systems and related artificial intelligence technologies, and will allow the capturing of procedural and relational knowledge about data management operations and the support of a user with such knowledge in an on-line, interactive manner. Such a system will have the following capabilities: (1) the ability to construct a model of the user's view of the database, based on the query syntax; (2) the ability to transform English queries and commands into database instructions and processes; (3) the ability to use heuristic knowledge to rapidly prune the data space in search processes; and (4) the ability to use an on-line explanation system to allow the user to understand what the system is doing and why it is doing it. Additional information is given in outline form.
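
Capability (2) above, translating English queries into database instructions, is illustrated below with a deliberately naive keyword-to-SQL mapping over an in-memory table; this is only a toy stand-in for the expert-system approach described, and the table, columns, and patterns are hypothetical:

```python
# Toy illustration only: a few keyword patterns mapped onto SQL.  A real system
# would use the expert-system and heuristic techniques described above.
import re
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sensors (name TEXT, status TEXT, temperature REAL)")
conn.executemany("INSERT INTO sensors VALUES (?, ?, ?)",
                 [("tc-01", "ok", 20.4), ("tc-02", "fault", 88.2)])

def translate(question: str) -> str:
    q = question.lower()
    if "how many" in q:
        return "SELECT COUNT(*) FROM sensors"
    m = re.search(r"status is (\w+)", q)
    if m:
        return f"SELECT name FROM sensors WHERE status = '{m.group(1)}'"
    return "SELECT * FROM sensors"

for question in ["How many sensors are there?", "Which sensors' status is fault?"]:
    sql = translate(question)
    print(question, "->", sql, "->", conn.execute(sql).fetchall())
```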

Intelligence for making the work of Richards J. Heuer, Jr. on the psychology of intelligence analysis available to a new generation of intelligence... Psychological research into how people go about generating hypotheses shows that people are actually rather poor at thinking of all the possibilities... generalize from these experiments to conclude that the same biases are prevalent in the Intelligence Community. When psychological experiments

Education policymakers often go astray when they attempt to integrate multiple intelligences theory into schools, according to the originator of the theory, Howard Gardner, and his colleagues. The greatest potential of a multiple intelligences approach to education grows from the concept of a profile of intelligences. Each learner's intelligence…

After brief considerations about intelligence, a comparative study between biological and artificial intelligence is made. The specialists in Artificial Intelligence found that intelligence is purely a matter of physical symbol manipulation. The enterprise of Artificial Intelligence aims to understand what we might call Brain Intelligence in terms of the concepts and techniques of engineering. However, the philosophers believed that a computer-machine can have syntax but can never have semantics; in other words, that it can follow rules, such as those of arithmetic or grammar, but not understand what to us are the meanings of symbols, such as words. In the present paper it is stressed that the brain/mind complex constitutes a monolithic system that functions with emergent properties at several levels of hierarchical organization. These hierarchical levels are non-reducible to one another. They are at least three (neuronal, functional, and semantic), and they function within an interactional plan. The brain/mind complex, which transforms information into meanings, deals with problems by means of both logical and non-logical mechanisms; while logic allows the mind to arrange the elements for reasoning, the non-logical mechanisms (fuzzy logic, heuristics, insights) allow the mind to develop strategies to find solutions. The model for construction of the "intelligent machine" is the operating mode of the brain/mind complex, which does not always use logical processes. The role of information science in Artificial Intelligence is to search for knowledge itself (virtual knowledge), rather than to simply attempt a logico-mathematical formalization of knowledge.

Application of Six-Sigma methodology and Change Acceleration Process (CAP)/Work-Out (WO) tools to track pap smear results in an outpatient clinic in a hospital-based residency-training program. This was an observational study of the impact of changes obtained through application of Six-Sigma principles to a clinic process, with particular attention to prevention of sentinel events. Using cohort analysis and applying Six-Sigma principles to an interactive electronic medical record Soarian workflow engine, we designed a system of timely accession and reporting of pap smear and pathology results. We compared manual processes from January 1, 2007 to February 28, 2008 to automated processes from March 1, 2008 to December 31, 2009. Using the Six-Sigma principles and CAP/WO tools, including "voice of the customer" and a team-focused approach, no outlier events went untracked. Applying the Soarian workflow engine to track the prescribed 7-day turnaround time for completion, we identified 148 pap results in 3,936, 3 non-gynecological results in 15, and 41 surgical results in 246. We applied Six-Sigma principles to an outpatient clinic, facilitating an interdisciplinary team approach to improve the clinic's reporting system. Through focused problem assessment, verification of process, and validation of outcomes, we improved patient care for pap smears and critical pathology.

Example: Language identification from audio signals. In a certain mission, a set of languages seems important beforehand. These languages will – with a... tasks to be performed. • OCR: determine the text parts in an image – language-dependent approach, quality depends on the language. • Steganography

In a dynamic environment, intelligent agents must be responsive to unanticipated conditions. When such conditions occur, an intelligent agent may have to stop a previously planned and scheduled course of actions and replan, reschedule, start new activities and initiate a new problem solving process to successfully respond to the new conditions. Problems occur when an intelligent agent does not have enough knowledge to properly respond to the new situation. DYNACLIPS is an implementation of a framework for dynamic knowledge exchange among intelligent agents. Each intelligent agent is a CLIPS shell and runs as a separate process under the SunOS operating system. Intelligent agents can exchange facts, rules, and CLIPS commands at run time. Knowledge exchange among intelligent agents at run time does not affect execution of either the sender or the receiver intelligent agent. Intelligent agents can keep the knowledge temporarily or permanently. In other words, knowledge exchange among intelligent agents allows a form of learning to be accomplished.
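
The run-time fact-exchange pattern described above can be sketched in Python as two concurrently running "agents" passing facts through queues without blocking each other's processing; this is not CLIPS or DYNACLIPS itself, only an illustration of the message-passing idea with hypothetical facts:

```python
# Minimal analogue of run-time fact exchange between agents; not CLIPS.
import queue
import threading

def agent(name, inbox, outbox, seed_facts):
    facts = set(seed_facts)
    for fact in seed_facts:
        outbox.put((name, fact))                 # share what we already know
    try:
        while True:
            sender, fact = inbox.get(timeout=0.5)
            if fact not in facts:
                facts.add(fact)                  # temporary vs permanent retention is a policy choice
                print(f"{name} learned {fact!r} from {sender}")
    except queue.Empty:
        pass                                     # no more incoming knowledge; stop

a_to_b, b_to_a = queue.Queue(), queue.Queue()
t1 = threading.Thread(target=agent, args=("agent-A", b_to_a, a_to_b, {"valve-3 open"}))
t2 = threading.Thread(target=agent, args=("agent-B", a_to_b, b_to_a, {"pump-1 pressure high"}))
t1.start(); t2.start(); t1.join(); t2.join()
```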

Asperger syndrome (AS) patients show heterogeneous intelligence profiles and the validity of short forms for estimating intelligence has rarely been studied in this population. We analyzed the validity of Wechsler Intelligence Scale (WIS) short forms for estimating full-scale intelligence quotient (FSIQ) and assessing intelligence profiles in 29 AS patients. Only the Information and Block Design dyad meets the study criteria. No statistically significant differences were found between dyad scores and FSIQ scores (t(28) = 1.757; p = 0.09). The dyad has a high correlation with FSIQ, a good percentage of variance explained (R² = 0.591; p < 0.001), and high consistency with the FSIQ classification (χ²(36) = 45.202; p = 0.14). Short forms with good predictive accuracy may not be accurate in clinical groups with atypical cognitive profiles such as AS patients.

New discoveries in basic science are creating extraordinary opportunities to design novel biomedical preventions and therapeutics for human disease. But the clinical evaluation of these new interventions is, in many instances, being hindered by a variety of legal, regulatory, policy and operational factors, few of which enhance research quality, the safety of study participants or research ethics. With the goal of helping increase the efficiency and effectiveness of clinical research, we have examined how the integration of utilization-focused evaluation with elements of business process modeling can reveal opportunities for systematic improvements in clinical research. Using data from the NIH global HIV/AIDS clinical trials networks, we analyzed the absolute and relative times required to traverse defined phases associated with specific activities within the clinical protocol lifecycle. Using simple median duration and Kaplan-Meier survival analysis, we show how such time-based analyses can provide a rationale for the prioritization of research process analysis and re-engineering, as well as a means for statistically assessing the impact of policy modifications, resource utilization, re-engineered processes and best practices. Successfully applied, this approach can help researchers be more efficient in capitalizing on new science to speed the development of improved interventions for human disease. PMID:21552512
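
As a rough illustration of the time-based analysis described above, the sketch below computes a simple median duration and a Kaplan-Meier estimate for a hypothetical protocol phase. It is a minimal sketch, not the study's actual pipeline: the durations, the censoring indicator, and the use of the Python lifelines package are all illustrative assumptions.

    # Minimal sketch (illustrative data): median duration and Kaplan-Meier
    # analysis of a hypothetical "protocol activation" phase.
    import numpy as np
    from lifelines import KaplanMeierFitter

    # Days each protocol spent in the phase; 1 = phase completed, 0 = still open (censored)
    durations = np.array([112, 240, 95, 310, 180, 150, 400, 60])
    completed = np.array([1, 1, 1, 0, 1, 1, 0, 1])

    print("Median duration of completed phases:",
          np.median(durations[completed == 1]), "days")

    kmf = KaplanMeierFitter()
    kmf.fit(durations, event_observed=completed, label="protocol activation")
    print(kmf.median_survival_time_)      # median time, accounting for censoring
    print(kmf.survival_function_.head())  # proportion of protocols still in the phase over time

Comparing such curves before and after a policy change is one way the impact of re-engineered processes could be assessed statistically.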

The concept of Total Quality Management (TQM) has come to be applied in healthcare over the last few years. The process management category in the Baldrige Health Care Criteria for Performance Excellence model is designed to evaluate the quality of medical services. However, a systematic approach for implementation support is necessary to achieve excellence in the healthcare business process. The Architecture of Integrated Information Systems (ARIS) is a business process architecture developed by IDS Scheer AG and has been applied in a variety of industrial applications. It starts with a business strategy to identify the core and support processes, and encompasses the whole life-cycle range, from business process design to information system deployment, which is compatible with the concept of the healthcare performance excellence criteria. In this research, we apply the basic ARIS framework to optimize the clinical processes of an emergency department in a mid-size hospital with 300 clinical beds while considering the characteristics of the healthcare organization. Implementation of the case is described, and 16 months of clinical data are then collected, which are used to study the performance and feasibility of the method. The experience gleaned in this case study can be used as a reference for mid-size hospitals with similar business models.

PURPOSE: Needle guidance software using augmented reality image overlay was translated from the experimental phase to support preclinical and clinical studies. Major functional and structural changes were needed to meet clinical requirements. We present the process applied to fulfill these requirements, and selected features that may be applied in the translational phase of other image-guided surgical navigation systems. METHODS: We used an agile software development process for rapid adaptation to unforeseen clinical requests. The process is based on iterations of operating room test sessions, feedback discussions, and software development sprints. The open-source application framework of 3D Slicer and the NA-MIC kit provided sufficient flexibility and stable software foundations for this work. RESULTS: All requirements were addressed in a process with 19 operating room test iterations. Most features developed in this phase were related to workflow simplification and operator feedback. CONCLUSION: Efficient and affordable modifications were facilitated by an open source application framework and frequent clinical feedback sessions. Results of cadaver experiments show that software requirements were successfully solved after a limited number of operating room tests.

Analyzes the clinical protocol within the rhetorical framework of the drug development and approval process, identifying the constraints under which the protocol is written and the rhetorical form, argumentative strategies, and style needed to improve and teach the writing of this document. (SC)

of clinical trial patients to select patient groups which are likely to respond to treatment (pharmacogenomics), modelling and simulation of preclinical and clinical trials, integration of pharmacokinetic and pharmacodynamic principles into drug development, assessment of the interaction potential (CYP-450, transporter proteins and others), increasing use of biomarkers/surrogate markers for rapid clinical feedback, involvement of the target population as soon as possible, applying statistical data analysis techniques for proving effectiveness, co-operation with high quality centers. To reach this goal clinical pharmacology must be fully integrated in the whole process from the candidate selection to its positioning within the market.

Humans throughout history have always sought to mimic the appearance, mobility, functionality, intelligent operation, and thinking process of biological creatures. This field of biologically inspired technology, which goes by the moniker biomimetics, has evolved from making static copies of humans and animals in the form of statues to the emergence of robots that operate with realistic behavior. Imagine a person walking towards you when suddenly you notice something weird about him--he is not real but rather a robot. Your reaction would probably be "I can't believe it but this robot looks very real," just as you would react to an artificial flower that is a good imitation. You may even proceed to touch the robot to check if your assessment is correct but, as opposed to the flower case, the robot may be programmed to respond physically and verbally. This science fiction scenario could become a reality as the current trend in developing biologically inspired technologies continues. Technology evolution has led to such fields as artificial muscles, artificial intelligence, and artificial vision, as well as biomimetic capabilities in materials science, mechanics, electronics, computing science, information technology and many others. This paper will review the state of the art and challenges of biologically inspired technologies and the role that EAP is expected to play as the technology evolves.

A description of patient conditions should consist of the changes in and combinations of clinical measures. Traditional data-processing methods and classification algorithms can cause clinical information to be lost and reduce prediction performance. To improve the accuracy of clinical-outcome prediction using multiple measurements, a new multiple-time-series data-processing algorithm with period merging is proposed. Clinical data from 83 hepatocellular carcinoma (HCC) patients were used in this research. Their clinical reports from a defined period were merged using the proposed merging algorithm, and statistical measures were also calculated. After data processing, a multiple-measurements support vector machine (MMSVM) with radial basis function (RBF) kernels was used as the classification method to predict HCC recurrence. A multiple-measurements random forest regression (MMRF) was also used as an additional evaluation/classification method. To evaluate the data-merging algorithm, the performance of prediction using processed multiple measurements was compared to prediction using single measurements. The results of recurrence prediction by MMSVM with RBF using multiple measurements and a period of 120 days (accuracy 0.771, balanced accuracy 0.603) were optimal, and their superiority to the results obtained using single measurements was statistically significant (accuracy 0.626, balanced accuracy 0.459, P < 0.01). In the case of MMRF, the prediction results obtained after applying the proposed merging algorithm were also better than the single-measurement results (P < 0.05). The results show that the performance of HCC-recurrence prediction was significantly improved when the proposed data-processing algorithm was used, and that multiple measurements can be of greater value than single measurements.
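
The period-merging idea can be pictured with a small sketch: reports are grouped into fixed-length periods per patient, each period is summarized with simple statistics, and the resulting feature vectors feed an RBF-kernel SVM. This is a simplified stand-in for the paper's algorithm, assuming hypothetical column names, a 120-day window, and toy data.

    # Simplified sketch of period merging followed by an RBF-kernel SVM;
    # column names, the 120-day window, and the summary statistics are
    # illustrative assumptions, not the paper's exact algorithm.
    import pandas as pd
    from sklearn.svm import SVC

    reports = pd.DataFrame({            # one row per clinical report
        "patient": [1, 1, 1, 2, 2],
        "day":     [5, 60, 200, 10, 90],
        "afp":     [12.0, 30.0, 8.0, 400.0, 350.0],
        "alt":     [35, 40, 28, 80, 95],
    })
    labels = pd.Series({1: 0, 2: 1})    # HCC recurrence per patient (hypothetical)

    period = 120                        # merge reports into 120-day periods
    reports["period"] = reports["day"] // period
    merged = (reports.groupby(["patient", "period"])[["afp", "alt"]]
                     .agg(["mean", "min", "max"]))      # per-period statistics
    features = merged.groupby(level="patient").mean()   # one vector per patient
    features.columns = ["_".join(c) for c in features.columns]

    clf = SVC(kernel="rbf", gamma="scale", C=1.0)
    clf.fit(features, labels.loc[features.index])
    print(clf.predict(features))

In the study itself the merging rule and the derived statistical measures are more elaborate; the sketch only shows the overall data flow.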

The Institute of Medicine has targeted patient-centeredness as an important area of quality improvement. A major dimension of patient-centeredness is respect for patient's values, preferences, and expressed needs. Yet specific approaches to gaining this understanding and translating it to quality care in the clinical setting are lacking. From a patient perspective quality is not a simple concept but is best understood in terms of five dimensions: technical outcomes; decision-making efficiency; amenities and convenience; information and emotional support; and overall patient satisfaction. Failure to consider quality from this five-pronged perspective results in a focus on medical outcomes, without considering the processes central to quality from the patient's perspective and vital to achieving good outcomes. In this paper, we argue for applying the concept of fair process in clinical settings. Fair process involves using a collaborative approach to exploring diagnostic issues and treatments with patients, explaining the rationale for decisions, setting expectations about roles and responsibilities, and implementing a core plan and ongoing evaluation. Fair process opens the door to bringing patient expertise into the clinical setting and the work of developing health care goals and strategies. This paper provides a step by step illustration of an innovative visual approach, called photovoice or photo-elicitation, to achieve fair process in clinical work with acquired brain injury survivors and others living with chronic health conditions. Applying this visual tool and methodology in the clinical setting will enhance patient-provider communication; engage patients as partners in identifying challenges, strengths, goals, and strategies; and support evaluation of progress over time. Asking patients to bring visuals of their lives into the clinical interaction can help to illuminate gaps in clinical knowledge, forge better therapeutic relationships with patients living

Japan's government-supported fifth-generation computer project has had a pronounced effect on the American computer and information systems industry. The US firms are intensifying their research on and production of intelligent supercomputers, a combination of computer architecture and artificial intelligence software programs. While the present generation of computers is built for the processing of numbers, the new supercomputers will be designed specifically for the solution of symbolic problems and the use of artificial intelligence software. This article discusses new and exciting developments that will increase computer capabilities in the 1990s. 4 references.

Using a factorial vignette survey and modeling methodology, we developed clinical and information models - incorporating evidence base, key concepts, relevant terms, decision-making and workflow needed to practice safely and effectively - to guide the development of an integrated rule-based knowledge module to support prescribing decisions in asthma. We identified workflows, decision-making factors, factor use, and clinician information requirements. The Unified Modeling Language (UML) and public domain software and knowledge engineering tools (e.g. Protégé) were used, with the Australian GP Data Model as the starting point for expressing information needs. A Web Services service-oriented architecture approach was adopted within which to express functional needs, and clinical processes and workflows were expressed in the Business Process Execution Language (BPEL). This formal analysis and modeling methodology to define and capture the process and logic of prescribing best practice in a reference implementation is fundamental to tackling deficiencies in prescribing decision support software.

The objective of this case report is to evaluate the use of a clinical data warehouse coupled with a clinical information system to test and refine alerts for medication orders control before they were fully implemented. A clinical decision rule refinement process was used to assess alerts. The criteria assessed were the frequencies of alerts for initial prescriptions of 10 medications whose dosage levels depend on renal function thresholds. In the first iteration of the process, the frequency of the 'exceeds maximum daily dose' alerts was 7.10% (617/8692), while that of the 'under dose' alerts was 3.14% (273/8692). Indicators were presented to the experts. During the different iterations of the process, 45 (16.07%) decision rules were removed, 105 (37.5%) were changed and 136 new rules were introduced. Extensive retrospective analysis of physicians' medication orders stored in a clinical data warehouse facilitates alert optimization toward the goal of maximizing the safety of the patient and minimizing overridden alerts.
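
A single rule of the kind being refined might look like the sketch below: a maximum daily dose that depends on a renal-function threshold, checked against the prescribed order. The drug name, thresholds, and doses are invented for illustration only and are not actual prescribing guidance or the report's rule set.

    # Illustrative sketch of one dose-alert decision rule; the drug, thresholds
    # and doses are hypothetical placeholders.
    def max_daily_dose(drug, creatinine_clearance_ml_min):
        """Return the maximum daily dose (mg) allowed for the patient's renal function."""
        rules = {
            # drug: list of (creatinine-clearance lower bound, max daily dose in mg)
            "drug_x": [(60, 3000), (30, 1500), (0, 750)],
        }
        for crcl_floor, dose in rules[drug]:
            if creatinine_clearance_ml_min >= crcl_floor:
                return dose

    def check_order(drug, prescribed_daily_dose_mg, creatinine_clearance_ml_min):
        limit = max_daily_dose(drug, creatinine_clearance_ml_min)
        if prescribed_daily_dose_mg > limit:
            return f"ALERT: exceeds maximum daily dose ({prescribed_daily_dose_mg} > {limit} mg)"
        return "no alert"

    print(check_order("drug_x", 2000, creatinine_clearance_ml_min=25))

Replaying rules like this against historical orders in a clinical data warehouse is what allows alert frequencies to be measured and the rules tuned before go-live.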

Collaboration between providers and researchers can be key to doing women's HIV prevention that is holistic, gender sensitive, and responsive to communities. This report centers on providers' and evaluators' experiences in developing and implementing a project promoting "healthy relationships" with low-income women from different ethnicities at an urban American Indian clinic. During planning, decisions on the health problems to be targeted, division of labor, program goals, resource allocation, evaluation design, and outcome measures were jointly made. Other factors were the input of participants and the influence of American Indian values at the clinic. The implementation process was fully collaborative. There are implications for creating conditions for successful collaborations in health education.

Owing to higher performance on the Raven's Progressive Matrices (RPM) than on the Wechsler Intelligence Scales (WIS), it has recently been argued that intelligence is underestimated in autism. This study examined RPM and WIS IQs in 48 individuals with autism, a mixed clinical (n = 28) and a neurotypical (n = 25) control group. Average RPM IQ was…

In order for Business Intelligence to truly move beyond where it is today, a shift in approach must occur. Currently, much of what is accomplished in the realm of Business Intelligence relies on reports and dashboards to summarize and deliver information to end users. As we move into the future, we need to get beyond these reports and dashboards to a point where we break out the individual metrics that are embedded in these reports and interact with these components independently. Breaking these pieces of information out of the confines of reports and dashboards will allow them to be dynamically assembled for delivery in the way that makes most sense to each consumer. With this change in ideology, Business Intelligence will move from the concept of collections of objects, or reports and dashboards, to individual objects, or information components. The Next Generation Business Intelligence suite will translate concepts popularized in Facebook, Flickr, and Digg into enterprise worthy communication vehicles.

There is growing appreciation that process improvement holds promise for improving the quality and efficiency across the translational research continuum but frameworks for such programs are not often described. The purpose of this paper is to present a framework and case examples of a Research Process Improvement Program implemented at Tufts CTSI. To promote research process improvement, we developed online training seminars, workshops, and in-person consultation models to describe core process improvement principles and methods, demonstrate the use of improvement tools, and illustrate the application of these methods in case examples. We implemented these methods, as well as relational coordination theory, with junior researchers, pilot funding awardees, our CTRC, and CTSI resource and service providers. The program focuses on capacity building to address common process problems and quality gaps that threaten the efficient, timely and successful completion of clinical and translational studies. PMID:26332869

Vital to the success of an expert system is an interface to the user which performs intelligently. A generic intelligent interface is being developed for expert systems. This intelligent interface was developed around the in-house developed Expert System for the Flight Analysis System (ESFAS). The Flight Analysis System (FAS) is comprised of 84 configuration-controlled FORTRAN subroutines that are used in the preflight analysis of the space shuttle. In order to use FAS proficiently, a person must be knowledgeable in flight mechanics and the procedures involved in deploying a certain payload, and must have an overall understanding of the FAS. ESFAS, still in its developmental stage, is taking into account much of this knowledge. The generic intelligent interface involves the integration of a speech recognizer and synthesizer, a preparser, and a natural language parser with ESFAS. The speech recognizer being used is capable of recognizing 1000 words of connected speech. The natural language parser is a commercial software package which uses caseframe instantiation in processing the streams of words from the speech recognizer or the keyboard. The system's configuration is described, along with its capabilities and drawbacks.

The motivation behind an advanced technology program to develop intelligent power management and distribution (PMAD) systems is described. The program concentrates on developing digital control and distributed processing algorithms for PMAD components and systems to improve their size, weight, efficiency, and reliability. Specific areas of research in developing intelligent DC-DC converters and distributed switchgear are described. Results from recent development efforts are presented along with expected future benefits to the overall PMAD system performance.

As a part of the overall goal of developing Integrated Vehicle Health Management systems for aerospace vehicles, the NASA Faculty Fellowship Program (NFFP) at Marshall Space Flight Center has performed a pilot study on IVHM principles which integrates researched IVHM technologies in support of Integrated Intelligent Vehicle Management (IIVM). IVHM is the process of assessing, preserving, and restoring system functionality across flight and ground systems (NASA NGLT 2004). The framework presented in this paper integrates advanced computational techniques with sensor and communication technologies for spacecraft that can generate responses through detection, diagnosis, and reasoning, and adapt to system faults in support of IIVM. These real-time responses allow the IIVM to modify the affected vehicle subsystem(s) prior to a catastrophic event. Furthermore, the objective of this pilot program is to develop and integrate technologies which can provide a continuous, intelligent, and adaptive health state of a vehicle and use this information to improve safety and reduce costs of operations. Recent investments in avionics, health management, and controls have been directed towards IIVM. As this concept has matured, it has become clear that IIVM requires the same sensors and processing capabilities as the real-time avionics functions to support diagnosis of subsystem problems. In addition, new sensors have been proposed to augment the avionics sensors to support better system monitoring and diagnostics. As the designs have been considered, a synergy has been realized where the real-time avionics can utilize sensors proposed for diagnostics and prognostics to make better real-time decisions in response to detected failures. IIVM provides for a single system allowing modularity of functions and hardware across the vehicle. The framework that supports IIVM consists of 11 major on-board functions necessary to fully manage a space vehicle maintaining crew safety and mission

In April 2012, the National Institutes of Health organized a two-day workshop entitled 'Natural Language Processing: State of the Art, Future Directions and Applications for Enhancing Clinical Decision-Making' (NLP-CDS). This report is a summary of the discussions during the second day of the workshop. Collectively, the workshop presenters and participants emphasized the need for unstructured clinical notes to be included in the decision making workflow and the need for individualized longitudinal data tracking. The workshop also discussed the need to: (1) combine evidence-based literature and patient records with machine-learning and prediction models; (2) provide trusted and reproducible clinical advice; (3) prioritize evidence and test results; and (4) engage healthcare professionals, caregivers, and patients. The overall consensus of the NLP-CDS workshop was that there are promising opportunities for NLP and CDS to deliver cognitive support for healthcare professionals, caregivers, and patients.

When a spatiotemporal event happens, multi-source intelligence data is gathered to understand the problem, and strategies for solving the problem are investigated. The difficulties arising from handling spatial and temporal intelligence data represent the main problem. The map can be the bridge to visualize the data and to arrive at the most understandable model for all stakeholders. For the analysis of geodata-based intelligence data, software was developed as a working environment that combines geodata with optimized ergonomics. Interaction with the common operational picture (COP) is thereby substantially facilitated. The composition of the COP is based on geodata services, which are normalized by international standards of the Open Geospatial Consortium (OGC). The basic geodata are combined with intelligence data from images (IMINT) and humans (HUMINT), stored in a NATO Coalition Shared Data Server (CSD). These intelligence data can be combined with further information sources, i.e., live sensors. As a result, a COP is generated and an interaction suitable for the specific workspace is added. This allows the users to work interactively with the COP, i.e., searching with an on-board CSD client for suitable intelligence data and integrating them into the COP. Furthermore, users can enrich the scenario with findings from the data of interactive live sensors and add data from other sources. This allows intelligence services to contribute effectively to the processes by which military and disaster management operations are organized.
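
To make the composition of the COP from standardized geodata services concrete, the sketch below retrieves one map layer from an OGC Web Map Service. The server URL and layer name are placeholders, and owslib is only one possible client library; the actual system composes many such services together with CSD-held IMINT/HUMINT reports.

    # Sketch of retrieving one OGC-standardized map layer for a common operational
    # picture; the server URL and layer name are placeholders.
    from owslib.wms import WebMapService

    wms = WebMapService("https://example.org/geoserver/wms", version="1.3.0")
    print(list(wms.contents))               # layers advertised by the service

    img = wms.getmap(
        layers=["base:topographic"],        # hypothetical layer name
        srs="EPSG:4326",
        bbox=(7.0, 49.0, 8.0, 50.0),        # lon/lat bounding box of the area of interest
        size=(1024, 768),
        format="image/png",
        transparent=True,
    )
    with open("cop_base_layer.png", "wb") as f:
        f.write(img.read())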

Intelligence analysis plays a vital role in policy decision making. Key functions of intelligence analysis include accurately forecasting significant events, appropriately characterizing the uncertainties inherent in such forecasts, and effectively communicating those probabilistic forecasts to stakeholders. We review decision research on probabilistic forecasting and uncertainty communication, drawing attention to findings that could be used to reform intelligence processes and contribute to more effective intelligence oversight. We recommend that the intelligence community (IC) regularly and quantitatively monitor its forecasting accuracy to better understand how well it is achieving its functions. We also recommend that the IC use decision science to improve these functions (namely, forecasting and communication of intelligence estimates made under conditions of uncertainty). In the case of forecasting, decision research offers suggestions for improvement that involve interventions on data (e.g., transforming forecasts to debias them) and behavior (e.g., via selection, training, and effective team structuring). In the case of uncertainty communication, the literature suggests that current intelligence procedures, which emphasize the use of verbal probabilities, are ineffective. The IC should, therefore, leverage research that points to ways in which verbal probability use may be improved as well as exploring the use of numerical probabilities wherever feasible.
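
One concrete example of the data-level intervention mentioned above (transforming forecasts to debias them) is extremizing an aggregated probability forecast, a transform discussed in the forecasting literature. The sketch is illustrative only; the exponent would in practice be calibrated against historical accuracy data, and the review does not prescribe this particular transform.

    # Sketch of one data-level debiasing transform: extremizing an aggregated
    # probability forecast. The exponent a is illustrative; in practice it would
    # be fit to historical accuracy data.
    def extremize(p, a=2.0):
        """Push an aggregated probability away from 0.5 to counteract under-confidence."""
        return p**a / (p**a + (1.0 - p)**a)

    analyst_forecasts = [0.6, 0.7, 0.65, 0.7]            # hypothetical probabilities
    pooled = sum(analyst_forecasts) / len(analyst_forecasts)
    print(pooled, "->", extremize(pooled))               # pooled ~0.66, extremized ~0.79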

Sleep spindles are thalamocortical oscillations in non-rapid eye movement (NREM) sleep that play an important role in sleep-related neuroplasticity and offline information processing. Several studies with full-night sleep recordings have reported a positive association between sleep spindles and fluid intelligence scores; however, more recently it has been shown that only a few sleep spindle measures correlate with intelligence in females, and none in males. Sleep spindle regulation follows a circadian rhythm, yet the association between spindles and intelligence has not been investigated in daytime nap sleep so far. In a sample of 86 healthy male human subjects, we investigated the correlation between fluid intelligence and sleep spindle parameters in an afternoon nap of 100 minutes. Mean sleep spindle length, amplitude and density were computed for each subject and for each derivation for both slow and fast spindles. A positive association was found between intelligence and slow spindle duration, but not any other sleep spindle parameter. As a positive correlation between intelligence and slow sleep spindle duration in full-night polysomnography has only been reported in females but not males, our results suggest that the association between intelligence and sleep spindles is more complex than previously assumed.

Background: Intelligibility is a speaker's ability to convey a message to a listener. Including an assessment of intelligibility is essential in both research and clinical work relating to individuals with communication disorders due to speech impairment. Assessment of the intelligibility of spontaneous speech can be used as an overall…

Nursing students, particularly at the time of entering clinical education, experience a great deal of stress and emotion, typically related to their educational and clinical competence. Emotional intelligence is known to be one of the skills required to cope effectively with such feelings. The aim of this study was to investigate the effect of training on first-year nursing students' levels of emotional intelligence. This was a quasi-experimental study in which 69 first-year nursing students affiliated with Tehran University of Medical Sciences were assigned to either the control or the experimental group. The study intervention consisted of an emotional intelligence educational program offered in eight two-hour sessions over eight consecutive weeks. In total, 66 students completed the study. The study groups did not differ significantly in terms of emotional intelligence scores before and after the educational program. Although the educational program did not affect students' emotional intelligence scores, this finding can be explained. Limited time for exercising the acquired knowledge and skills may explain the non-significant findings. Moreover, our participants were exclusively first-year students who had no clinical experience and hence might have felt no real need to learn emotional intelligence skills.

We think because we eat. Or as Descartes might have said, on a little more reflection, "I need to eat, therefore I think." Animals that forage for a living repeatedly face the problem of searching for a sparsely distributed resource in a vast space. Furthermore, the resource may occur sporadically and episodically under conditions of true uncertainty (nonstationary, complex and non-linear dynamics). I assert that this problem is the canonical problem solved by intelligence. Its solution is the basis for the evolution of more advanced intelligence, in which the space of search includes that of concepts (objects and relations) encoded in cortical structures. In humans, the conscious experience of searching through concept space is what we call thinking. The foraging search model is based upon a higher-order autopoietic system (the forager) employing anticipatory processing to enhance its success at finding food while avoiding becoming food or having accidents in a hostile world. I present a semi-formal description of the general foraging search problem and an approach to its solution. The latter is a brain-like structure employing dynamically adaptive neurons. A physical robot, MAVRIC, embodies some principles of foraging. It learns cues that lead to improvements in finding targets in a dynamic and nonstationary environment. This capability is based on a unique learning mechanism that encodes causal relations in the neural-like processing element. An argument is advanced that searching for resources in the physical world, as per the foraging model, is a prototype for generalized search for conceptual resources, as when we think. A problem represents a conceptual disturbance in a homeostatic sense. The finding of a solution restores the homeostatic balance. The establishment of links between conceptual cues and solutions (resources) and the later use of those cues to think through to solutions of quasi-isomorphic problems is, essentially, foraging for ideas. It is a quite

Objective: To demonstrate how the findings of surface electromyography (S.E.M.G.) were integrated into the clinical decision-making process. Clinical Features: This is a retrospective review of the file of a 27-year-old male suffering from mechanical low back pain. He was evaluated on 3 separate occasions over a 3 year period. History, radiography, functional outcome studies, visual-numerical pain score, pain drawing, physical examination and surface electromyography were utilized in evaluating this patient. Intervention and Outcome: The two clinical interventions of spinal manipulative therapy (S.M.T.) had positive results in that the patient achieved an asymptomatic state and returned to his position of employment. The S.E.M.G. data collected during the industrial assessment, did not provide the outcome that the patient had anticipated. Conclusion: Surface electromyography is a useful clinical tool in the author’s decision-making process for the treatment of mechanical lower back pain. Therapeutic intervention by S.M.T., therapeutic exercises and rating risk factors were influenced by the S.E.M.G. findings.

Due to the progress in genetic research, the development and rapid introduction of new genetic tests into clinical practice can be expected. This raises many ethical issues which need to be carefully considered. First, genetic information is familial; thus, the test result of one person may have direct health implications for others who are genetically related. Second, the risks of genetic testing are also psychological, social and financial. Third, due to the complex ways in which genes interact, genetic information often has limited predictive power. Finally, many genetic conditions remain difficult to treat or prevent, meaning the value of genetic information may be limited for altering the clinical care of the person. Given these concerns, detailed counselling and informed consent should be key aspects of the genetic testing process. Genetic counselling in the Czech Republic is provided by a clinical geneticist, who therefore plays a key role in addressing these issues with patients. His second role is to interpret the genetic information revealed in genetic testing into language understandable to the patient, which means translating genetic data into a diagnosis and the clinical management of the individual, a transformation from statistics to physical persons. This interpretation determines many aspects of the patient's future life (future planning, reproductive decisions, prevention, health behaviour, etc.) and also family attitudes towards testing. The importance of genetic counselling, the informed consent process and precise interpretation of results will increase over time as a new generation of genetic technologies for detecting common conditions is introduced into practice.

This paper discusses the development of business intelligence in light of the development of data mining. Business intelligence plays an important role in producing up-to-date information for operative and strategic decision-making. We propose a new kind of knowledge, named intelligent knowledge, derived from data. We illustrate a way to combine business intelligence with intelligent knowledge and propose an approach to the management of intelligent knowledge that is more structured than traditional knowledge management.

New investigations of the foundations of artificial intelligence are challenging the hypothesis that problem solving is the cornerstone of intelligence. New distinctions among three domains of concern for humans--description, action, and commitment--have revealed that the design process for programmable machines, such as expert systems, is based on descriptions of actions and induces blindness to nonanalytic action and commitment. Design processes focusing on the domain of description are likely to yield programs like bureaucracies: rigid, obtuse, impersonal, and unable to adapt to changing circumstances. Systems that learn from their past actions, and systems that organize information for interpretation by human experts, are more likely to be successful in areas where expert systems have failed.

This book presents an up-to-date study of the interaction between the fast-growing discipline of artificial intelligence and other human endeavors. The volume explores the scope and limitations of computing, and presents a history of the debate on the possibility of machines achieving intelligence. The authors offer a state-of-the-art survey of AI, concentrating on the "mind" (language understanding) and the "body" (robotics) of intelligent computing systems.

A new set of 174 pictures in black-and-white, coloured and spatially filtered versions, taken from photographs of real objects belonging to different semantic categories, was realised for experimental and clinical research on visual object processing. Two samples, one of English speakers and one of Italian speakers, were tested in order to provide the normative data for each picture, in both black-and-white and coloured versions, in relation to familiarity, visual complexity and name agreement.

Our multi-disciplinary neurology team were dissatisfied with long access times for consultation for new referrals. We participated in a rapid process improvement workshop and a structured improvement process. Over a six-month period we were able to reduce our access time for initial appointment for patients with suspected movement disorders from 133 to 20 days. We implemented a ‘carousel’ multi-disciplinary appointment and a standardised clinic form that improved the flow of patients and that we estimate will save 150 hours of physician time and 320 hours of administrative time per year. PMID:26734164

Summary Introduction: The prevalence of asthma has grown considerably in recent decades, but some studies have shown stabilization of this trend. The masticatory process of asthmatic children may be altered due to asthma-related anatomo-functional changes. Objective: The study objective was to determine the clinical and electromyographic characteristics of the masticatory process in asthmatic children and compare the electrical activities of their masseter and anterior temporal muscles (at rest and during maximal voluntary contraction and mastication) with those of non-asthmatic children. Method: Case study. Asthmatic and non-asthmatic groups, each consisting of 30 children of both sexes between 6 and 10 years of age, were evaluated. Mastication was evaluated clinically and electromyographically in all subjects. Results: The masticatory process did not differ significantly between asthmatic and non-asthmatic children. Conclusion: Although the masticatory process did not differ significantly between asthmatic and non-asthmatic children, it may nevertheless be altered by the anatomical changes associated with asthma. PMID:25991958

Clinical supervision (CS) has been identified within nursing as a process for improving clinical practice and reducing the emotional burden of nursing practice. Little is known about its implementation across large tertiary referral hospitals. The purpose of this study is to evaluate the implementation of clinical supervision across several different nursing specialities at a teaching hospital in Sydney, Australia. Using a model of nursing implementation science, a process was developed at the study site that facilitated the development, implementation and evaluation of the project. After a 6-month study period, the CS groups were evaluated using a survey tool developed for the project. A total of nine CS groups were in operation over the 6-month study period. A predominant focus within the sessions was collegial support and developing standards of practice. The process achieved wide hospital-based support for the role of CS, from senior nurse executives to junior nurses. Whilst there was overall positive support for the CS groups, logistical and resource challenges remain in the effective roll-out of CS to large numbers of nurses.

Therapies targeted at fundamental processes of aging may hold great promise for enhancing the health of a wide population by delaying or preventing a range of age-related diseases and conditions—a concept dubbed the “geroscience hypothesis.” Early, proof-of-concept clinical trials will be a key step in the translation of therapies emerging from model organism and preclinical studies into clinical practice. This article summarizes the outcomes of an international meeting partly funded through the NIH R24 Geroscience Network, whose purpose was to generate concepts and frameworks for early, proof-of-concept clinical trials for therapeutic interventions that target fundamental processes of aging. The goals of proof-of-concept trials include generating preliminary signals of efficacy in an aging-related disease or outcome that will reduce the risk of conducting larger trials, contributing data and biological samples to support larger-scale research by strategic networks, and furthering a dialogue with regulatory agencies on appropriate registration indications. We describe three frameworks for proof-of-concept trials that target age-related chronic diseases, geriatric syndromes, or resilience to stressors. We propose strategic infrastructure and shared resources that could accelerate development of therapies that target fundamental aging processes. PMID:27535966

Implications of current understandings of the nature of human intelligence for the possibility of extraterrestrial intelligence are discussed. The perceptual theory of intelligence, as the manipulation of perceptual images rather than language, is introduced, and conditions leading to the ascendancy of man over other hominids with similar conceptual abilities are discussed, including the liberation of the hands from a locomotive function and the evolution of neoteny. It is argued that the specificity of the environmental, behavioral and physiological conditions which led to the emergence of technologically oriented and communicative intelligent creatures suggests that any SETI would most likely be fruitless.

Demographers debate why people have children in advanced industrial societies where children are net economic costs. From an evolutionary perspective, however, the important question is why some individuals choose not to have children. Recent theoretical developments in evolutionary psychology suggest that more intelligent individuals may be more likely to prefer to remain childless than less intelligent individuals. Analyses of the National Child Development Study show that more intelligent men and women express a preference to remain childless early in their reproductive careers, but only more intelligent women (not more intelligent men) are more likely to remain childless by the end of their reproductive careers. Controlling for education and earnings does not at all attenuate the association between childhood general intelligence and lifetime childlessness among women. A one-standard-deviation increase in childhood general intelligence (15 IQ points) decreases women's odds of parenthood by 21-25%. Because women have a greater impact on the average intelligence of future generations, the dysgenic fertility among women is predicted to lead to a decline in the average intelligence of the population in advanced industrial nations.

The need for intelligent sensors as a critical component for Integrated System Health Management (ISHM) is fairly well recognized by now. Even the definition of what constitutes an intelligent sensor (or smart sensor) is well documented and stems from an intuitive desire to get the best quality measurement data that forms the basis of any complex health monitoring and/or management system. If the sensors, i.e. the elements closest to the measurand, are unreliable then the whole system works with a tremendous handicap. Hence, there has always been a desire to distribute intelligence down to the sensor level, and give it the ability to assess its own health, thereby improving confidence in the quality of the data at all times. This paper proposes the development of intelligent sensors as an integrated systems approach, i.e. one treats the sensors as a complete system with its own sensing hardware (the traditional sensor), A/D converters, processing and storage capabilities, software drivers, self-assessment algorithms, communication protocols and evolutionary methodologies that allow them to get better with time. Under a project being undertaken at the NASA Stennis Space Center, an integrated framework is being developed for the intelligent monitoring of smart elements. These smart elements can be sensors, actuators or other devices. The immediate application is the monitoring of the rocket test stands, but the technology should be generally applicable to the Integrated System Health Management (ISHM) vision. This paper outlines some fundamental issues in the development of intelligent sensors under the following two categories: Physical Intelligent Sensors (PIS) and Virtual Intelligent Sensors (VIS).

... of the Secretary Defense Intelligence Agency National Intelligence University Board of Visitors Closed Meeting AGENCY: Department of Defense, Defense Intelligence Agency, National Intelligence... a closed meeting of the Defense Intelligence Agency National Intelligence University Board...

Cognitive neuroscience has made considerable progress in understanding the neural architecture of human intelligence, identifying a broadly distributed network of frontal and parietal regions that support goal-directed, intelligent behavior. However, the contributions of this network to social and emotional aspects of intellectual function remain to be well characterized. Here we investigated the neural basis of emotional intelligence in 152 patients with focal brain injuries using voxel-based lesion-symptom mapping. Latent variable modeling was applied to obtain measures of emotional intelligence, general intelligence and personality from the Mayer, Salovey, Caruso Emotional Intelligence Test (MSCEIT), the Wechsler Adult Intelligence Scale and the Neuroticism-Extroversion-Openness Inventory, respectively. Regression analyses revealed that latent scores for measures of general intelligence and personality reliably predicted latent scores for emotional intelligence. Lesion mapping results further indicated that these convergent processes depend on a shared network of frontal, temporal and parietal brain regions. The results support an integrative framework for understanding the architecture of executive, social and emotional processes and make specific recommendations for the interpretation and application of the MSCEIT to the study of emotional intelligence in health and disease.

This study examined the influence of psychopathy and intelligence on malingering in a simulated malingering design. We hypothesized that participants high in both traits would be more adept at evading detection on performance validity tests (PVTs). College students (N = 92) were first administered the Wechsler Test of Adult Reading, a reading measure that estimates intelligence, and the Psychopathic Personality Inventory-Short Form under standard conditions. They were then asked to imagine as if they had suffered a concussion a year ago and were instructed to fake or exaggerate symptoms in a believable fashion to improve their settlement as part of a lawsuit. Participants were subsequently administered a brief neuropsychological battery that included the Word Memory Test, Rey 15-Item Test with Recognition, Finger-Tapping Test, and Digit Span from the Wechsler Adult Intelligence Scale-Fourth Edition. Moderated multiple regressions with hierarchical entry were conducted. Intelligence, psychopathy, and the interaction of intelligence and psychopathy were not related to performance on any of the PVTs. In other words, participants who scored higher on intelligence and psychopathy did not perform differently on these measures compared with other participants. Though a null finding, implications of this study are discussed in terms of the broader research and clinical literature on malingering.
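
For readers unfamiliar with the analysis strategy named above, the sketch below shows a moderated multiple regression with hierarchical entry on simulated placeholder data: the centered predictors are entered first, and their interaction is added in a second step, with the change in explained variance indicating moderation. Variable names and data are hypothetical, not the study's scores.

    # Sketch of moderated multiple regression with hierarchical entry.
    # Data are simulated placeholders, not the study's scores.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 92
    iq = rng.normal(100, 15, n)                 # estimated intelligence
    psychopathy = rng.normal(0, 1, n)           # psychopathy score
    pvt = rng.normal(30, 5, n)                  # performance validity test score

    iq_c, psy_c = iq - iq.mean(), psychopathy - psychopathy.mean()   # center predictors

    step1 = sm.OLS(pvt, sm.add_constant(np.column_stack([iq_c, psy_c]))).fit()
    step2 = sm.OLS(pvt, sm.add_constant(
        np.column_stack([iq_c, psy_c, iq_c * psy_c]))).fit()

    # Increment in explained variance attributable to the interaction term
    print("R2 step 1:", round(step1.rsquared, 3))
    print("R2 step 2:", round(step2.rsquared, 3))
    print("Delta R2 :", round(step2.rsquared - step1.rsquared, 3))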

Today, many would assume that Charles Darwin absolutely rejected any claim of intelligent design in nature. However, review of his initial writings reveals that Darwin accepted some aspects of this view. His conceptualization of design was founded on both the cosmological and the teleological ideas from classical natural theology. When Darwin discovered the dynamic process of natural selection, he rejected the old teleological argument as formulated by William Paley. However, he was never able to ignore the powerful experience of the beauty and complexity of an intelligently designed universe, as a whole. He corresponded with Asa Gray on religious themes, particularly touching the problem of pain and intelligent design in nature. The term "intelligent design" was probably introduced by William Whewell. Principally for theological and philosophical reasons, Darwin could only accept the concept for the universe as a whole, not with respect to individual elements of the living world.

The Test of Emotional Intelligence (TIE) is a new ability scale based on a theoretical model that defines emotional intelligence as a set of skills responsible for the processing of emotion-relevant information. Participants are provided with descriptions of emotional problems, and asked to indicate which emotion is most probable in a given situation, or to suggest the most appropriate action. Scoring is based on the judgments of experts: professional psychotherapists, trainers, and HR specialists. The validation study showed that the TIE is a reliable and valid test, suitable for both scientific research and individual assessment. Its internal consistency measures were as high as .88. In line with theoretical model of emotional intelligence, the results of the TIE shared about 10% of common variance with a general intelligence test, and were independent of major personality dimensions. PMID:25072656
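
If the reported 10% of shared variance is read as a simple bivariate relationship (an assumption, since the abstract does not report the correlation directly), the implied correlation with general intelligence is roughly

    r = \sqrt{R^2} = \sqrt{0.10} \approx 0.32,

a moderate association consistent with emotional intelligence being related to, yet distinct from, general intelligence.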

Provides expertise, and plans, conducts and directs research and engineering development in the competency fields of advanced communications and intelligent systems technologies for applications in current and future aeronautics and space systems. Advances communication systems engineering, development and analysis needed for Glenn Research Center's leadership in communications and intelligent systems technology. Focus areas include advanced high frequency devices, components, and antennas; optical communications, health monitoring and instrumentation; digital signal processing for communications and navigation, and cognitive radios; network architectures, protocols, standards and network-based applications; intelligent controls, dynamics and diagnostics; and smart micro- and nano-sensors and harsh environment electronics. Research and discipline engineering allow for the creation of innovative concepts and designs for aerospace communication systems with reduced size and weight, increased functionality and intelligence. Performs proof-of-concept studies and analyses to assess the impact of the new technologies.

The hypothesis that performance on implicit learning tasks is unrelated to psychometric intelligence was examined in a sample of 605 German pupils. Performance in artificial grammar learning, process control, and serial learning did not correlate with various measures of intelligence when participants were given standard implicit instructions. Under an explicit rule discovery instruction, however, a significant relationship between performance on the learning tasks and intelligence appeared. This finding provides support for Reber's hypothesis that implicit learning, in contrast to explicit learning, is independent of intelligence, and confirms thereby the distinction between the 2 modes of learning. However, because there were virtually no correlations among the 3 learning tasks, the assumption of a unitary ability of implicit learning was not supported.

To be fully effective, combat simulation must include an intelligently interactive enemy... one that can be calibrated. But human-operated combat simulations are uncalibratable, for we learn during the engagement, there is no average enemy, and we cannot replicate the enemy's culture/personality. Rule-based combat simulations (expert systems) are not interactive. They do not take advantage of unexpected mistakes, learn, innovate, or reflect the changing mission/situation. And it is presumed that the enemy does not have a copy of the rules, that the available experts are good enough, that they know why they did what they did, that their combat experience provides a sufficient sample, and that we know how to combine the rules offered by differing experts. Indeed, expert systems become increasingly complex, costly to develop, and brittle. They have face validity but may be misleading. In contrast, intelligently interactive combat simulation is purpose-driven. Each player is given a well-defined mission, reference to the available weapons/platforms, their dynamics, and the sensed environment. Optimal tactics are discovered online and in real time by simulating phenotypic evolution in fast time. The initial behaviors are generated randomly or include hints. The process then learns without instruction. The Valuated State Space Approach provides a convenient way to represent any purpose/mission. Evolutionary programming searches the domain of possible tactics in a highly efficient manner. Coupled together, these provide a basis for cruise missile mission planning and for driving tank warfare simulation. This approach is now being explored to benefit Air Force simulations via a shell that can enhance the original simulation.
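As a rough illustration of the evolutionary-programming loop described above (random initial behaviors, mutation, mission-driven selection), here is a minimal Python sketch. The two-parameter "tactic" and the mission_value objective are invented stand-ins for the Valuated State Space representation, not the authors' actual simulation.

```python
import random

def mission_value(tactic):
    """Hypothetical mission score (higher is better): reach a target
    heading/speed pair. A real Valuated State Space would combine many goals."""
    heading, speed = tactic
    return -((heading - 42.0) ** 2 + (speed - 7.5) ** 2)

def mutate(tactic, sigma=1.0):
    """Phenotypic variation: add Gaussian noise to each tactic parameter."""
    return [g + random.gauss(0.0, sigma) for g in tactic]

def evolve(pop_size=20, generations=200):
    # Initial behaviors are generated randomly (hints could seed this instead).
    population = [[random.uniform(0, 360), random.uniform(0, 20)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        offspring = [mutate(p) for p in population]       # each parent yields one child
        combined = population + offspring
        combined.sort(key=mission_value, reverse=True)    # mission-driven selection
        population = combined[:pop_size]
    return population[0]

if __name__ == "__main__":
    best = evolve()
    print("best tactic:", best, "value:", mission_value(best))
```

In fast-time operation this loop would be re-run between decision points so that tactics keep adapting as the sensed situation changes.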

The papers presented at the 1990 Goddard Conference on Space Applications of Artificial Intelligence are given. The purpose of this annual conference is to provide a forum in which current research and development directed at space applications of artificial intelligence can be presented and discussed. The proceedings fall into the following areas: Planning and Scheduling, Fault Monitoring/Diagnosis, Image Processing and Machine Vision, Robotics/Intelligent Control, Development Methodologies, Information Management, and Knowledge Acquisition.

NASA has initiated the Intelligent Data Management Project to design and develop advanced information management systems. The project's primary goal is to formulate, design and develop advanced information systems that are capable of supporting the agency's future space research and operational information management needs. The first effort of the project was the development of a prototype Intelligent User Interface to an operational scientific database, using expert systems and natural language processing technologies. An overview of Intelligent User Interface formulation and development is given.
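To make the idea of a natural-language front end to a scientific database concrete, a toy Python sketch follows. The table, field names, synonym list, and keyword heuristics here are hypothetical; the actual prototype relied on expert-system rules and far richer natural language processing than simple keyword matching.

```python
import re

# Hypothetical mapping from query phrases to columns of an imagined
# scientific observation table.
FIELD_SYNONYMS = {
    "temperature": "surface_temp",
    "ozone": "ozone_level",
    "latitude": "lat",
    "longitude": "lon",
}

def nl_to_sql(question: str, table: str = "observations") -> str:
    """Translate a simple English request into a SQL query (toy heuristic)."""
    q = question.lower()
    fields = [col for word, col in FIELD_SYNONYMS.items() if word in q] or ["*"]
    where = []
    year = re.search(r"after (\d{4})", q)
    if year:
        where.append(f"year > {year.group(1)}")
    sql = f"SELECT {', '.join(fields)} FROM {table}"
    if where:
        sql += " WHERE " + " AND ".join(where)
    return sql

print(nl_to_sql("show me ozone and temperature readings after 1985"))
# SELECT surface_temp, ozone_level FROM observations WHERE year > 1985
```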

Process, by modeling intelligence as an opportunistic, multi-source, multi-entity system of systems. The value of intelligence fusion is compared...network model, agencies would still have their own databases, but those databases would be searchable across agency lines. In this system, levels of...effort is to accurately model the flow of intelligence information through a multi-INT system and provide an output measure of total...

Traditional Chinese medicine (TCM) is a highly practical discipline with its own theoretical system and clinical characteristics. In clinical practice, demonstrable clinical efficacy is the key to its existence and development, yet the existing evaluation system struggles to assess the clinical efficacy of TCM objectively. How to evaluate clinical efficacy objectively and obtain definitive evidence is therefore the central problem in TCM efficacy evaluation. Compared with modern medicine, TCM is more concerned with changes in the patient's subjective feelings and clinical symptoms over the course of the disease. Soft targets are mainly used to evaluate clinical efficacy with respect to symptoms and the functional activity of the disease, and grading-based processing of soft targets is a commonly used method in clinical evaluation. However, this grading often yields mutually contradictory evaluation results, which ultimately undermines the assessment of the clinical efficacy of TCM. To evaluate the clinical efficacy of TCM more effectively, the role of soft targets should be clearly identified when they are adopted, the characteristics of the interventions on the disease should be highlighted, and grading-based processing of soft targets should be avoided as far as possible so that clinical efficacy can be assessed realistically.

Recently, there has been a shortage of sleep laboratories that can accommodate patients in a timely manner. Delayed diagnosis and treatment may lead to worse outcomes, particularly in patients with severe obstructive sleep apnea (OSA). For this reason, prioritization of the polysomnography (PSG) queue should be based on disease severity. To date, data have been conflicting on whether clinical information can predict OSA severity. A total of 1,042 suspected OSA patients underwent a diagnostic PSG study at the Siriraj Sleep Center during 2010-2011, and 113 variables were obtained from sleep questionnaires and anthropometric measurements. Nineteen groups of clinical risk factors, comprising 42 variables, were categorized for each OSA severity level. This study aimed to rank these factors using a Fuzzy Analytic Hierarchy Process approach based on a normalized weight vector. The results revealed that the top-ranked clinical risk factor for the Severe, Moderate, Mild, and No OSA groups was nighttime symptoms. The overall sensitivity/specificity of the approach for these groups was 92.32%/91.76%, 89.52%/88.18%, 91.08%/84.58%, and 96.49%/81.23%, respectively. We propose that urgent PSG appointments should be offered to patients with the clinical risk factors of the Severe OSA group. In addition, screening of Mild versus No OSA patients in the sleep-center setting using symptoms during sleep is also recommended (sensitivity = 87.12%, specificity = 72.22%).
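The abstract does not state which fuzzy AHP variant was used, so the Python sketch below uses Buckley's geometric-mean method with an invented three-criterion comparison matrix, simply to show how a normalized weight vector is derived from triangular fuzzy pairwise judgments.

```python
import numpy as np

def fuzzy_ahp_weights(tfn_matrix):
    """Buckley's geometric-mean fuzzy AHP.

    tfn_matrix: n x n x 3 array of triangular fuzzy numbers (l, m, u) giving
    pairwise comparisons between criteria (here, risk-factor groups).
    Returns a crisp, normalized weight vector.
    """
    A = np.asarray(tfn_matrix, dtype=float)      # shape (n, n, 3)
    n = A.shape[0]
    geo = A.prod(axis=1) ** (1.0 / n)            # fuzzy geometric mean per row
    total = geo.sum(axis=0)                      # (sum_l, sum_m, sum_u)
    fuzzy_w = geo * (1.0 / total[::-1])          # multiply by (1/sum_u, 1/sum_m, 1/sum_l)
    crisp = fuzzy_w.mean(axis=1)                 # centre-of-area defuzzification
    return crisp / crisp.sum()

# Illustrative 3-criterion comparison (e.g. nighttime symptoms vs. daytime
# symptoms vs. anthropometrics); the numbers are made up, not from the study.
M = [
    [[1, 1, 1],          [2, 3, 4],        [4, 5, 6]],
    [[1/4, 1/3, 1/2],    [1, 1, 1],        [2, 3, 4]],
    [[1/6, 1/5, 1/4],    [1/4, 1/3, 1/2],  [1, 1, 1]],
]
print(fuzzy_ahp_weights(M).round(3))
```

Ranking the criteria by these weights within each severity group is the kind of output the study used to decide which clinical factors justify an urgent PSG appointment.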

A review of published work in clinical natural language processing (NLP) may suggest that the negation detection task has been "solved." This work proposes that an optimizable solution does not equal a generalizable solution. We introduce a new machine learning-based Polarity Module for detecting negation in clinical text, and extensively compare its performance across domains. Using four manually annotated corpora of clinical text, we show that negation detection performance suffers when there is no in-domain development (for manual methods) or training data (for machine learning-based methods). Various factors (e.g., annotation guidelines, named entity characteristics, the amount of data, and lexical and syntactic context) play a role in making generalizability difficult, but none completely explains the phenomenon. Furthermore, generalizability remains challenging because it is unclear whether to use a single source for accurate data, combine all sources into a single model, or apply domain adaptation methods. The most reliable means to improve negation detection is to manually annotate in-domain training data (or, perhaps, manually modify rules); this is a strategy for optimizing performance, rather than generalizing it. These results suggest a direction for future work in domain-adaptive and task-adaptive methods for clinical NLP.
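To illustrate the kind of learned negation detector under discussion (not the paper's Polarity Module) and why it may fail to generalize, here is a minimal Python sketch: a classifier trained on context windows from one invented "domain" and then applied to unseen cues from another. The examples and cue phrases are fabricated for illustration.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# (context window around a target concept, label: 1 = negated, 0 = affirmed)
domain_a = [
    ("no evidence of", 1), ("denies any", 1), ("without signs of", 1),
    ("positive for", 0), ("consistent with", 0), ("history of", 0),
]
domain_b = [
    ("ruled out for", 1), ("negative screen for", 1),
    ("suspicious for", 0), ("status post", 0),
]

# Train only on domain A, as a stand-in for having in-domain data for one corpus.
model = make_pipeline(CountVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit([text for text, _ in domain_a], [label for _, label in domain_a])

# Cues in domain B were never seen in training, so predictions there are
# unreliable -- a toy version of the generalizability problem described above.
for text, gold in domain_b:
    print(f"{text!r} -> predicted {model.predict([text])[0]} (gold {gold})")
```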