Sample records for multi-access systems

Based on an analysis of the requirements for conversation history storage in a CPM (Converged IP Messaging) system, a multi-view storage model and access methods for conversation history are proposed. The storage model separates logical views from physical storage and divides the storage into a system-managed region and a user-managed region. It simultaneously supports a conversation view, system-predefined views and user-defined views of the storage. The rationality and feasibility of the multi-view presentation, the physical storage model and the access methods are validated through an implemented prototype, which shows that the proposal has good scalability and will help to optimize the physical data storage structure and improve storage performance.

Full Text Available Metro systems act as fast and efficient transport systems for many modern metropolises; however, encouraging higher usage of such systems often conflicts with providing suitable accessibility options. The traditional approach of metro accessibility studies seems to be an ineffective measure for gauging sustainable access in which the equal rights of all users are taken into account. Bangkok Metropolitan Region (BMR) transportation has increasingly relied on two mass rapid transport systems publicly called the "BTS Skytrain" and the "MRT Subway", due to the limited availability of land and massive road congestion; however, access to such transit arguably treats some vulnerable groups, especially women, the elderly and disabled people, unfairly. This study constructs a multi-dimensional assessment of accessibility considerations to scrutinize how user groups access metro services, based on the BMR empirical case. 600 individual passengers at various stations were asked to rate a questionnaire that simultaneously considers accessibility aspects of spatial, feeder connectivity, temporal, comfort/safety, psychosocial and other dimensions. The user-disaggregated accessibility model interestingly found that the lower the accessibility perceptions related to uncomfortable and unsafe environmental conditions, the greater the equitable access to services, as illustrated by the MRT Hua Lumphong and MRT Petchaburi stations. The study suggests that, to balance the access priorities of groups on services, policy actions should emphasize acceptably safe access for individuals, cost-efficient feeder services connecting the metro lines, socioeconomic influences and time allocation. Insightful discussions on an integrated approach balancing different dimensions of accessibility, and recommendations, would contribute to accessibility-based knowledge and the propensity to use public transit towards transport sustainability.

The IO performance of storage devices has accelerated from hundreds of IOPS five years ago to hundreds of thousands of IOPS today, with tens of millions of IOPS projected in five years. This sharp evolution is primarily due to the introduction of NAND-flash devices and their data-parallel desig... ...generation block layer that is capable of handling tens of millions of IOPS on a multi-core system equipped with a single storage device. Our experiments show that our design scales gracefully with the number of cores, even on NUMA systems with multiple sockets...

The concept of multi-radio access networks is currently considered as a strong candidate for the next generation wireless access networks. This concept incorporates conventional cellular networks such as GERAN or UTRAN for wide area coverage while local-area networks such as IEEE 802.11, HIPERLAN/2,

In many present-day problems, information is scattered across heterogeneous data sources. Most natural language interfaces focus on a very specific part of the problem (e.g. an interface to a relational database, or an interface to an ontology). From the users' point of view, however, it does not matter where the information is stored; they just want to obtain the knowledge in an integrated, transparent, efficient, effective, and pleasant way. To solve this problem, this article proposes a generic multi-agent conversational architecture that follows the divide-and-conquer philosophy and considers two types of agents. Expert agents are specialized in accessing different knowledge sources, and decision agents coordinate them to provide a coherent final answer to the user. This architecture has been used to design and implement SmartSeller, a specific system which includes a Virtual Assistant to answer general questions and a Bookseller to query a book database. A thorough comparison with other relevant systems has demonstrated that our proposal improves on several key features presented throughout the paper.
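
As a sketch of the expert/decision split described above (the agent names, toy knowledge bases and merging rule are invented for illustration; this is not the SmartSeller implementation):

```python
class ExpertAgent:
    """Wraps one knowledge source and answers only what it can."""
    def __init__(self, name, kb):
        self.name, self.kb = name, kb

    def answer(self, query):
        # Returns None when this source knows nothing about the query.
        return self.kb.get(query)


class DecisionAgent:
    """Coordinates the experts and merges their partial answers
    into one coherent response for the user."""
    def __init__(self, experts):
        self.experts = experts

    def ask(self, query):
        answers = {e.name: a for e in self.experts
                   if (a := e.answer(query)) is not None}
        return answers or {"fallback": "no expert could answer"}


# Hypothetical knowledge sources: a book database and a general FAQ.
books = ExpertAgent("bookseller", {"price:dune": "9.99"})
faq = ExpertAgent("assistant", {"hours": "9-17"})
broker = DecisionAgent([books, faq])
print(broker.ask("price:dune"))
print(broker.ask("hours"))
```

The broker never needs to know where an answer lives; adding a new source is just appending another `ExpertAgent`.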

Bringing radio access points closer to the end-users improves radio energy efficiency. However, taking into account both the radio and the optical parts of a fibre-wireless access system, the overall system energy efficiency has an upper bound determined by the relation between the energy

Full Text Available This paper describes the design, implementation, and deployment of a multi-modal biometric system that grants access to a company structure and to internal zones within the company. Face and iris were chosen as the biometric traits: face is feasible for non-intrusive checking with minimal cooperation from the subject, while iris supports a very accurate recognition procedure at a higher grade of invasiveness. Recognition of the face trait is based on Local Binary Pattern histograms, and Daugman's method is implemented for the analysis of the iris data. The recognition process may require either the acquisition of the user's face only, or the serial acquisition of both the user's face and iris, depending on the confidence level of the decision with respect to the set of security levels and requirements stated formally in the Service Level Agreement during a negotiation phase. The quality of the decision depends on setting appropriate thresholds in the decision modules for the two biometric traits. Whenever the quality of the decision is not good enough, the system activates rules which ask for new acquisitions (and decisions), possibly with different threshold values, resulting in a system whose behaviour is not fixed and predefined but complies with the actual acquisition context. Rules are formalized as deduction rules and grouped together to represent "response behaviours" according to the previous analysis. There are therefore different possible working flows, since the actual response of the recognition process depends on the output of the decision-making modules that compose the system. Finally, the deployment phase is described, together with the results of testing based on the AT&T Face Database and the UBIRIS database.
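
The serial face-then-iris decision flow can be sketched as follows; the thresholds, fusion weights and score scale are hypothetical placeholders, not values from the paper:

```python
def decide(face_score, iris_score=None,
           face_accept=0.80, combined_accept=0.82):
    """Serial multi-modal decision sketch: accept on a confident
    face match alone; otherwise fire a rule asking for an iris
    acquisition, then fuse the two scores.  All thresholds and the
    0-1 score scale are invented for illustration."""
    if face_score >= face_accept:
        return "accept"
    if iris_score is None:
        # Rule fires: the context requires a new (iris) acquisition.
        return "acquire_iris"
    fused = 0.4 * face_score + 0.6 * iris_score
    return "accept" if fused >= combined_accept else "reject"

print(decide(0.85))          # confident face match -> "accept"
print(decide(0.70))          # low confidence -> "acquire_iris"
print(decide(0.70, 0.95))    # fused 0.85 >= 0.82 -> "accept"
```

Changing the thresholds per Service Level Agreement is what makes the behaviour context-dependent rather than fixed.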

The Seamless Integrated Data Pipeline system was proposed to the European Union in order to overcome the information quality shortcomings of the current international supply chain information exchange systems. Next to identification and authorization of stakeholders, secure access control needs to

Full Text Available Poor utilization of the electromagnetic spectrum and the ever-increasing demand for spectrum have led to a surge of interest in opportunistic spectrum access (OSA) based paradigms such as cognitive radio and unlicensed LTE. In OSA for decentralized networks, frequency band selection from a wideband spectrum is a challenging task, since secondary users (SUs) do not share any information with each other. In this paper, a new decision-making policy (DMP) is proposed for OSA in multi-user decentralized networks. The first contribution is an accurate characterization of frequency bands using the Bayes-UCB algorithm. A novel SU orthogonalization scheme using Bayes-UCB is then proposed, replacing the randomization-based scheme. Finally, a USRP testbed has been developed for analyzing the performance of DMPs using real radio signals. Experimental results show that the proposed DMP offers a significant improvement in spectrum utilization, with fewer subband switches and collisions compared with other DMPs.
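
A minimal sketch of Bayes-UCB-style band selection under a Bernoulli idle/busy channel model. The band count and idle probabilities are invented, and the Beta-posterior quantile is replaced by a Gaussian approximation (an exact Bayes-UCB would use the inverse regularized incomplete beta function):

```python
import random
from statistics import NormalDist

def bayes_ucb_select(successes, failures, t):
    """Pick the band whose Beta(1+s, 1+f) posterior has the highest
    quantile at level 1 - 1/t, using a Gaussian approximation to the
    Beta quantile (mean + z * std)."""
    q = 1.0 - 1.0 / max(t, 2)
    z = NormalDist().inv_cdf(q)
    best, best_score = 0, float("-inf")
    for i, (s, f) in enumerate(zip(successes, failures)):
        a, b = 1 + s, 1 + f
        mean = a / (a + b)
        var = a * b / ((a + b) ** 2 * (a + b + 1))
        score = mean + z * var ** 0.5
        if score > best_score:
            best, best_score = i, score
    return best

# One SU exploring 4 bands with (made-up) idle probabilities.
random.seed(0)
idle_prob = [0.2, 0.5, 0.8, 0.35]
s = [0] * 4
f = [0] * 4
for t in range(1, 2001):
    band = bayes_ucb_select(s, f, t)
    if random.random() < idle_prob[band]:
        s[band] += 1   # sensed idle: successful opportunity
    else:
        f[band] += 1   # sensed busy
print("pulls per band:", [s[i] + f[i] for i in range(4)])
```

The SU concentrates its sensing on the band with the highest idle probability while still occasionally probing the others.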

With the number of cores increasing, there is an emerging need for a high-bandwidth low-latency interconnection network, serving core-to-memory communication. In this paper, aiming at the goal of simultaneous access to multi-rank memory, we propose an optical interconnection network for core-to-memory communication. In the proposed network, the wavelength usage is delicately arranged so that cores can communicate with different ranks at the same time and broadcast for flow control can be achieved. A distributed memory controller architecture that works in a pipeline mode is also designed for efficient optical communication and transaction address processes. The scaling method and wavelength assignment for the proposed network are investigated. Compared with traditional electronic bus-based core-to-memory communication, the simulation results based on the PARSEC benchmark show that the bandwidth enhancement and latency reduction are apparent.

Full Text Available In this paper an advanced monitoring system for multiple environmental parameters is presented. The purpose of the system is long-term estimation of energy efficiency and sustainability for research test stands built from different building materials. The construction of the test stands and the placement of the main sensors are presented in the first chapter. The data acquisition system comprises a real-time interface with the sensors and a data logger that acquires and logs data from all sensors at a fixed rate. The data logging system provides remote access to the processing of the acquired data and periodically saves it to a remote FTP server over an Internet connection. The system architecture and the use of the sensors are explained in the second chapter. The third chapter discusses the implementation of the system and the different interfaces to sensors and energy measuring devices, and presents several examples of the data logger program. Each data logger reads data from analog and digital channels. Measurements can be displayed directly on a screen via web access or using data from the FTP server. Graphical results of the measurements and acquired data are presented in selected diagrams in the fourth chapter. The benefits of the developed system are presented in the conclusion.
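
The fixed-rate acquire-and-log cycle might look like the following sketch, with the sensor read and the output stream simulated (the channel names, rate and CSV layout are assumptions, not the paper's actual logger program):

```python
import csv
import io

def log_fixed_rate(read_sensors, samples, period_s, out):
    """Poll all sensor channels at a fixed rate and append one
    timestamped CSV row per cycle.  In a real logger the loop would
    sleep for `period_s` between cycles and `out` would be a file
    later uploaded to the FTP server."""
    writer = csv.writer(out)
    writer.writerow(["t", "temp_c", "rh_pct"])
    t = 0.0
    for _ in range(samples):
        temp, rh = read_sensors()
        writer.writerow([round(t, 1), temp, rh])
        t += period_s
    return out

def fake_sensors():
    # Stand-in for the analog/digital channel reads.
    return 21.5, 48.0

buf = log_fixed_rate(fake_sensors, samples=3, period_s=0.5, out=io.StringIO())
print(buf.getvalue())
```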

Full Text Available Obtaining a patient's full electronic medical record is an important step in providing quality medical care. However, the data from health-unit information systems are highly heterogeneous, because each unit may use a different model for storing patients' medical data. To achieve interoperability and integration of data from the various medical units that store partial patient medical information, this paper proposes an approach based on multi-agent systems and ontologies. We present an ontological model describing the structure of the data integration process. The system is to be used for centralizing the information from a patient's partial medical records. The main advantage of the proposed model is the low ratio between the complexity of the model and the amount of information that can be retrieved to generate the complete medical history of a patient.

Full Text Available The paper discusses the nature of Knowledge Organization Systems (KOSs) and shows how they can support digital library users. It demonstrates processes related to the integration of a KOS such as the Dewey Decimal Classification, 22nd edition (DDC22), into DSpace software (http://www.dspace.org/) for organizing and retrieving (browsing and searching) scholarly objects. An attempt has been made to use the DDC22 available in the Bengali language, and the mechanisms required for system-level integration are highlighted. This may help repository administrators build an Institutional Digital Repository (IDR) integrated with SKOS-enabled multilingual subject access systems, supporting indexing based on subject descriptors (the DC.Subject metadata element), structured navigation (browsing) and efficient searching.

As more and more applications and services in our society now depend on the Internet, it is important that dynamically deployed wireless multi-hop networks are able to gain access to the Internet and other infrastructure networks and services. This thesis proposes and evaluates solutions for providing multi-hop Internet access. It investigates how ad hoc networks can be combined with wireless and mesh networks in order to create wireless multi-hop access networks. When several access points t...

A new two-dimensional code family, namely two-dimensional multi-diagonal (2D-MD) codes, is proposed for spectral/spatial non-coherent OCDMA systems, based on the one-dimensional MD code. Since the MD code has the property of zero cross-correlation, the proposed 2D-MD code inherits this property. As a result, multi-access interference (MAI) is fully eliminated and phase-induced intensity noise (PIIN) is suppressed with the proposed code. Code performance is analyzed in terms of bit error rate (BER) while considering the effects of shot noise, PIIN, and thermal noise. The performance of the proposed code is compared with the related MD, modified quadratic congruence (MQC), two-dimensional perfect difference (2D-PD) and two-dimensional diluted perfect difference (2D-DPD) codes. The analytical and simulation results reveal that the proposed 2D-MD code outperforms the other codes. Moreover, a large number of simultaneous users can be accommodated at low BER and high data rates.
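
The zero-cross-correlation property that eliminates MAI can be illustrated with a toy code family in which each user lights a disjoint block of chips. This is not the actual multi-diagonal construction from the paper, only a demonstration of the ZCC property it relies on:

```python
def zcc_codes(users, weight):
    """Toy zero-cross-correlation family: user u gets ones on chips
    [u*weight, (u+1)*weight).  Disjoint chip sets make every pairwise
    cross-correlation exactly zero."""
    length = users * weight
    codes = []
    for u in range(users):
        word = [0] * length
        for k in range(weight):
            word[u * weight + k] = 1
        codes.append(word)
    return codes

def cross_correlation(a, b):
    """In-phase correlation: sum of chip-wise products."""
    return sum(x * y for x, y in zip(a, b))

codes = zcc_codes(users=4, weight=3)
pairs = [(i, j) for i in range(4) for j in range(4) if i < j]
print(all(cross_correlation(codes[i], codes[j]) == 0 for i, j in pairs))
```

With zero cross-correlation, no user's decoder accumulates energy from another user's codeword, which is exactly why MAI vanishes.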

Petascale systems are in existence today and will become common in the next few years. Such systems are inevitably very complex, highly distributed and heterogeneous. Monitoring a petascale system in real-time and understanding its status at any given moment without impacting its performance is a highly intricate task. Common approaches and off-the-shelf tools are either unusable, do not scale, or severely impact the performance of the monitored servers. This paper describes unobtrusive monitoring software developed at Stanford Linear Accelerator Center (SLAC) for a highly distributed petascale production data set. The paper describes the employed solutions, the lessons learned, the problems still to be addressed, and explains how the system can be reused elsewhere.

Multi-access protocol is one of the commonly applied access control protocols, in which common channels are shared by multiple users (as shown in Fig. 1). In recent years, this protocol has been successfully applied to various communication systems [1]. Typical examples are satellite communication systems, mobile communication systems, local area networks (LAN) and metropolitan area networks (MAN). There are chiefly three kinds of multi-access channel models, i.e. the fixed allocation model, the self-adjusting allocation model and the rando...

After having presented the initial characteristics and weaknesses of the software provided for the control of a memory disk coupled with a Multi 8 computer, the author reports the development and improvement of this controller software. He presents the different constituent parts of the computer and the operation of the disk coupling and of direct memory access. He reports the development of the disk access controller: software organisation, loader, subprograms and statements.

The joint use of opportunistic scheduling and orthogonal frequency division multiple access (OFDMA) provides significant gains in environments of low mobility and scatter, for which channel variations are low. The downside of opportunistic scheduling in multicarrier systems such as OFDMA lies in the substantial uplink overhead required for feedback from the mobile stations (MSs) describing users' instantaneous link conditions. This study presents a novel approach towards multicarrier opportunist...

An automated method for the control and monitoring of personnel movement throughout the site was developed under contract to the Department of Energy by Allied-General Nuclear Services (AGNS) at the Barnwell Nuclear Fuel Plant (BNFP). These automated features provide strict enforcement of personnel access policy without routine patrol officer involvement. Identification methods include identification by employee ID number, by voice verification, and by physical security officer verification. The ability to grant each level of access authority is distributed over the organization, to prevent any single individual at any level of the organization from being capable of issuing an authorization for entry into sensitive areas. Each access event is recorded. As access events occur, the inventories of both the entered and the exited control areas are updated, so that a current inventory is always available for display. The system has been operated since 1979 in a development mode, and many revisions have been implemented in hardware and software as areas were added to the system. Recent changes have involved the installation of backup systems and other features required to achieve high reliability. The access control system and recent operating experience are described.
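
The enter/exit bookkeeping that keeps a live per-area inventory might be sketched like this (the area names and employee identifiers are invented):

```python
from collections import defaultdict

class AccessControl:
    """Logs every access event and keeps a live per-area inventory,
    mirroring the enter/exit bookkeeping described above: each move
    updates both the exited and the entered area."""
    def __init__(self):
        self.inventory = defaultdict(set)   # area -> people present
        self.log = []                       # every event is recorded

    def move(self, person, src, dst):
        self.inventory[src].discard(person)
        self.inventory[dst].add(person)
        self.log.append((person, src, dst))

ac = AccessControl()
ac.move("emp042", "outside", "process_area")
ac.move("emp007", "outside", "control_room")
ac.move("emp042", "process_area", "outside")
print(dict(ac.inventory))
```

Because both sides of every move are updated, a current inventory of any control area is always available for display.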

The energy supply of countries with abundant energy resources may not be affected by accepting the assertions of anti-nuclear and environmental groups, but anti-nuclear movements in countries with few energy resources may cause serious problems in securing the energy supply. This is especially evident in Korea, which depends heavily on nuclear energy for its electricity supply (the nuclear share of total electricity supply is about 40%). The social controversy surrounding nuclear energy involves various circumstances; importantly, however, the significance of information access was not recognized or prepared for from the early stages of nuclear energy development. On this matter, this paper analyzes the nuclear information access systems of France and Japan, which have active nuclear development programs, and presents a direction for a nuclear information access regime by comparing the Korean status and referring to the progress of those regimes.

Full Text Available Intelligent Transport Systems (ITS) require widely available communications services of guaranteed quality. A method for decomposing an ITS into a set of subsystems and quantifying the parameters of the communications subsystems is introduced. Because of the typical complexity of an ITS solution, and because mobility is a typical property of its elements, the idea of communications systems with multipath multivendor structures is adopted, and a resolution for seamless switching within a set of available wireless access solutions is presented. A CALM-based system, or specifically designed and configured L3/L2 switching, can be a relevant solution for a multi-path access communication system. These systems meet the requirements of seamless, secure communications even within an extensive cluster of moving objects. Competent decision processes, based on precisely quantified system requirements and a tolerance range for each performance indicator, must be implemented to keep the service up and running unaffected by continuously changing conditions in time and served space. A method for evaluating the service quality of different paths and selecting the best available communications access path is introduced. The proposed approach is based on Kalman filtering, which separates a reasonable part of the noise and also allows prediction of the near-future behaviour of the individual parameters. The classification algorithm, applied to filtered measured data combined with deterministic parameters, is trained using training data, i.e. combinations of parameter vectors and the relevant decisions. The quality of classification depends on the size and quality of the training sets. This method is studied within the projects e-Ident, DOTEK and SRATVU, which elaborate on the results of the project CAMNA.
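
A scalar Kalman filter of the kind mentioned above, smoothing a noisy path-quality measurement (here a simulated round-trip time; the random-walk state model and the noise variances are assumptions for illustration):

```python
import random

def kalman_1d(measurements, q=1e-3, r=9.0):
    """Scalar Kalman filter with a random-walk state model:
    x_t = x_{t-1} + w (process var q), z_t = x_t + v (measurement
    var r).  Returns the filtered estimate after each measurement."""
    x, p = measurements[0], 1.0
    estimates = []
    for z in measurements:
        p = p + q               # predict: state uncertainty grows
        k = p / (p + r)         # Kalman gain
        x = x + k * (z - x)     # update with the innovation
        p = (1 - k) * p
        estimates.append(x)
    return estimates

# Simulated path metric: a 40 ms RTT observed with noise (std 3 ms).
random.seed(1)
noisy = [40.0 + random.gauss(0, 3) for _ in range(200)]
est = kalman_1d(noisy)
print(round(est[-1], 1))
```

The filtered value tracks the underlying path quality while suppressing most of the measurement noise, giving the path-selection classifier a stable input.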

As part of the CUAHSI HIS (Consortium of Universities for the Advancement of Hydrologic Science, Inc., Hydrologic Information System), the CUAHSI HIS team has developed the Data Access System for Hydrology, or DASH. DASH is based on commercial off-the-shelf technology and has been developed in conjunction with a commercial partner, ESRI. It is a web-based user interface, written in ASP.NET using ESRI ArcGIS Server 9.2, that provides mapping, querying and data retrieval over observation and GIS databases and web services. It is the front-end application for the CUAHSI Hydrologic Information System Server. The HIS Server is a software stack that organizes observation databases, geographic data layers, data importing and management tools, and online user interfaces such as the DASH application into a flexible multi-tier application for serving both national-level and locally maintained observation data. The DASH user interface allows online users to query observation networks by location and attributes, selecting stations in a user-specified area where a particular variable was measured during a given time interval. Once one or more stations and variables are selected, the user can retrieve and download the observation data for further off-line analysis. The DASH application is highly configurable. The mapping interface can be configured to display map services from multiple sources in multiple formats, including ArcGIS Server, ArcIMS, and WMS. The observation network data are configured in an XML file that specifies each network's web service location and its corresponding map layer. Upon initial deployment, two national-level observation networks (USGS NWIS daily values and USGS NWIS instantaneous values) are already pre-configured. There is also an optional login page which can be used to restrict access, as well as to provide an alternative to immediate downloads. For large requests, users would be notified via

A good security system enables access and protects information from damage or tampering, but the most important aspects of a security system aren't technical. A security procedures manual addresses the human element of computer security. (MLW)

An access system based on the one now in operation at the CERN ISR is recommended. Access doors would presumably be located at the entrances to the utility tunnels connecting the support buildings with the ring. Persons requesting access would insert an identity card into a scanner to activate the system. The request would be autologged, the keybank adjacent to the door would be unlocked, and ISABELLE operations would be notified. The operator would then select the door, activating a TV-audio link. The person requesting entry would draw a key from the bank and show it and his film badge to the operator, who would enable the door release.

A security code system for controlling access to computer and computer-controlled entry situations comprises a plurality of subsets of alphanumeric characters disposed in random order in matrices of at least two dimensions, forming theoretical rectangles, cubes, etc. When access is desired, at least one pair of previously unused character subsets, not found in the same row or column of the matrix, is chosen at random and transmitted by the computer. The proper response to gain access is transmittal of the subsets which complete the rectangle, and/or a parallelepiped whose opposite corners were defined by the first groups of code. Once used, subsets are not used again, to absolutely defeat unauthorized access by eavesdropping and the like.
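
The rectangle-completion exchange can be sketched as follows; the matrix size, subset length and seeds are arbitrary choices for illustration:

```python
import random
import string

def make_matrix(rows=4, cols=4, subset_len=3, seed=None):
    """Matrix of random alphanumeric subsets: the shared secret
    between the computer and the authorized user."""
    rng = random.Random(seed)
    chars = string.ascii_uppercase + string.digits
    return [["".join(rng.choices(chars, k=subset_len)) for _ in range(cols)]
            for _ in range(rows)]

def challenge(matrix, rng):
    """Pick two cells in distinct rows and distinct columns: the
    opposite corners of a rectangle in the matrix."""
    r1, r2 = rng.sample(range(len(matrix)), 2)
    c1, c2 = rng.sample(range(len(matrix[0])), 2)
    return (r1, c1), (r2, c2)

def expected_response(matrix, cell_a, cell_b):
    """The subsets at the two remaining corners of the rectangle."""
    (r1, c1), (r2, c2) = cell_a, cell_b
    return (matrix[r1][c2], matrix[r2][c1])

rng = random.Random(42)
m = make_matrix(seed=7)
a, b = challenge(m, rng)
print("challenge:", m[a[0]][a[1]], m[b[0]][b[1]])
print("valid response:", expected_response(m, a, b))
```

A real implementation would additionally track used subsets and refuse to reuse them, which is what defeats replay of an eavesdropped exchange.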

The Automated Computer Access Request (AutoCAR) system is a web-based account provisioning application that replaces the time-consuming paper-based computer-access request process at Johnson Space Center (JSC). AutoCAR combines rules-based and role-based functionality in one application to provide a centralized system that is easily and widely accessible. The system features a workflow engine that facilitates request routing, a user registration directory containing contact information and user metadata, an access request submission and tracking process, and a system administrator account management component. This provides full, end-to-end disposition approval chain accountability from the moment a request is submitted. By blending both rules-based and role-based functionality, AutoCAR has the flexibility to route requests based on a user's nationality, JSC affiliation status, and other export-control requirements, while ensuring a user's request is addressed by either a primary or backup approver. All user accounts that are tracked in AutoCAR are recorded and mapped to the native operating system schema on the target platform where the user accounts reside. This allows for future extensibility for supporting creation, deletion, and account management directly on the target platforms by way of AutoCAR. The system's directory-based lookup and day-to-day change analysis of directory information determines personnel moves, deletions, and additions, and automatically notifies a user via e-mail to revalidate his/her account access as a result of such changes. AutoCAR is a Microsoft classic active server page (ASP) application hosted on a Microsoft Internet Information Server (IIS).

The aim of the current study was to develop a scale of gambling accessibility that would have theoretical significance to exposure theory and also serve to highlight the accessibility risk factors for problem gambling. Scale items were generated from the Productivity Commission's (Australia's Gambling Industries: Report No. 10. AusInfo, Canberra, 1999) recommendations and tested on a group with high exposure to the gambling environment. In total, 533 gaming venue employees (aged 18-70 years; 67% women) completed a questionnaire that included six 13-item scales measuring accessibility across a range of gambling forms (gaming machines, keno, casino table games, lotteries, horse and dog racing, sports betting). Also included in the questionnaire was the Problem Gambling Severity Index (PGSI) along with measures of gambling frequency and expenditure. Principal components analysis indicated that a common three factor structure existed across all forms of gambling and these were labelled social accessibility, physical accessibility and cognitive accessibility. However, convergent validity was not demonstrated with inconsistent correlations between each subscale and measures of gambling behaviour. These results are discussed in light of exposure theory and the further development of a multi-dimensional measure of gambling accessibility.

Background: There is evidence to suggest that non-communicable diseases (NCDs), in particular cardiovascular diseases and diabetes, form a substantial proportion of the burden of disease among populations in low- and middle-income countries (LMICs). Access to treatment is likely a key barrier to the control and prevention of NCD outcomes. Differential pricing, an approach in which drugs are priced according to the purchasing power of patients in different socioeconomic segments, has been shown to be beneficial and to improve access and affordability. Methods: This is a quasi-experimental study, with a pragmatic trial design, to be conducted over the course of three years. A mixed-methods design will be used to evaluate the effects of health systems strengthening and differential pricing on the management of diabetes, hypertension and selected cancers in Ghana. A public-private partnership was established between all sites, which will receive multi-level interventions, including health systems strengthening and access-to-medicines interventions. Study populations and sites: Study participants will include individuals with new or recently diagnosed hypertension and diabetes (n=3,300) who present to two major referral hospitals, Komfo Anokye Teaching Hospital and Tamale Teaching Hospital, as well as three district hospitals, namely Kings Medical Centre, Agogo Presbyterian District Hospital, and Atua Government Hospital. Discussion: This study aims to test approaches intended to improve access to drugs for the treatment of hypertension and diabetes, and to improve disease control. Patients with these conditions will benefit from health systems strengthening interventions (education, counseling, improved management of disease) and increased access to innovative medicines via differential pricing. Pilot programs will also facilitate health system strengthening at the participating institutions, which includes

Full Text Available The presence of the Internet of Things (IoT) in healthcare, through the use of mobile medical applications and wearable devices, allows patients to capture their healthcare data and enables healthcare professionals to stay up-to-date with a patient's status. Ambient Assisted Living (AAL), considered one of the major applications of IoT, is a home environment augmented with embedded ambient sensors to help improve an individual's quality of life. This domain faces major challenges in providing safety and security when sensitive health data are accessed. This paper presents an access control framework for AAL which considers multi-level access and privacy preservation. We focus on two major points: (1) how to use the data collected from ambient sensors and biometric sensors to perform the high-level task of activity recognition; and (2) how to secure the collected private healthcare data via effective access control. We achieve multi-level access control by extending a Public Key Infrastructure (PKI) for secure authentication and utilizing Attribute-Based Access Control (ABAC) for authorization. The proposed access control system regulates access to healthcare data by defining policy attributes over healthcare professional groups and data class classifications. We provide guidelines for classifying the data classes and healthcare professional groups, and describe security policies that control access to the data classes.
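
A minimal sketch of the authenticate-then-authorize chain: the professional groups, data classes and policy table below are hypothetical stand-ins for the paper's own classifications:

```python
from dataclasses import dataclass

# Hypothetical ABAC policy table: professional group -> readable data classes.
POLICIES = {
    "physician": {"vitals", "activity", "diagnosis"},
    "nurse": {"vitals", "activity"},
    "caregiver": {"activity"},
}

@dataclass
class Subject:
    name: str
    group: str
    authenticated: bool = False   # would be set after PKI certificate checks

def authorize(subject: Subject, data_class: str) -> bool:
    """Two-stage decision: authentication (standing in for the PKI
    step) first, then an attribute check of the subject's group
    against the requested data class."""
    if not subject.authenticated:
        return False
    return data_class in POLICIES.get(subject.group, set())

alice = Subject("alice", "nurse", authenticated=True)
print(authorize(alice, "vitals"))     # True
print(authorize(alice, "diagnosis"))  # False
```

Because the decision is driven by attributes rather than per-user grants, adding a new professional group only means adding one policy row.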

This book is devoted to the modeling of multi-level complex systems, a challenging domain for engineers, researchers and entrepreneurs confronted with the transition from learning and adaptability to evolvability and autonomy for technologies, devices and problem-solving methods. Chapter 1 introduces multi-scale and multi-level systems and highlights their presence in different domains of science and technology. Methodologies such as random systems, non-Archimedean analysis and category theory, and specific techniques such as model categorification and integrative closure, are presented in chapter 2. Chapters 3 and 4 describe polystochastic models (PSMs) and their developments. The categorical formulation of integrative closure offers the general PSM framework, which serves as a flexible guideline for a large variety of multi-level modeling problems. Focusing on chemical engineering, pharmaceutical and environmental case studies, chapters 5 to 8 analyze mixing, turbulent dispersion and entropy production for multi-scale sy...

ACCESS: Absolute Color Calibration Experiment for Standard Stars is a series of rocket-borne sub-orbital missions and ground-based experiments designed to leverage significant technological advances in detectors, instruments, and the precision of the fundamental laboratory standards used to calibrate these instruments to enable improvements in the precision of the astrophysical flux scale through the transfer of laboratory absolute detector standards from the National Institute of Standards and Technology (NIST) to a network of stellar standards with a calibration accuracy of 1% and a spectral resolving power of 500 across the 0.35 to 1.7 micron bandpass. A cross-wavelength calibration of the astrophysical flux scale to this level of precision over this broad a bandpass is relevant for the data used to probe fundamental astrophysical problems such as the SNeIa photometry-based measurements used to constrain dark energy theories. We will describe the strategy for achieving this level of precision, the payload and calibration configuration, present sub-system test data, and the status and preliminary performance of the integration and test of the spectrograph and telescope. NASA APRA sounding rocket grant NNX14AH48G supports this work.

This article presents a dynamic random access scheme for orthogonal frequency division multiple access (OFDMA) systems. The key features of the proposed scheme are: it is a combination of both the distributed and the centralized schemes; it can accommodate several delay sensitivity classes; and it can adjust the number of random access channels in a media access control (MAC) frame and the access probability according to the outcome of Mobile Terminals' access attempts in previous MAC frames. For packet-based networks with fluctuating populations, the proposed scheme can lead to high average user satisfaction.

In today's electronic learning environment, access to appropriate systems and data is of the utmost importance to students, faculty, and staff. Without proper access to the school's internal systems, teachers could be prevented from logging on to an online learning system and students might be unable to submit course work to an online…

Access Control and Personal Identification Systems provides an education in the field of access control and personal identification systems, which is essential in selecting the appropriate equipment, dealing intelligently with vendors in purchases of the equipment, and integrating the equipment into a total effective system. Access control devices and systems comprise an important part of almost every security system, but are seldom the sole source of security. In order for the goals of the total system to be met, the other portions of the security system must also be well planned and executed.

The LHC complex is divided into a number of zones with different levels of access controls. Inside the interlocked areas, the personnel protection is ensured by the LHC Access System. The system is made of two parts: the LHC Access Safety System and the LHC Access Control System. During machine operation, the LHC Access Safety System ensures the collective protection of the personnel against the radiation hazards arising from the operation of the accelerator by interlocking the LHC key safety elements. When the beams are off, the LHC Access Control System regulates the access to the accelerator and its many subsystems. It allows a remote, local or automatic operation of the access control equipment which verifies and identifies all users entering the controlled areas. The global architecture of the LHC Access System is now designed and is being validated to ensure that it meets the safety requirements for operation of the LHC. A pilot installation will be tested in the summer 2005 to validate the concept with the us...

The aim of this paper is to describe a developed application of the Simple Object Access Protocol (SOAP) as a model for improving the findability of libraries' digital content on the library web. The study applies XML text-based protocol tools to collect data about libraries' visibility performance in book search results. Models from the integrated Web Services Description Language (WSDL) and Universal Description, Discovery and Integration (UDDI) are applied to analyse SOAP as an element within the system. The results showed that the developed SOAP application with a multi-tier architecture can help people easily access the website of the Gorontalo Province library server, and supports access to digital collections, subscription databases, and library catalogs in each regency or city library in Gorontalo Province.

Full Text Available In this paper, we present a new multi-agent-based system to collect waste in cities and villages. We have developed a low-cost wireless sensor prototype to measure the fill level of the containers. Furthermore, a routing system is developed to optimize the routes of the trucks, and a mobile application has been developed to help drivers in their working days. In order to evaluate and validate the proposed system, a practical case study in a real city environment is modeled using available open data, with the purpose of identifying the limitations of the system.

Coupled-bunch instabilities excited by the interaction of the particle beam with its surroundings can seriously limit the performance of circular particle accelerators. These instabilities can be cured by the use of active feedback systems based on sensors capable of detecting the unwanted beam motion and actuators that apply the feedback correction to the beam. Advances in electronic technology now allow the implementation of feedback loops using programmable digital systems. Besides important advantages in terms of flexibility and reproducibility, digital systems open the way to the use of novel diagnostic tools and additional features. We first introduce coupled-bunch instabilities, analysing the equation of motion of charged particles and the different modes of oscillation of a multi-bunch beam, showing how they can be observed and measured. Different types of feedback systems will then be presented as examples of real implementations that belong to the history of multi-bunch feedback systems. The main components of a feedback system and the related issues will also be analysed. Finally, we shall focus on digital feedback systems, their characteristics, and features, as well as on how they can be concretely exploited for both the optimization of feedback performance and for beam dynamics studies

Full Text Available This paper presents the study of a novel all-optical method for processing optical CDMA signals towards improving suppression of multi-access interference (MAI). The main focus is on incoherent OCDMA systems using multiwavelength 2D-WH/TS codes generated using FBG-based encoders and decoders. The MAI suppression capabilities, based on the method's ability to selectively eliminate wavelength pulses during processing, have been shown. A novel transmitter architecture that achieves up to 3 dB power saving was also presented. As a result of the hardware savings, processing cost will be significantly reduced, and the power budget improvement results in improved performance.

The information technology revolution has transformed all aspects of our society, including critical infrastructures, and has led to a significant shift from their old and disparate business models based on proprietary and legacy environments to more open and consolidated ones. Supervisory Control and Data Acquisition (SCADA) systems have been widely used not only for industrial processes but also for some experimental facilities. Due to the nature of open environments, managing SCADA systems should meet various security requirements, since system administrators need to deal with a large number of entities and functions involved in critical infrastructures. In this paper, we identify necessary access control requirements in SCADA systems and articulate access control policies for the simulated SCADA systems. We also attempt to analyze and realize those requirements and policies in the context of role-based access control, which is suitable for simplifying administrative tasks in large scale enterprises.
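
The role-based administration idea above can be illustrated with a minimal role-to-permission mapping; the roles and permissions named here are invented for illustration, not drawn from any real SCADA deployment:

```python
# Minimal RBAC sketch for a simulated SCADA setting: users are assigned
# roles, roles carry permissions, and a check succeeds if any of the
# user's roles grants the permission. All names are hypothetical.

ROLE_PERMISSIONS = {
    "operator": {"read_telemetry", "acknowledge_alarm"},
    "engineer": {"read_telemetry", "acknowledge_alarm", "change_setpoint"},
    "admin":    {"read_telemetry", "manage_users"},
}

USER_ROLES = {"alice": {"operator"}, "bob": {"engineer", "admin"}}

def permitted(user, permission):
    """A user holds a permission if any assigned role grants it."""
    return any(permission in ROLE_PERMISSIONS.get(role, set())
               for role in USER_ROLES.get(user, set()))

assert permitted("alice", "read_telemetry")
assert not permitted("alice", "change_setpoint")
assert permitted("bob", "manage_users")
```

The administrative simplification the record mentions comes from this indirection: revoking or extending a role changes permissions for every assigned user at once, instead of editing per-user access lists.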

The present paper discusses the managerial challenges of the MUSE integrated project on multi service broadband access. It addresses different aspects such as matrix organisation, project office, consensus process, standardisation, dissemination, and quality control.

Current performance estimates for personnel access control systems use estimates of Type I and Type II verification errors. A system performance equation which addresses normal operation, the insider, and outside adversary attack is developed. Examination of this equation reveals the inadequacy of classical Type I and II error evaluations, which require detailed knowledge of the adversary threat scenario for each specific installation. Consequently, new performance measures which are consistent with the performance equation and independent of the threat are developed as an aid in selecting personnel access control systems.
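
The inadequacy of single-attempt Type I/II rates can be seen with simple repeated-attempt arithmetic; the rates and attempt count below are hypothetical, not values from the record:

```python
# Hypothetical illustration: how single-attempt Type I (false reject) and
# Type II (false accept) rates compound over repeated attempts.
alpha = 0.01   # false-reject rate per attempt (Type I)
beta = 0.001   # false-accept rate per attempt (Type II)
k = 5          # attempts the system allows

# A legitimate user is locked out only if all k attempts falsely reject.
p_lockout = alpha ** k
# An adversary succeeds if any of the k attempts falsely accepts.
p_breach = 1 - (1 - beta) ** k

assert p_lockout < 1e-9
assert abs(p_breach - 0.004990) < 1e-4
```

Even this toy calculation shows why the threat scenario matters: the effective false-accept probability grows roughly linearly with allowed attempts, while false rejects shrink, so a single-attempt error pair understates the operational tradeoff.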

The Distributed Access View Integrated Database (DAVID) System, which was adopted by the Astrophysics Division for their Astrophysics Data System, is a solution to the system heterogeneity problem. The heterogeneous components of the Astrophysics problem are outlined. The Library and Library Consortium levels of the DAVID approach are described. The 'books' and 'kits' level is discussed. The Universal Object Typer Management System level is described. The relation of the DAVID project with the Small Business Innovative Research (SBIR) program is explained.

Conventional cloud radio access networks assume single cloud processing and treat inter-cloud interference as background noise. This paper considers the downlink of a multi-cloud radio access network (CRAN) where each cloud is connected to several base-stations (BS) through limited-capacity wireline backhaul links. The set of BSs connected to each cloud, called a cluster, serves a set of pre-known mobile users (MUs). The performance of the system therefore becomes a function of both inter-cloud and intra-cloud interference, as well as the compression schemes of the limited-capacity backhaul links. The paper assumes an independent compression scheme and imperfect channel state information (CSI), where the CSI errors belong to an ellipsoidal bounded region. The problem of interest becomes that of minimizing the network total transmit power subject to BS power and quality of service constraints, as well as backhaul capacity and CSI error constraints. The paper suggests solving the problem using the alternating direction method of multipliers (ADMM). One of the highlights of the paper is that the proposed ADMM-based algorithm can be implemented in a distributed fashion across the multi-cloud network by allowing a limited amount of information exchange between the coupled clouds. Simulation results show that the proposed distributed algorithm provides a similar performance to the centralized algorithm in a reasonable number of iterations.
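
As a rough illustration of the distributed ADMM pattern (not the paper's actual power-minimization algorithm), a toy consensus problem shows how clouds can agree on a shared variable while exchanging only local iterates:

```python
# Toy consensus ADMM: each "cloud" i holds a private quadratic cost
# f_i(x) = (x - a_i)^2; the clouds agree on a shared variable z by
# exchanging only local iterates, mimicking the limited inter-cloud
# information exchange described above. A schematic, not the paper's
# algorithm; the data and penalty parameter are arbitrary.

a = [1.0, 4.0, 7.0]            # private data of three clouds
rho = 1.0                      # ADMM penalty parameter
x = [0.0] * len(a)             # local copies of the shared variable
u = [0.0] * len(a)             # scaled dual variables
z = 0.0                        # global consensus variable

for _ in range(200):
    # Local x-updates: argmin_x (x - a_i)^2 + (rho/2)(x - z + u_i)^2,
    # solved in closed form since the cost is quadratic.
    x = [(2 * a_i + rho * (z - u_i)) / (2 + rho)
         for a_i, u_i in zip(a, u)]
    # Consensus update: an averaging step a coordinator could perform.
    z = sum(x_i + u_i for x_i, u_i in zip(x, u)) / len(a)
    # Dual updates, one per cloud.
    u = [u_i + x_i - z for x_i, u_i in zip(x, u)]

# Consensus converges to the minimizer of the summed cost: mean(a) = 4.
assert abs(z - 4.0) < 1e-6
```

The structural point mirrors the record: each x-update uses only a cloud's private data, and only the small iterates (x_i + u_i) cross cloud boundaries.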

In this dissertation, we discuss the problem of enabling cooperative query execution in a multi-cloud environment where the data is owned and managed by multiple enterprises. Each enterprise maintains its own relational database using a private cloud. In order to implement desired business services, parties need to share selected portion of their…

Wide-band Code Division Multiple Access (WCDMA) systems are considered to be among the best alternatives for the Universal Mobile Telecommunication System (UMTS). In future deployment of WCDMA systems, spectrum overlay among sub-bands with different bandwidth is necessary to support various kinds of ... of virtual channel so that classical teletraffic theory can be applied. A service class is modelled as a BPP (Binomial-Poisson-Pascal) multi-rate traffic stream.
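
The code-division idea underlying WCDMA can be illustrated with generic orthogonal spreading codes; the length-4 Walsh codes below are a textbook CDMA illustration, not the WCDMA parameters of this study:

```python
# Generic CDMA illustration: two users share the same channel chips,
# separated by orthogonal Walsh spreading codes. Despreading correlates
# the received chips with a user's code; orthogonality cancels the
# other user's contribution exactly.

WALSH = [
    [1, 1, 1, 1],
    [1, -1, 1, -1],   # user A's spreading code
    [1, 1, -1, -1],   # user B's spreading code
    [1, -1, -1, 1],
]

def spread(bit, code):
    return [bit * c for c in code]

def despread(chips, code):
    # Correlate and normalize by the code length.
    return sum(x * c for x, c in zip(chips, code)) // len(code)

# Two users transmit one bit each over the same chips (superposed channel).
bit_a, bit_b = 1, -1
channel = [a + b for a, b in zip(spread(bit_a, WALSH[1]),
                                 spread(bit_b, WALSH[2]))]

assert despread(channel, WALSH[1]) == bit_a
assert despread(channel, WALSH[2]) == bit_b
```

Multi-rate traffic fits naturally into this picture: a higher-rate service class simply occupies more (or shorter) codes, which is what lets multi-rate teletraffic models treat code channels as countable resources.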

In the domain of Safety Real-Time Systems the problem of testing always represents a big effort in terms of time, costs and efficiency to guarantee an adequate coverage degree. Exhaustive tests may, in fact, not be practicable for large and distributed systems. This paper describes the testing process followed during the validation of CERN's LHC Access System [1], responsible for monitoring and preventing physical risks for the personnel accessing the underground areas. In the paper we also present a novel strategy for the testing problem, intended to drastically reduce the time for test pattern generation and execution. In particular, we propose a methodology for black-box testing that relies on the application of Model Checking techniques. Model Checking is a formal method from computer science, commonly adopted to prove correctness of a system's models through an automatic exploration of the system's state space against some property formulas.

Multi-agent systems are complex systems in which multiple autonomous entities, called agents, cooperate in order to achieve a common or personal goal. These entities may be computer software, robots, and also humans. In fact, many multi-agent systems are intended to operate in cooperation with or as

Modern authorization systems span domains of administration, rely on many different authentication sources, and manage complex attributes as part of the authorization process. This paper presents Cardea, a distributed system that facilitates dynamic access control, as a valuable piece of an inter-operable authorization framework. First, the authorization model employed in Cardea and its functionality goals are examined. Next, critical features of the system architecture and its handling of the authorization process are examined. Then the SAML and XACML standards, as incorporated into the system, are analyzed. Finally, the future directions of this project are outlined and connection points with general components of an authorization system are highlighted.

The Radiation Security System (RSS) at the Los Alamos Neutron Science Center (LANSCE) provides personnel protection from prompt radiation due to accelerated beam. The Personnel Access Control System (PACS) is a component of the RSS that is designed to prevent personnel access to areas where prompt radiation is a hazard. PACS was designed to replace several older personnel safety systems (PSS) with a single modern unified design. Lessons learned from the operation over the last 20 years were incorporated into a redundant-sensor, single-point-failure-safe, fault-tolerant, and tamper-resistant system that prevents access to the beam areas by controlling the access keys and beam stoppers. PACS uses a layered philosophy in the physical and electronic design. The most critical assemblies are battery-backed relay logic circuits; less critical devices use Programmable Logic Controllers (PLCs) for timing functions and communications. Outside reviewers have reviewed the operational safety of the design. The design philosophy, lessons learned, hardware design, software design, operation, and limitations of the device are described.

Full Text Available During the execution of large scale construction projects performed by Virtual Organizations (VO), relatively complex technical models have to be exchanged between the VO members. For linking the trade and transfer of these models, a so-called multi-model container format was developed. Considering the different skills and tasks of the involved partners, it is not necessary for them to know all the models in every technical detailing. Furthermore, the model size can lead to a delay in communication. In this paper an approach is presented for defining model cut-outs according to the current project context. Dynamic dependencies on the project context as well as static dependencies on the organizational structure are mapped in a context-sensitive rule. As a result, an approach for dynamic filtering of multi-models is obtained which ensures, together with a filtering service, that the involved VO members get a simplified view of complex multi-models as well as sufficient permissions depending on their tasks.

This book provides a description of advanced multi-agent and artificial intelligence technologies for the modeling and simulation of complex systems, as well as an overview of the latest scientific efforts in this field. A complex system features a large number of interacting components, whose aggregate activities are nonlinear and self-organized. A multi-agent system is a group or society of agents which interact with others cooperatively and/or competitively in order to reach their individual or common goals. Multi-agent systems are suitable for modeling and simulation of complex systems, which is difficult to accomplish using traditional computational approaches.

A Web-based access system to climate model output data sets for intercomparison and analysis has been produced, using the NOAA-PMEL developed Live Access Server software as host server and Ferret as the data serving and visualization engine. Called ARCAS ("ACACIA Regional Climate-data Access System"), and publicly accessible at http://dataserver.ucar.edu/arcas, the site currently serves climate model outputs from runs of the NCAR Climate System Model for the 21st century, for Business as Usual and Stabilization of Greenhouse Gas Emission scenarios. Users can select, download, and graphically display single variables or comparisons of two variables from either or both of the CSM model runs, averaged for monthly, seasonal, or annual time resolutions. The time length of the averaging period, and the geographical domain for download and display, are fully selectable by the user. A variety of arithmetic operations on the data variables can be computed "on-the-fly", as defined by the user. Expansions of the user-selectable options for defining analysis options, and for accessing other DODS-compatible ("Distributed Ocean Data System-compatible") data sets, residing at locations other than the NCAR hardware server on which ARCAS operates, are planned for this year. These expansions are designed to allow users quick and easy-to-operate web-based access to the largest possible selection of climate model output data sets available throughout the world.

Our goal in this program is to develop the Fast Access Data Acquisition System (FADAS) by combining the flexibility of Multilink's GaAs and InP electronics and electro-optics with an extremely high data rate for the efficient handling and transfer of collider experimental data. This novel solution is based on Multilink's and Los Alamos National Laboratory's (LANL) unique components and technologies for extremely fast data transfer, storage, and processing.

Visible light communication (VLC) is a promising candidate for short-range broadband access due to its integration of advantages for both optical communication and wireless communication, whereas multi-user access is a key problem because of the intra-cell and inter-cell interferences. In addition, the non-flat channel effect results in higher losses for users in high frequency bands, which leads to unfair qualities. To solve those issues, we propose a power adaptive multi-filter carrierless amplitude and phase access (PA-MF-CAPA) scheme, and in the first step of this scheme, the MF-CAPA scheme utilizing multiple filters as different CAP dimensions is used to realize multi-user access. The character of orthogonality among the filters in different dimensions can mitigate the effect of intra-cell and inter-cell interferences. Moreover, the MF-CAPA scheme provides different channels modulated on the same frequency bands, which further increases the transmission rate. Then, the power adaptive procedure based on MF-CAPA scheme is presented to realize quality fairness. As demonstrated in our experiments, the MF-CAPA scheme yields an improved throughput compared with multi-band CAP access scheme, and the PA-MF-CAPA scheme enhances the quality fairness and further improves the throughput compared with the MF-CAPA scheme.

A new Integrated Data Access Management system, IDAM, has been created to address specific data management issues of the MAST spherical Tokamak. For example, this system enables access to numerous file formats, both legacy and modern (IDA, Ufile, netCDF, HDF5, MDSPlus, PPF, JPF). It adds data quality values at the signal level, and automatically corrects for problems in data: in timings, calibrations, and labelling. It also builds new signals from signal components. The IDAM data server uses a hybrid XML-relational database to record how data are accessed, whether locally or remotely, and how alias and generic signal names are mapped to true names. Also, XML documents are used to encode the details of data corrections, as well as definitions of composite signals and error models. The simple, user friendly, API and accessor function library, written in C on Linux, is available for applications in C, C++, IDL and Fortran-90/95/2003 with good performance: a MAST plasma current trace (28 kbytes of data), requested using a generic name and with data corrections applied, is delivered over a 100 Mbit/s network in ∼13 ms
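
The generic-name mapping and on-access corrections described for IDAM can be sketched schematically; the signal names, correction values, and API below are invented for illustration and differ from IDAM's real interface:

```python
# Schematic of generic-name resolution with data corrections applied on
# access, in the spirit of the IDAM description above. Alias names,
# correction values, and the store layout are hypothetical.

ALIASES = {"ip": "amc_plasma_current"}          # generic -> true name
CORRECTIONS = {"amc_plasma_current": {"time_offset": -0.005, "gain": 1.02}}
STORE = {"amc_plasma_current": [(0.100, 400.0), (0.101, 410.0)]}

def get_signal(name):
    """Resolve an alias, fetch raw data, and apply timing/calibration
    corrections transparently to the caller."""
    true_name = ALIASES.get(name, name)
    raw = STORE[true_name]
    fix = CORRECTIONS.get(true_name, {})
    off = fix.get("time_offset", 0.0)
    gain = fix.get("gain", 1.0)
    return [(round(t + off, 6), v * gain) for t, v in raw]

data = get_signal("ip")          # caller never sees the true name
t0, v0 = data[0]
assert t0 == 0.095 and abs(v0 - 408.0) < 1e-9
```

The design point this mirrors is that corrections live with the server-side metadata (XML documents, in IDAM's case), so every client receives corrected data without duplicating the fix-up logic.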

Developing an Open Access, multi-institutional, multilingual, international digital library requires robust technological and institutional infrastructures that support both the needs of individual institutions alongside the needs of the growing partnership and ensure continuous communication and development of the shared vision for the digital…

The densities of accessible final states for calculations of multi-step compound reactions are derived. The Pauli exclusion principle is taken into account in the calculations. The results are compared with a previous author's results and the effect of the Pauli exclusion principle is investigated. (Author)

This paper discusses a virtual world for presenting multi-media information and for natural interactions with the environment to get access to this information. Apart from mouse and keyboard input, interactions take place using speech and language. It is shown how this virtual environment can be

At the time of this reporting, there are 2,589 rich mobile devices used at JPL, including 1,550 iPhones and 968 Blackberrys. Considering a total JPL population of 5,961 employees, mobile applications have a total addressable market of 43 percent of the employees at JPL, and that number is rising. While it was found that no existing desktop tools can realistically be replaced by a mobile application, there is certainly a need to improve access to these desktop tools. When an alarm occurs and an engineer is away from his desk, a convenient means of accessing relevant data can save an engineer a great deal of time and improve his job efficiency. To identify which data is relevant, an engineer benefits from a succinct overview of the data housed in 13+ tools. This need can be well met by a single, rich, mobile application that provides access to desired data across tools in the ops infrastructure.

In this paper, a novel predictive maintenance policy with multi-level decision-making is proposed for multi-component systems with complex structure. The main idea is to propose a decision-making process considered on two levels: the system level and the component level. The goal of the decision rules at the system level is to determine whether preventive maintenance actions are needed, based on the predictive reliability of the system. At the component level, the decision rules aim at optimally identifying a group of several components to be preventively maintained when preventive maintenance is triggered by the system-level decision. Selecting optimal components is based on a cost-based group improvement factor taking into account the predictive reliability of the components, the economic dependencies, and the location of the components in the system. Moreover, a cost model is developed to find the optimal maintenance decision variables. A 14-component system is finally introduced to illustrate the use and the performance of the proposed predictive maintenance policy. Different sensitivity analyses are also investigated and discussed. Indeed, the proposed policy provides more flexibility in maintenance decision-making for complex structure systems, hence leading to significant profits in terms of maintenance cost when compared with existing policies. - Highlights: • A predictive maintenance policy for complex structure systems is proposed. • A multi-level decision process based on prognostic results is proposed. • A cost-based group importance measure is introduced for decision-making. • Both positive and negative dependencies between components are investigated. • A cost model and Monte Carlo simulation are developed for the optimization process.
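
The two-level decision process can be sketched as follows; the reliabilities, costs, and the scoring rule are hypothetical stand-ins for the paper's cost-based group improvement factor:

```python
# Sketch of two-level predictive maintenance: the system level triggers
# an intervention when predicted system reliability drops below a
# threshold, and the component level then picks the group with the best
# reliability-gain-per-cost score. Numbers and scoring are invented.

from itertools import combinations

reliab = {"pump": 0.60, "valve": 0.75, "motor": 0.95}   # predicted reliabilities
repl_cost = {"pump": 5.0, "valve": 4.0, "motor": 6.0}
SETUP = 10.0            # shared set-up cost, paid once per intervention
THRESHOLD = 0.55        # system-level trigger (series system)

def system_reliability(r):
    p = 1.0
    for v in r.values():
        p *= v          # series structure: all components must work
    return p

def group_score(group):
    # Reliability gained per unit cost; the shared set-up cost is what
    # makes maintaining several components together attractive.
    gain = sum(1.0 - reliab[c] for c in group)
    cost = SETUP + sum(repl_cost[c] for c in group)
    return gain / cost

if system_reliability(reliab) < THRESHOLD:   # 0.60*0.75*0.95 = 0.4275
    groups = [g for k in (1, 2, 3) for g in combinations(reliab, k)]
    best = max(groups, key=group_score)

assert best == ("pump", "valve")
```

Grouping the two least reliable components wins here because they share one set-up cost, a simple analogue of the positive economic dependency the record describes.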

Being accessible to all categories of users is one of the primary factors enabling the wider reach of resources published through the World Wide Web. The accessibility of websites has been analyzed against the W3C guidelines with the help of various tools. This paper presents a multi-tool accessibility assessment of government department websites belonging to the Indian state of Jammu and Kashmir. A comparative analysis of six accessibility tools is also presented with 14 different parameters. The accessibility analysis tools used in this study are aChecker, Cynthia Says, Tenon, WAVE, MAUVE, and Hera. These tools provide the accessibility status of the selected websites against the Web Content Accessibility Guidelines (WCAG) 1.0 and 2.0. It was found that there are variations in accessibility analysis results when using different accessibility metrics to measure the accessibility of websites. In addition, we have identified the guidelines which have frequently been violated. It was observed that there is a need to incorporate accessibility features among the selected websites. This paper presents a set of suggestions to improve the accessibility status of these sites so that the information and services provided by them reach a wider spectrum of audience without any barrier. Implications for rehabilitation: The following points indicate that this case study of JKGAD websites falls under rehabilitation focused on visually impaired users. Due to the universal nature of the web, it should be accessible to all according to the WCAG guidelines framed by the World Wide Web Consortium. In this paper we have identified multiple accessibility barriers for persons with visual impairment while browsing the Jammu and Kashmir Government websites. Multi-tool analysis has been done to pinpoint the potential barriers for persons with visual impairment. Usability analysis has been performed to check whether these websites are suitable

Various potential architectures of branching units for multi-core fiber undersea transmission systems are presented. It is also investigated how different architectures of branching unit influence the number of fibers and those of inline components.

Full Text Available The task of administering a software-information complex occurs during the development of application systems for managing business processes, and is connected with the organization of access for users to information resources in the conditions of multi-user information systems for management. For the solution of this problem, an approach is proposed which is based on a hierarchical system of access rights to information resources on three levels: tool, object and procedural. Keywords: software-information complex, information resources, administering, permissions, separation of powers, access model.

Full Text Available The project presented here is a first step towards building a more accessible world through Payment Systems and a successful implementation of a User Centred Design. By means of a beep-system, a Point of Sale (POS) payment device informs the user of those transaction steps that require his/her attention at the moment of payment, such as when: the card has been successfully read, the Personal Identification Number (PIN) must be entered, the transaction has been successfully processed, and the transaction has not been completed due to an error. The proposed solution increases the personal autonomy and security of blind people when paying at a merchant.

Full Text Available In this paper, for multi-rate wireless local area networks (WLANs), a modified Medium Access Control (MAC) protocol, called Modified Cooperative Access with Relay's Data (MCARD), based on a directional antenna using half-wavelength dipoles in a Uniform Circular Array (UCA) topology, is proposed. MCARD gives remote stations the chance to send their information to the Access Point (AP) at a higher data rate by using intermediate stations (relays) with a practical antenna. Under MCARD, a relay station transmits its own information before forwarding information from the source station, because it uses a directional antenna. Analytical results and simulations show that MCARD can significantly improve system quality of service (QoS) in terms of throughput under different channel conditions.
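
Why two-hop relaying can outperform a slow direct link follows from simple rate arithmetic; the rates below are illustrative 802.11-style values, not results from the MCARD study:

```python
# Back-of-envelope check of why relaying helps in a multi-rate WLAN:
# a two-hop path at high per-hop rates can beat a slow direct link.
# The rate values are illustrative, not from the MCARD paper.

def two_hop_rate(r1, r2):
    # Shipping one bit over two hops takes 1/r1 + 1/r2 time units, so
    # the effective end-to-end rate is the harmonic combination.
    return 1.0 / (1.0 / r1 + 1.0 / r2)

direct = 1.0                        # Mbps, distant station's direct rate
relayed = two_hop_rate(11.0, 11.0)  # Mbps via a nearby relay

assert abs(relayed - 5.5) < 1e-9
assert relayed > direct
```

The harmonic combination also shows the limit of the idea: if either hop is slow, the end-to-end rate is dragged down toward it, so relay selection matters as much as relaying itself.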

By 2004, microprocessor design focused on multicore scaling—increasing the number of cores per die in each generation—as the primary strategy for improving performance. These multicore processors typically equip multiple memory subsystems to improve data throughput. In addition, these systems employ heterogeneous processors such as GPUs and heterogeneous memories like non-volatile memory to improve performance, capacity, and energy efficiency. With the increasing volume of hardware resources and system complexity caused by heterogeneity, future systems will require intelligent ways to manage hardware resources. Early research to improve performance and energy efficiency on heterogeneous, multi-core, multi-memory systems focused on tuning a single primitive or at best a few primitives in the systems. The key limitation of past efforts is their lack of a holistic approach to resource management that balances the tradeoff between performance and energy consumption. In addition, the shift from simple, homogeneous systems to these heterogeneous, multicore, multi-memory systems requires in-depth understanding of efficient resource management for scalable execution, including new models that capture the interchange between performance and energy, smarter resource management strategies, and novel low-level performance/energy tuning primitives and runtime systems. Tuning an application to control available resources efficiently has become a daunting challenge; managing resources in automation is still a dark art since the tradeoffs among programming, energy, and performance remain insufficiently understood. In this dissertation, I have developed theories, models, and resource management techniques to enable energy-efficient execution of parallel applications through thread and data management in these heterogeneous multi-core, multi-memory systems. I study the effect of dynamic concurrent throttling on the performance and energy of multi-core, non-uniform memory access

Full Text Available The article is devoted to a method of information-flow-based access control, adapted for virtualized systems. The general structure of an access control system for virtual infrastructure is proposed.

An internet-accessible real-time weather information system has been developed. This system provides real-time access to weather information from a multitude of spatially distributed weather stations. The Internet connectivity also offers...

In order to accommodate the increasing number of computerized subsystems aboard today's more fuel efficient aircraft, the Boeing Co. has developed the DATAC (Digital Autonomous Terminal Access Control) bus to minimize the need for point-to-point wiring to interconnect these various systems, thereby reducing total aircraft weight and maintaining an economical flight configuration. The DATAC bus is essentially a local area network providing interconnections for any of the flight management and control systems aboard the aircraft. The task of developing a Bus Monitor Unit was broken down into four subtasks: (1) providing a hardware interface between the DATAC bus and the Z8000-based microcomputer system to be used as the bus monitor; (2) establishing a communication link between the Z8000 system and a CP/M-based computer system; (3) generation of data reduction and display software to output data to the console device; and (4) development of a DATAC Terminal Simulator to facilitate testing of the hardware and software which transfer data between the DATAC's bus and the operator's console in a near real time environment. These tasks are briefly discussed.

Nowadays, access control is an indispensable part of the Personal Health Record, and it preserves confidentiality by enforcing policies and rules that ensure only authorized users gain access to requested resources in the system. In other words, access control means protecting patient privacy in healthcare systems. Attribute-Based Access Control (ABAC) is a newer access control model that can be used instead of traditional models such as Discretionary Access Control, Mandatory Access Control, and Role-Based Access Control. During the last five years, ABAC has found applications in both academic research and industry. ABAC makes a decision on an access request according to the attributes of users and resources. In this paper, we propose an ABAC framework for healthcare systems. We use the ABAC engine for rendering and enforcing healthcare policies. Moreover, we handle emergency situations in this framework.
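
A minimal sketch of the kind of decision engine ABAC implies, with an illustrative break-glass rule for the emergency situations the abstract mentions (the policy structure and attribute names are assumptions, not the paper's framework):

```python
# Minimal sketch of an Attribute-Based Access Control (ABAC) decision
# engine: a request is permitted if all conditions of some policy hold.
# Policies and attributes below are illustrative healthcare examples.

def evaluate(policies, subject, resource, action, environment):
    """Return True only if some policy fully matches and permits."""
    request = {"subject": subject, "resource": resource,
               "action": action, "environment": environment}
    for policy in policies:
        if all(cond(request) for cond in policy["conditions"]):
            return policy["effect"] == "permit"
    return False  # default-deny when no policy matches

# Example policies: a physician may read records of patients in their
# ward, and anyone may read during a declared emergency (break-glass).
policies = [
    {"effect": "permit", "conditions": [
        lambda r: r["environment"].get("emergency", False),
        lambda r: r["action"] == "read"]},
    {"effect": "permit", "conditions": [
        lambda r: r["subject"]["role"] == "physician",
        lambda r: r["subject"]["ward"] == r["resource"]["ward"],
        lambda r: r["action"] == "read"]},
]
```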

We provide a detailed description of the Jason-DTU system, including the methodology and tools used as well as the team strategy. We also discuss the experience gathered in the contest. In spring 2009 the course "Artificial Intelligence and Multi-Agent Systems" was held for the first time at the Technical University of Denmark (DTU). A part of this course was a short introduction to the multi-agent framework Jason, which is an interpreter for AgentSpeak, an agent-oriented programming language. As the final project in this course, a solution to the Multi-Agent Programming Contest from 2007, the Gold...

The NASA Kepler spacecraft has detected 170 candidate multi-planet systems in the first two quarters of data released in February 2011 by Borucki et al. (2011). These systems comprise 115 double candidate systems, 45 triple candidate systems, and 10 systems with 4 or more candidate planets. The architecture and dynamics of these systems were discussed by Lissauer et al. (2011), and a comparison of candidates in single- and multi-planet systems was presented by Latham et al. (2011). Proceeding from "planetary candidate" systems to confirmed and validated multi-planet systems is a difficult process, as most of these systems orbit stars too faint to obtain extremely precise (1 m/s) radial velocity confirmation. Here, we discuss in detail the use of transit timing variations (cf. e.g. Holman et al., 2010) to confirm planets near a mean motion resonance. We also discuss extensions to the BLENDER validation (Torres et al., 2004, 2011; Fressin et al., 2011) to validate planets in multi-planet systems. Kepler was competitively selected as the tenth Discovery mission. Funding for the Kepler Mission is provided by NASA's Science Mission Directorate. We are deeply grateful for the very hard work of the entire Kepler team.

Nowadays, customized assistive technology (AT) software solutions allow their users to interact with various kinds of computer systems. Such tools are generally available on the personal devices (e.g., smartphones and laptops) commonly used by a person with a disability. In this paper, we investigate a way of using this AT equipment to access many different devices that lack assistive features of their own. The solution takes advantage of open-source hardware, and its core component is an affordable Linux embedded system: it grabs data coming from the assistive software, which runs on the user's personal device, and, after processing, it generates native keyboard and mouse HID commands for the target computing device controlled by the end user. This process supports any operating system available on the target machine and requires no specialized software installation; therefore the user with a disability can rely on a single assistive tool to control a wide range of computing platforms, including conventional computers and many kinds of mobile devices that receive input commands through the USB HID protocol.
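
The gateway's output stage can be illustrated with the standard USB HID boot-keyboard report format; the helper below composes the 8-byte report such an embedded system might emit on behalf of the assistive software (the function names are hypothetical, and the key mapping covers letters only):

```python
# Sketch of building an 8-byte USB HID boot-keyboard input report:
# byte 0 = modifier bits, byte 1 = reserved, bytes 2-7 = up to six
# simultaneously pressed key usage IDs. Letter usage IDs follow the
# HID Usage Tables ('a' = 0x04 through 'z' = 0x1D).

USAGE_IDS = {c: 0x04 + i for i, c in enumerate("abcdefghijklmnopqrstuvwxyz")}
MOD_LEFT_SHIFT = 0x02  # left-shift modifier bit

def keyboard_report(char):
    """Return the 8-byte report for a single letter key press."""
    modifiers = MOD_LEFT_SHIFT if char.isupper() else 0x00
    usage = USAGE_IDS[char.lower()]
    return bytes([modifiers, 0x00, usage, 0, 0, 0, 0, 0])
```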

We describe the approach used to develop the multi-agent system of herders that competed as the Jason-DTU team at the Multi-Agent Programming Contest 2010. We also participated in 2009 with a system developed in the agent-oriented programming language Jason, which is an extension of AgentSpeak. We used the implementation from 2009 as a foundation, and therefore much of the work done this year was on improving that implementation. We present a description which includes the design and analysis of the system as well as the main features of our agent team strategy. In addition we discuss...

The evolution of computer networking technology has led to the establishment of a massive networking infrastructure which interconnects various types of computing resources at many government, academic, and corporate institutions. A large segment of this infrastructure has been developed to facilitate information exchange and resource sharing within the scientific community. The National Aeronautics and Space Administration (NASA) supports both the development and the application of computer networks which provide its community with access to many valuable multi-disciplinary scientific information systems and on-line databases. Recognizing the need to extend the benefits of this advanced networking technology to the under-represented community, the National Space Science Data Center (NSSDC) in the Space Data and Computing Division at the Goddard Space Flight Center has developed the Minority University-Space Interdisciplinary Network (MU-SPIN) Program: a major networking and education initiative for Historically Black Colleges and Universities (HBCUs) and Minority Universities (MUs). In this paper, we will briefly explain the various components of the MU-SPIN Program while highlighting how, by providing access to scientific information systems and on-line data, it promotes a higher level of collaboration among faculty, students, and NASA scientists.

This contribution is timely, as it addresses accessibility with regard to system hardware and software, aligned with the introduction of the Twenty-First Century Communications and Video Accessibility Act (CVAA) and the adjoined game industry waiver that comes into force in January 2017. This is an act created by the USA Federal Communications Commission (FCC) to increase the access of persons with disabilities to modern communications, and for other purposes. The act impacts advanced communications services and products including text messaging, e-mail, instant messaging, video communications, browsers, game platforms, and games software. However, the CVAA has no legal status in the EU. This text succinctly introduces and questions its implications, impact, and wider adoption. By presenting the full CVAA and game industry waiver, the text aims to motivate discussions and further publications on the subject...

Multi-agent systems are promising as models of organization because they are based on the idea that most work in human organizations is done based on intelligence, communication, cooperation, and massive parallel processing. They offer an alternative for system theories of organization, which are

The Semantic web has given a great deal of impetus to the development of ontologies and multi-agent systems. Several books have appeared which discuss the development of ontologies or of multi-agent systems separately on their own. The growing interaction between agents and ontologies has highlighted the need for integrated development of these. This book is unique in being the first to provide an integrated treatment of the modeling, design and implementation of such combined ontology/multi-agent systems. It provides clear exposition of this integrated modeling and design methodology. It further illustrates this with two detailed case studies in (a) the biomedical area and (b) the software engineering area. The book is, therefore, of interest to researchers, graduate students and practitioners in the semantic web and web science area. (orig.)

An integrated user access control method was proposed to address the issues of security and management in networked manufacturing systems (NMS). Based on an analysis of the security issues in networked manufacturing systems, an integrated user access control method composed of role-based access control (RBAC), task-based access control (TBAC), relationship-driven access control (RDAC) and coalition-based access control (CBAC) was proposed, including the hierarchical user relationship model, the reference model and the process model. The elements and their relationships were defined, and the expressions of authorization constraints were given. The eXtensible Access Control Markup Language (XACML) was used to implement this method. The method was applied in the networked manufacturing system of the Shaoxing spinning region of China. The results show that the integrated user access control method can reduce the costs of system security maintenance and management.
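
A toy sketch of how the four models might be layered into a single integrated decision, requiring every check to agree (the data structures and field names are assumptions for illustration, not the proposed reference model):

```python
# Illustrative layering of the four checks the abstract names: role
# (RBAC), task (TBAC), relationship (RDAC) and coalition (CBAC).
# Access is granted only when every individual model agrees.

def authorize(user, permission, context):
    checks = [
        # RBAC: the user's role carries the requested permission.
        permission in context["role_permissions"].get(user["role"], set()),
        # TBAC: the user is assigned to the currently active task.
        context["active_task"] in user["assigned_tasks"],
        # RDAC: the user has a relationship with the requesting partner.
        context["partner"] in user["relationships"],
        # CBAC: the user's coalition is allowed in this context.
        user["coalition"] in context["allowed_coalitions"],
    ]
    return all(checks)
```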

Nigerian Journal of Technology ... identity management and access control and the unavailability of actionable information on pattern of ... This Tertiary Identity and Access Management System (T-IAMS) is a fingerprint biometric database that ...

Methodological Guidelines for Modeling and Developing MAS-Based Simulations. The intersection of agents, modeling, simulation, and application domains has been the subject of active research for over two decades. Although agents and simulation have been used effectively in a variety of application domains, much of the supporting research remains scattered in the literature, too often leaving scientists to develop multi-agent system (MAS) models and simulations from scratch. Multi-Agent Systems: Simulation and Applications provides an overdue review of the wide-ranging facets of MAS simulation...

Coupled-bunch instabilities excited by the interaction of the particle beam with its surroundings can seriously limit the performance of circular particle accelerators. These instabilities can be cured by the use of active feedback systems based on sensors capable of detecting the unwanted beam motion and actuators that apply the feedback correction to the beam. Advances in electronic technology now allow the implementation of feedback loops using programmable digital systems. Besides importa...

In nuclear spectroscopy applications, it is often desired to acquire data at a high rate with high resolution. With the availability of low-cost computers, it is possible to build a powerful data acquisition system with minimal hardware and software development by designing a PC plug-in acquisition board. But when the PC processor is used for data acquisition, the PC cannot be used as a multitasking node. Keeping this in view, PC plug-in acquisition boards with an on-board processor find tremendous application. A transputer-based data acquisition board has been designed which can be configured as a high count rate pulse-height MCA or as a Multi Spectral Scaler. Multi Spectral Scaling (MSS) is a new technique in which multiple spectra are acquired in small time frames and are then analyzed. This paper describes the details of this multi spectral scaling data acquisition system. 2 figs

Increasing archives of global satellite data present a new challenge to handle multi-source satellite data in a user-friendly way. Any user is confronted with different data formats and data access services. In addition the handling of time-series data is complex as an automated processing and execution of data processing steps is needed to supply the user with the desired product for a specific area of interest. In order to simplify the access to data archives of various satellite missions and to facilitate the subsequent processing, a regional data and processing middleware has been developed. The aim of this system is to provide standardized and web-based interfaces to multi-source time-series data for individual regions on Earth. For further use and analysis uniform data formats and data access services are provided. Interfaces to data archives of the sensor MODIS (NASA) as well as the satellites Landsat (USGS) and Sentinel (ESA) have been integrated in the middleware. Various scientific algorithms, such as the calculation of trends and breakpoints of time-series data, can be carried out on the preprocessed data on the basis of uniform data management. Jupyter Notebooks are linked to the data and further processing can be conducted directly on the server using Python and the statistical language R. In addition to accessing EO data, the middleware is also used as an intermediary between the user and external databases (e.g., Flickr, YouTube). Standardized web services as specified by OGC are provided for all tools of the middleware. Currently, the use of cloud services is being researched to bring algorithms to the data. As a thematic example, an operational monitoring of vegetation phenology is being implemented on the basis of various optical satellite data and validation data from the German Weather Service. Other examples demonstrate the monitoring of wetlands focusing on automated discovery and access of Landsat and Sentinel data for local areas.

HIAPER Pole-to-Pole Observations (HIPPO) was an NSF- and NOAA-funded, multi-year global airborne research project to survey the latitudinal and vertical distribution of greenhouse and related gases, and aerosols. Project scientists and support staff flew five month-long missions over the Pacific Basin on the NSF/NCAR Gulfstream V, High-performance Instrumented Airborne Platform for Environmental Research (HIAPER) aircraft between January 2009 and September 2011, spread throughout the annual cycle, from the surface to 14 km in altitude, and from 87°N to 67°S. Data from the HIPPO study of greenhouse gases and aerosols are now available to the atmospheric research community and the public. This comprehensive dataset provides the first high-resolution vertically resolved measurements of over 90 unique atmospheric species from nearly pole-to-pole over the Pacific Ocean across all seasons. The suite of atmospheric trace gases and aerosols is pertinent to understanding the carbon cycle and challenging global climate models. This dataset will provide opportunities for research across a broad spectrum of Earth sciences, including those analyzing the evolution in time and space of the greenhouse gases that affect global climate. The Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL) provides data management support for the HIPPO experiment, including long-term data storage and dissemination. CDIAC has developed a relational database to house HIPPO merged 10-second meteorology, atmospheric chemistry, and aerosol data. This data set provides measurements from all Missions, 1 through 5, that took place from January 2009 to September 2011. This presentation introduces the newly built database and web interface, reflects the present state and functionality of the HIPPO Database and Exploration System, and outlines future plans for expansion and inclusion of combined discrete flask and GC sample GHG, halocarbon, and hydrocarbon data.

Background Advances in high-throughput technologies have enabled extensive generation of multi-level omics data. These data are crucial for systems biology research, though they are complex, heterogeneous, highly dynamic, incomplete and distributed among public databases. This leads to difficulties in data accessibility and often results in errors when data are merged and integrated from varied resources. Therefore, integration and management of systems biological data remain very challenging...

Objective. People who suffer from hearing impairments can find it difficult to follow a conversation in a multi-speaker environment. Current hearing aids can suppress background noise; however, there is little that can be done to help a user attend to a single conversation amongst many without knowing which speaker the user is attending to. Cognitively controlled hearing aids that use auditory attention decoding (AAD) methods are the next step in offering help. Translating the successes in AAD research to real-world applications poses a number of challenges, including the lack of access to the clean sound sources in the environment with which to compare with the neural signals. We propose a novel framework that combines single-channel speech separation algorithms with AAD. Approach. We present an end-to-end system that (1) receives a single audio channel containing a mixture of speakers that is heard by a listener along with the listener’s neural signals, (2) automatically separates the individual speakers in the mixture, (3) determines the attended speaker, and (4) amplifies the attended speaker’s voice to assist the listener. Main results. Using invasive electrophysiology recordings, we identified the regions of the auditory cortex that contribute to AAD. Given appropriate electrode locations, our system is able to decode the attention of subjects and amplify the attended speaker using only the mixed audio. Our quality assessment of the modified audio demonstrates a significant improvement in both subjective and objective speech quality measures. Significance. Our novel framework for AAD bridges the gap between the most recent advancements in speech processing technologies and speech prosthesis research and moves us closer to the development of cognitively controlled hearable devices for the hearing impaired.
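
The attention-decoding step (3) is commonly realized by correlating a neurally reconstructed speech envelope with the envelope of each separated speaker and selecting the best match; the sketch below shows that selection rule under this assumption (it is not the authors' exact decoder):

```python
# Sketch of envelope-based auditory attention decoding (AAD): given an
# envelope reconstructed from neural recordings, pick the separated
# speaker whose envelope correlates with it most strongly.

def pearson(x, y):
    # Plain Pearson correlation coefficient, pure Python.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def attended_speaker(reconstructed, speaker_envelopes):
    """Index of the separated speaker whose envelope best matches the
    neurally reconstructed envelope."""
    scores = [pearson(reconstructed, env) for env in speaker_envelopes]
    return max(range(len(scores)), key=scores.__getitem__)
```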

We consider deterministic dynamic systems with state space representations which are dissipative in the sense of Willems (1972) with respect to several supply rates. This property is of interest in robustness analysis and in multi-objective control. We give conditions under which the convex cone...

The author introduces methods for designing the hardware and software of a multi-function nuclear weight scale system, based on an RS-485 communication protocol between a master (an industrial control computer 386) and a slave (a single-chip 8098), and describes its main functions.

In this position summary we present work in progress on a worldwide, scalable multi-agent system, based on a paradigm of hyperlinked rooms. The framework offers facilities for managing distribution, security and mobility aspects for both active elements (agents) and passive elements (objects) in the

With the advent of new Radio Access Technologies (RATs), it is inevitable that several RATs will co-exist, especially in the license-exempt band. In this letter, we present an in-depth adaptation of the proactive time-rearrangement (PATRA) scheme for IEEE 802.11 WLAN. The PATRA is a time division approach for reducing interference from a multi-radio device. Because IEEE 802.11 is based on carrier sensing and contention mechanism, it is the most suitable candidate to adapt the PATRA.

Efficient query processing in high-dimensional search spaces is an important requirement for many analysis tools. In the literature on index data structures one can find a wide range of methods for optimising database access. In particular, bitmap indices have recently gained substantial popularity in data warehouse applications with large amounts of read-mostly data. Bitmap indices are implemented in various commercial database products and are used for querying typical business applications. However, scientific data, which is mostly characterised by non-discrete attribute values, cannot be queried efficiently by the techniques currently supported. In this thesis we propose a novel access method based on bitmap indices that efficiently handles multi-dimensional queries against typical scientific data. The algorithm is called GenericRangeEval and is an extension of a bitmap index for discrete attribute values. By means of a cost model we study the performance of queries with various selectivities against uniform...
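
The binning-plus-candidate-check strategy behind bitmap indices for continuous attributes can be sketched as follows (this is the generic technique, not the GenericRangeEval algorithm itself):

```python
# Sketch of a one-sided range query over binned bitmap indices for a
# continuous attribute: bins fully below the threshold are answered from
# the bitmaps alone; only the single boundary bin needs a candidate
# check against the raw values.

def build_bitmaps(values, bin_edges):
    """One bitmap per bin; bit i is set when values[i] falls in the bin."""
    bitmaps = []
    for lo, hi in zip(bin_edges, bin_edges[1:]):
        bitmaps.append([lo <= v < hi for v in values])
    return bitmaps

def range_query(values, bitmaps, bin_edges, threshold):
    """Return the row ids whose value is strictly below the threshold."""
    hits = set()
    bins = zip(bin_edges, bin_edges[1:])
    for (lo, hi), bm in zip(bins, bitmaps):
        if hi <= threshold:  # bin lies entirely below the threshold
            hits.update(i for i, b in enumerate(bm) if b)
        elif lo < threshold:  # boundary bin: re-check candidates
            hits.update(i for i, b in enumerate(bm)
                        if b and values[i] < threshold)
    return hits
```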

Operating systems rely heavily on access control mechanisms to achieve security goals and defend against remote and local attacks. The complexities of modern access control mechanisms and the scale of policy configurations are often overwhelming to system administrators and software developers. Therefore, mis-configurations are common, and the…

A performance optimization based on the optimal switching threshold(s) for a multi-branch switched diversity system is discussed in this paper. For the conventional multi-branch switched diversity system with a single switching threshold

... National Instant Criminal Background Check System § 25.6 Accessing records in the system. (a) FFLs may... through the NCIC communication network. Upon receiving a request for a background check from an FFL, a POC...

Full Text Available The purpose of this paper is to examine general problems of information security in access control systems. The field of application is a reconstruction project for a physical protection system.

System Hardening Architecture for Safer Access to Critical Business Data. ... and the threat is growing faster than the potential victims can deal with. ... in this architecture are applied to the host, application, operating system, user, and the ...

Full Text Available The high bandwidth demand of Internet applications has recently driven the need to increase residential download speeds. A practical solution to the problem is to aggregate the backhaul bandwidth of the 802.11 access points (APs) in range. Since 802.11 devices are usually single-radio, communicating with APs on different radio channels requires a time-division multiple access (TDMA) policy at the client station. With an in-depth experimental analysis and a customized 802.11 driver, in this paper, we show that the usage of a multi-AP TDMA policy may cause degradation of the TCP throughput and an underutilization of the AP backhauls. We then introduce a simple analytical model that accurately predicts the TCP round-trip time (RTT) under a multi-AP TDMA policy and propose a resource allocation algorithm to reduce the observed TCP RTT at a very low computational cost. Our proposed scheme runs locally at the client station and improves the aggregate throughput up to 1.5 times compared to state-of-the-art allocations. We finally show that the throughput achieved by our algorithm is very close to the theoretical upper bound in key simulation scenarios.
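
An idealized model of the multi-AP TDMA tradeoff: if the single radio grants each AP a time share proportional to its backhaul rate, no backhaul starves while the radio is busy elsewhere. The model below is a deliberate simplification for illustration, not the paper's allocation algorithm:

```python
# Toy model of a single-radio client time-sharing several APs. Per-AP
# throughput is limited by both the AP's backhaul rate and the fraction
# of the WiFi airtime the client grants to that AP.

def time_shares(backhaul_rates):
    """Airtime fraction per AP, proportional to its backhaul rate."""
    total = sum(backhaul_rates)
    return [r / total for r in backhaul_rates]

def aggregate_throughput(backhaul_rates, wifi_rate):
    """Sum over APs of min(backhaul rate, granted WiFi capacity)."""
    shares = time_shares(backhaul_rates)
    return sum(min(r, s * wifi_rate)
               for r, s in zip(backhaul_rates, shares))
```

With a fast enough radio the aggregate equals the sum of the backhauls; when the radio is the bottleneck, proportional shares still use its full capacity.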

Multi-beam directional systems are a novel approach to networking which leverage recent advances in physical layer technology, allowing formation of...for a programmatic method for setting up emulation experiments. Rather than hard code all of the underlying pieces for EMANE (such as the over-the-air

Autonomous formation control of multi-agent dynamic systems has a number of applications that include ground-based and aerial robots and satellite formations. For air vehicles, formation flight ("flocking") has the potential to significantly increase airspace utilization as well as fuel efficiency. This presentation addresses two main problems in multi-agent formations: optimal role assignment to minimize the total cost (e.g., combined distance traveled by all agents); and maintaining formation geometry during flock motion. The Kuhn-Munkres ("Hungarian") algorithm is used for optimal assignment, and a consensus-based leader-follower type control architecture is used to maintain formation shape despite the leader's independent movements. The methods are demonstrated by animated simulations.
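
The optimal role assignment step can be sketched with a brute-force stand-in for the Kuhn-Munkres algorithm, minimizing total Euclidean travel distance (adequate for small flocks; a real system would use an O(n^3) Hungarian implementation such as SciPy's linear_sum_assignment):

```python
# Sketch of optimal role assignment in a formation: give each agent the
# formation slot that minimizes the combined distance traveled by all
# agents. Exhaustive search over permutations stands in for Kuhn-Munkres.

from itertools import permutations

def assignment_cost(agents, slots, perm):
    # Total Euclidean distance when agent k moves to slots[perm[k]].
    return sum(((ax - slots[p][0]) ** 2 + (ay - slots[p][1]) ** 2) ** 0.5
               for (ax, ay), p in zip(agents, perm))

def optimal_assignment(agents, slots):
    """Return, for each agent, the index of its assigned slot."""
    best = min(permutations(range(len(slots))),
               key=lambda p: assignment_cost(agents, slots, p))
    return list(best)
```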

Full Text Available The Danube Regions, especially the sub-national units of governance, must be ready to play an active role in spatial development policies. A precondition for this is good accessibility and the coordinated development of all transport systems in the Danube corridor. The main contribution of this paper is to provide a multi-criteria model for potential decision making related to the evaluation of transportation accessibility in Serbia's Danube Corridor. Geographic Information System (GIS) maps indicate the existing counties' transport infrastructure inequities (between well-connected and isolated counties) in terms of accessibility to central places. Through the research, relevant indicators have been identified. This provides an outline of transportation perspectives regarding the development achieved and also fosters the increase of transportation accessibility in some peripheral Serbian Danube administrative units, i.e. counties (Nomenclature of Territorial Units for Statistics level 3 – NUTS 3).

Full Text Available The next generation surveillance and multimedia systems will become increasingly deployed as wireless sensor networks in order to monitor parks, public places and for business usage. The convergence of data and telecommunication over IP-based networks has paved the way for wireless networks. Functions are becoming more intertwined by the compelling force of innovation and technology. For example, many closed-circuit TV premises surveillance systems now rely on transmitting their images and data over IP networks instead of standalone video circuits. These systems will increase their reliability in the future on wireless networks and on IEEE 802.11 networks. However, due to limited non-overlapping channels, delay, and congestion there will be problems at sink nodes. In this paper we provide necessary conditions to verify the feasibility of round robin technique in these networks at the sink nodes by using a technique to regulate multi-radio multichannel assignment. We demonstrate through simulations that dynamic channel assignment scheme using multi-radio, and multichannel configuration at a single sink node can perform close to optimal on the average while multiple sink node assignment also performs well. The methods proposed in this paper can be a valuable tool for network designers in planning network deployment and for optimizing different performance objectives.


This paper describes the system engineering approach used to develop distributed multi-purpose simulations. The multi-purpose simulation architecture focuses on user needs, operations, flexibility, cost and maintenance. This approach was used to develop an International Space Station (ISS) simulator, which is called the International Space Station Integrated Simulation (ISIS). The ISIS runs unmodified ISS flight software, system models, and the astronaut command and control interface in an open system design that allows for rapid integration of multiple ISS models. The initial intent of ISIS was to provide a distributed system that allows access to ISS flight software and models for the creation, test, and validation of crew and ground controller procedures. This capability reduces the cost and scheduling issues associated with utilizing standalone simulators in fixed locations, and facilitates discovering unknowns and errors earlier in the development lifecycle. Since its inception, the flexible architecture of the ISIS has allowed its purpose to evolve to include ground operator system and display training, flight software modification testing, and serving as a realistic test bed for Exploration automation technology research and development.

The security challenges being encountered in many places today require electronic means of controlling access to secured premises in addition to the available security personnel. Various technologies were used in different forms to solve these challenges. The Radio Frequency Identification (RFID) Based Access Control Security system with GSM technology presented in this work helps to prevent unauthorized access to controlled environments (secured premises). This is achieved mainly...

The widespread use of advanced information systems such as Material Requirements Planning (MRP) has significantly altered the practice of dependent demand inventory management. Recent research has focused on the development of multi-level lot sizing heuristics for such systems. In this paper, we develop an optimal procedure for the multi-period, multi-product, multi-level lot sizing problem by modeling the system as a constrained generalized network with fixed-charge arcs and side constraints. T...

e-Government processes are facilitated and conducted using workflow management systems. Role-based access control (RBAC) is recognised as an efficient access control model for large organisations. The application of RBAC in workflow systems cannot, however, grant permissions to users dynamically while business processes are being executed. We currently observe a move away from predefined, strict workflow modelling towards approaches supporting flexibility at the organisational level. One specific approach is that of task delegation. Task delegation is a mechanism that supports organisational flexibility and ensures delegation of authority in access control systems. In this paper, we propose a Task-oriented Access Control (TAC) model based on RBAC to address these requirements. We aim to reason about tasks from both organisational and resource perspectives to analyse and specify authorisation constraints. Moreover, we present a fine-grained access control protocol to support delegation based on the TAC model.
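
A toy sketch of RBAC extended with task-scoped delegation, where a delegated permission is honoured only while the task is active and the delegator still holds a granting role (all names and structures are illustrative assumptions, not the TAC model itself):

```python
# Sketch of RBAC plus task-oriented delegation: a user is permitted
# either directly through a role, or through a delegation that is tied
# to a currently active task and backed by the delegator's own roles.

ROLE_PERMS = {"manager": {"approve"}, "clerk": {"file"}}

def permitted(user, perm, users_roles, delegations, active_tasks):
    """delegations: iterable of (delegator, delegatee, permission, task)."""
    # Direct RBAC grant through one of the user's roles.
    if any(perm in ROLE_PERMS.get(r, set())
           for r in users_roles.get(user, ())):
        return True
    # Delegated grant: valid only while the task is active and the
    # delegator still holds a role carrying the permission.
    for delegator, delegatee, d_perm, task in delegations:
        if (delegatee == user and d_perm == perm and task in active_tasks
                and any(d_perm in ROLE_PERMS.get(r, set())
                        for r in users_roles.get(delegator, ()))):
            return True
    return False
```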

A multi-function corrosion monitoring system has been designed for installation into DST 241-AN-105 at the Hanford Site in fiscal year 1999. The 241-AN-105 system is the third-generation corrosion monitoring system described by TTP RLO-8-WT-21. Improvements and upgrades from the second-generation system (installed in 241-AN-102) that have been incorporated into the third-generation system include: Gasket seating surfaces utilize O-rings instead of a washer-type gasket for an improved seal; Probe design contains an equally spaced array of 22 thermocouples; Probe design contains an adjustable verification thermocouple; Probe design contains three ports for pressure/gas sampling; Probe design contains one set of strain gauges to monitor probe flexure if flexure occurs; Probe utilizes an adjustable collar to allow depth adjustment of probe during installation; System is capable of periodically conducting LPR scans; System is housed in a climate controlled enclosure adjacent to the riser containing the probe; System uses wireless Ethernet links to send data to the Hanford Local Area Network; System uses commercial remote access software to allow remote command and control; and Above ground wiring uses driven shields to reduce external electrostatic noise in the data. These new design features have transformed what was primarily a second-generation corrosion monitoring system into a multi-function tank monitoring system that adds a great deal of functionality to the probe, provides for a better understanding of the relationship between corrosion and other tank operating parameters, and optimizes the use of the riser that houses the probe in the tank.

The complexity of the ATLAS experiment motivated the deployment of an integrated Access Control System in order to guarantee safe and optimal access for a large number of users to the various software and hardware resources. Such an integrated system was foreseen since the design of the infrastructure and is now central to the operations model. In order to cope with the ever growing needs of restricting access to all resources used within the experiment, the Roles Based Access Control (RBAC) previously developed has been extended and improved. The paper starts with a short presentation of the RBAC design, implementation and the changes made to the system to allow the management and usage of roles to control access to the vast and diverse set of resources. The RBAC implementation uses a directory service based on Lightweight Directory Access Protocol to store the users (∼3000), roles (∼320), groups (∼80) and access policies. The information is kept in sync with various other databases and directory services: human resources, central CERN IT, CERN Active Directory and the Access Control Database used by DCS. The paper concludes with a detailed description of the integration across all areas of the system.
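The directory-backed role resolution described above can be illustrated with a small in-memory stand-in for the LDAP tree. The DNs, attribute names, and role strings below are invented; a real system would query the directory service over LDAP rather than a dictionary.

```python
# Sketch of resolving a user's effective roles through group membership in
# an LDAP-like directory. Entries, DNs and attribute names are invented
# stand-ins for the real directory schema.

DIRECTORY = {
    "cn=jdoe,ou=users,dc=atlas":      {"memberOf": ["cn=shifters,ou=groups,dc=atlas"]},
    "cn=shifters,ou=groups,dc=atlas": {"roles": ["DCS:operator", "DAQ:expert"]},
}

def effective_roles(user_dn):
    """Collect the roles granted through every group the user belongs to."""
    roles = []
    for group_dn in DIRECTORY.get(user_dn, {}).get("memberOf", []):
        roles.extend(DIRECTORY.get(group_dn, {}).get("roles", []))
    return roles

print(effective_roles("cn=jdoe,ou=users,dc=atlas"))  # → ['DCS:operator', 'DAQ:expert']
```

Keeping users, groups, and roles as separate directory entries, as the ATLAS system does, lets access policies be updated without touching individual user records.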

A horizontal multi-purpose microbeam system with a single electrostatic quadruplet focusing lens has been developed at the Columbia University Radiological Research Accelerator Facility (RARAF). It is coupled with the RARAF 5.5 MV Singletron accelerator (High Voltage Engineering Europa, the Netherlands) and provides a micrometer-size beam for single-cell irradiation experiments. It is also used as the primary beam for neutron microbeam and microPIXE (particle-induced x-ray emission) experiments because of its high particle fluence. The optimization of this microbeam has been investigated with ray-tracing simulations and the beam spot size has been verified by different measurements.

Computational thermodynamics, also known as the Calphad method, is a standard tool in industry for developing materials and improving processes, and there is intense scientific development of new models and databases. The calculations are based on thermodynamic models of the Gibbs energy of each phase as a function of temperature, pressure and constitution. Model parameters are stored in databases that are developed in an international scientific collaboration. In this way, consistent and reliable data for many properties, such as heat capacity, chemical potentials and solubilities, can be obtained for multi-component systems. A brief introduction to this technique is given here and references to more extensive documentation are provided. (authors)
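The kind of Gibbs-energy model underlying such calculations can be illustrated with the simplest case, an ideal binary solution phase. The end-member Gibbs energies below are invented toy values; in real Calphad work all model parameters come from assessed databases.

```python
# Illustrative Calphad-style evaluation: molar Gibbs energy of an ideal
# binary solution, G = x_A*G_A + x_B*G_B + RT*(x_A ln x_A + x_B ln x_B).
# End-member energies are invented; real parameters come from databases.
import math

R = 8.314  # gas constant, J/(mol K)

def gibbs_ideal_binary(x_b, g_a, g_b, T):
    x_a = 1.0 - x_b
    mixing = R * T * (x_a * math.log(x_a) + x_b * math.log(x_b))
    return x_a * g_a + x_b * g_b + mixing

# Equimolar mixture at 1000 K with invented end-member values (J/mol):
print(round(gibbs_ideal_binary(0.5, -10000.0, -12000.0, 1000.0), 1))  # → -16762.8
```

Real phase models add excess terms (e.g. Redlich-Kister polynomials) to this ideal baseline, and equilibrium follows from minimizing the total Gibbs energy over all phases.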

Full Text Available The Southwest Research Institute (SwRI Mobile Autonomous Robotics Technology Initiative (MARTI program has enabled the development of fully-autonomous passenger-sized commercial vehicles and military tactical vehicles, as well as the development of cooperative vehicle behaviors, such as cooperative sensor sharing and cooperative convoy operations. The program has also developed behaviors to interface intelligent vehicles with intelligent road-side devices. The development of intelligent vehicle behaviors cannot be approached as stand-alone phenomena; rather, they must be understood within a context of the broader traffic system dynamics. The study of other complex systems has shown that system-level behaviors emerge as a result of the spatio-temporal dynamics within a system's constituent parts. The design of such systems must therefore account for both the system-level emergent behavior, as well as behaviors of individuals within the system. It has also become clear over the past several years, for both of these domains, that human trust in the behavior of individual vehicles is paramount to broader technology adoption. This paper examines the interplay between individual vehicle capabilities, vehicle connectivity, and emergent system behaviors, and presents some considerations for a distributed control paradigm in a multi-vehicle system.

The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code developed under the leadership of NASA Lewis Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines such as heat transfer, fluid mechanics, electrical circuits etc., without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines are investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multi-disciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.
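The idea of one reliability framework spanning several disciplines can be sketched with a toy Monte Carlo estimate for a series system whose failure modes come from different disciplines. The limit-state functions and the shared load variable below are invented stand-ins, and NESSUS itself uses fast probability integration rather than brute-force sampling.

```python
# Hedged sketch: Monte Carlo system reliability for a series system with
# structural, thermal and fluid-flow limit states g_i(x); the system fails
# if any g_i(x) < 0. Margins and distributions are invented toy values.
import random

random.seed(0)

def structural_margin(x): return 3.0 - x          # stress margin
def thermal_margin(x):    return 2.5 - 0.8 * x    # temperature margin
def flow_margin(x):       return 4.0 - 1.1 * x    # pressure-drop margin

def system_failure_prob(n=100_000):
    fails = 0
    for _ in range(n):
        x = random.gauss(0.0, 1.0)  # shared standard-normal load variable
        if min(structural_margin(x), thermal_margin(x), flow_margin(x)) < 0:
            fails += 1
    return fails / n

print(system_failure_prob())
```

Here the structural mode dominates (it fails first as the load grows), so the estimate is close to the single-mode probability P(x > 3) ≈ 0.00135, illustrating why progressive failure analysis tracks which mode governs.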

Full Text Available Most universities already operate wired and wireless networks that are used to access integrated information systems and the Internet. It is therefore worth examining how broadcasting instructional video through access points performs in a university setting. In a campus network, the wired segments require cables that connect computers and carry data between them, while wireless clients connect through radio waves. This research tests and assesses how a WLAN access point performs when instructional video is broadcast through it from a server to clients. The study aims to show how to build such a wireless network using an access point, and how to set up a video server whose instructional content is transmitted to clients by broadcasting through the access point.

A timing system has been constructed for multi-bunch/multi-train operation at KEK-ATF. The linac accelerates a multi-bunch beam of 20 bunches with 2.8 ns spacing. The Damping Ring stores up to 5 multi-bunch trains. The timing system is required to provide flexible operation modes and bucket selection. A personal computer is used for manipulating the timing. The performance of the kicker magnets at injection/extraction is a key issue for multi-train operation. The hardware and the test results are presented. (author)

...) system as part of the DoD Paper-Free Contracting Initiative. EDA contributes to the initiative by digitizing paper documents and offering web-based read-only access to official contracting, finance and accounting documents...

Several kinds of computer systems are used to perform Large Helical Device (LHD) experiments, and each produces its own data format. Therefore, it has been difficult to deal with these data simultaneously. In order to solve this problem, the Kaiseki server was developed; it has been facilitating the unified retrieval of LHD data. The data acquired or analyzed by various computer systems are converted into the unified ASCII format, or Kaiseki format, and transferred to the Kaiseki server. With this method, researchers can visualize and analyze the data produced by various kinds of computers in the same way. Because validations are needed before registering on the Kaiseki server, it takes time to make the validated data available. However, some researchers need data as soon as it is gathered in order to adjust their instruments during the experiments. To satisfy this requirement, a new visualization system has been under development. The new system has two ways to visualize the data as physical values from the raw data. If the conversion task is not complex, the NIFSscope, a visualization tool, converts the raw data into physics data by itself. If the task is too complex to handle, it asks the ANACalc server to make the physics data. When the ANACalc server receives a request, it delegates the conversion of the acquired data into physics data to calculation programs. Because the interfaces between the server and the calculation processes are independent of programming languages and operating systems, the calculation processes can be placed on different computers and the server load can be reduced. Therefore, the system can respond to changes in requirements by replacing the calculation programs, and can easily be expanded by increasing the number of calculation servers.

Multi-scale biomedical systems are those that represent interactions in materials, sensors, and systems from a holistic perspective. It is possible to view such multi-scale activity using measurement of spatial scale or time scale, though in this paper only the former is considered. The biomedical application paradigm comprises interactions that range from quantum biological phenomena at scales of 10^-12 for one individual to epidemiological studies of disease spread in populations that in a pandemic lead to measurement at a scale of 10^7. It is clear that there are measurement challenges at either end of this spatial scale, but those challenges that relate to the use of new technologies that deal with big data and health service delivery at the point of care are also considered. The measurement challenges lead to the use, in many cases, of model-based measurement and the adoption of virtual engineering. It is these measurement challenges that will be uncovered in this paper. (paper)

Data Aggregation (DA) is a set of functions that provide components of a distributed system access to global information for purposes of network management and user services. With the diverse new capabilities that networks can provide, the applicability of DA is growing. DA is useful in dealing with multi-value domain information and often requires…

Recently, in many countries, the electric utility industry has been undergoing considerable changes with regard to its structure and regulation. It can be clearly seen that the thrust towards privatization and deregulation or re-regulation of the electric utility industry will introduce numerous reliability problems that will require new criteria and analytical tools recognizing the residual uncertainties in the new environment. In this paper, different risks and uncertainties in competitive electricity markets are briefly introduced; the approach of customers, operators, planners, generation bodies and network providers to the reliability of the deregulated system is studied; the impact of dispersed generation on system reliability is evaluated; and finally, the reliability cost/reliability worth issues in the new competitive environment are considered.

The ever increasing number of vehicles in most metropolitan cities around the world and the limitation in altering the transportation infrastructure, led to serious traffic congestion and an increase in the travelling time. In this work we exploit the emergence of novel technologies such as the internet, to design an intelligent Traffic Management System (TMS) that can remotely monitor and control a network of traffic light controllers located at different sites. The system is based on utilizing Embedded Web Servers (EWS) technology to design a web-based TMS. The EWS located at each intersection uses IP technology for communicating remotely with a Central Traffic Management Unit (CTMU) located at the traffic department authority. Friendly GUI software installed at the CTMU will be able to monitor the sequence of operation of the traffic lights and the presence of traffic at each intersection as well as remotely controlling the operation of the signals. The system has been validated by constructing a prototype that resembles the real application.
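The controller logic such a web-based TMS exposes can be sketched as a small state machine that the central unit queries and overrides. The phase names, timings, and the override mechanism below are illustrative assumptions; in the real system this state would be served over IP by the embedded web server at each intersection.

```python
# Sketch of per-intersection controller state a web-based TMS could expose:
# the signal cycles through phases, and the central management unit (CTMU)
# can query the state or force an override. Phases are invented examples.

class TrafficLightController:
    PHASES = ["green", "yellow", "red"]

    def __init__(self):
        self.index = 0
        self.remote_override = None  # set remotely by the CTMU

    def state(self):
        return self.remote_override or self.PHASES[self.index]

    def tick(self):
        """Advance to the next phase unless the CTMU has forced a state."""
        if self.remote_override is None:
            self.index = (self.index + 1) % len(self.PHASES)

ctl = TrafficLightController()
ctl.tick()
print(ctl.state())           # → yellow
ctl.remote_override = "red"  # remote command arriving over the web interface
print(ctl.state())           # → red
```

Wrapping `state()` and the override in HTTP GET/POST handlers on an embedded web server would give the remote monitoring and control described above.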

Plan Sésame (PS) is a user fee exemption policy launched in 2006 to provide free access to health services to Senegalese citizens aged 60 and over. Analysis of a large household survey evaluating PS echoes findings of other studies showing that user fee removal can be highly inequitable. 34 semi-structured interviews and 19 focus group discussions with people aged 60 and over were conducted in four regions in Senegal (Dakar, Diourbel, Matam and Tambacounda) over a period of six months during 2012. They were analysed to identify underlying causes of exclusion from/inclusion in PS and triangulated with the household survey. The results point to three steps at which exclusion occurs: (i) not being informed about PS; (ii) not perceiving a need to use health services under PS; and (iii) inability to access health services under PS, despite having the information and perceived need. We identify lay explanations for exclusion at these different steps. Some lay explanations point to social exclusion, defined as unequal power relations. For example, poor access to PS was seen to be caused by corruption, patronage, poverty, lack of social support, internalised discrimination and adverse incorporation. Other lay explanations do not point to social exclusion, for example: poor implementation; inadequate funding; high population demand; incompetent bureaucracy; and PS as a favour or moral obligation to friends or family. Within a critical realist paradigm, we interpret these lay explanations as empirical evidence for the presence of the following hidden underlying causal mechanisms: lacking capabilities; mobilisation of institutional bias; and social closure. However, social constructionist perspectives lead us to critique this paradigm by drawing attention to contested health, wellbeing and corruption discourses. These differences in interpretation lead to subsequent differential policy recommendations. This demonstrates the need for the adoption of a "multi

The U.S. Global Precipitation Measurement mission (GPM) team has developed the Integrated Multi-satellitE Retrievals for GPM (IMERG) algorithm to take advantage of the international constellation of precipitation-relevant satellites and the Global Precipitation Climatology Centre surface precipitation gauge analysis. The goal is to provide a long record of homogeneous, high-resolution quasi-global estimates of precipitation. While expert scientific researchers are major users of the IMERG products, it is clear that many other user communities and disciplines also desire access to the data for wide-ranging applications. Lessons learned during the Tropical Rainfall Measuring Mission, the predecessor to GPM, led to some basic design choices that provided the framework for supporting multiple user bases. For example, two near-real-time "runs" are computed, the Early and Late (currently 5 and 15 hours after observation time, respectively), then the Final Run about 3 months later. The datasets contain multiple fields that provide insight into the computation of the complete precipitation data field, as well as diagnostic (currently) estimates of the precipitation's phase. In parallel with this, the archive sites are working to provide the IMERG data in a variety of formats, and with subsetting and simple interactive analysis to make the data more easily available to non-expert users. The various options for accessing the data are summarized under the pmm.nasa.gov data access page. The talk will end by considering the feasibility of major user requests, including polar coverage, a simplified Data Quality Index, and reduced data latency for the Early Run. In brief, the first two are challenging, but under the team's control. The last requires significant action by some of the satellite data providers.

This paper studies the multi-target consensus pursuit problem for multi-agent systems. To solve it, a distributed multi-flocking method is designed based on partial information exchange; it is employed to realise multi-target pursuit with a uniform distribution of the number of pursuing agents per dynamic target. Combined with the proposed circle formation control strategy, agents can adaptively choose a target and form different circle formation groups, accomplishing multi-target pursuit. The speed states of the pursuing agents in each group converge to the same value. A Lyapunov approach is utilised to analyse the stability of the multi-agent system. In addition, a sufficient condition for achieving dynamic-target consensus pursuit is given and analysed. Finally, simulation results verify the effectiveness of the proposed approaches.
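The target-assignment step that balances pursuer numbers across targets can be sketched with a simple greedy rule. This is an invented simplification (positions, the least-crowded rule, and the circle-slot formula are toy stand-ins), not the paper's distributed algorithm.

```python
# Illustrative sketch: each agent joins the currently least-crowded target
# (ties broken by distance), giving a roughly uniform number of pursuers per
# target, then takes a slot on a circle around it. All values are toy data.
import math

def assign_and_form(agents, targets, radius=1.0):
    counts = {t: 0 for t in targets}
    slots = {}
    per_group = max(1, len(agents) // len(targets))
    for a in agents:
        t = min(targets, key=lambda tgt: (counts[tgt], math.dist(a, tgt)))
        k = counts[t]
        counts[t] += 1
        ang = 2 * math.pi * k / per_group          # evenly spaced circle slot
        slots[a] = (t[0] + radius * math.cos(ang), t[1] + radius * math.sin(ang))
    return counts, slots

agents = [(0, 0), (1, 0), (5, 5), (6, 5)]
targets = [(0, 1), (5, 6)]
counts, _ = assign_and_form(agents, targets)
print(counts)  # → {(0, 1): 2, (5, 6): 2}
```

In the paper's distributed setting each agent would make this choice from partial, local information rather than global counts, but the balancing effect is the same.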

Microprocessors were installed as auxiliary crate controllers (ACCs) in the CAMAC interface of control systems for various accelerators. The same ACC was also at the heart of a stand-alone system in the form of a mobile console. This was also used for local access to the control systems for tests and development work (Annual Report 1981, p. 80, Fig. 10).

A prototype of the COROT ground-based archive and access system is presented here. The system has been developed at the Laboratorio de Astrofisica Espacial y Fisica Fundamental (LAEFF) and is based on the experience gained there with the INES (IUE Newly Extracted System) Archive.

The ORIGEN-S code in the SCALE 6.0 nuclear analysis code suite is a well-validated tool to calculate the time-dependent concentrations of nuclides due to isotopic depletion, decay, and transmutation for many systems in a wide range of time scales. Application areas include nuclear reactor and spent fuel storage analyses, burnup credit evaluations, decay heat calculations, and environmental assessments. Although simple to use within the SCALE 6.0 code system, especially with the ORIGEN-ARP graphical user interface, it is generally complex to use as a component within an externally developed code suite because of its tight coupling within the infrastructure of the larger SCALE 6.0 system. The ORIGEN2 code, which has been widely integrated within other simulation suites, is no longer maintained by Oak Ridge National Laboratory (ORNL), has obsolete data, and has a relatively small validation database. Therefore, a modular version of the SCALE/ORIGEN-S code was developed to simplify its integration with other software packages to allow multi-physics nuclear code systems to easily incorporate the well-validated isotopic depletion, decay, and transmutation capability to perform realistic nuclear reactor and fuel simulations. SCALE/ORIGEN-S was extensively restructured to develop a modular version that allows direct access to the matrix solvers embedded in the code. Problem initialization and the solver were segregated to provide a simple application program interface and fewer input/output operations for the multi-physics nuclear code systems. Furthermore, new interfaces were implemented to access and modify the ORIGEN-S input variables and nuclear cross-section data through external drivers. Three example drivers were implemented, in the C, C++, and Fortran 90 programming languages, to demonstrate the modular use of the new capability. This modular version of SCALE/ORIGEN-S has been embedded within several multi-physics software development projects at ORNL, including
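The init/solve separation that makes the modular interface easy to embed can be sketched abstractly. The class and method names below are invented, and the real API is exposed to C, C++, and Fortran drivers rather than Python; the single-nuclide decay update stands in for the full transmutation matrix solve.

```python
# Hedged sketch of the driver pattern a modular depletion interface enables:
# initialize once, then call the solver repeatedly inside a multi-physics
# loop with no file I/O. Names and the decay-only model are invented.
import math

class DepletionSolver:
    def __init__(self, decay_const):
        self.lam = decay_const  # per-nuclide decay constants (1/s)

    def solve(self, n0, dt):
        """Advance concentrations over dt using pure decay,
        N(t+dt) = N(t)*exp(-lambda*dt), standing in for the matrix solve."""
        return [n * math.exp(-l * dt) for n, l in zip(n0, self.lam)]

solver = DepletionSolver(decay_const=[0.0, 1e-3])  # stable + decaying nuclide
n = [1.0, 1.0]
for _ in range(3):          # e.g. three coupled multi-physics time steps
    n = solver.solve(n, dt=100.0)
print([round(x, 4) for x in n])  # → [1.0, 0.7408]
```

The point of the restructuring described above is exactly this shape: an external driver owns the time loop and data, and calls the depletion solver as a library function.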

The LHC Access Safety System has introduced a number of new concepts into the domain of personnel protection at CERN. These can be grouped into several categories: organisational, architectural and concerning the end-user experience. By anchoring the project on the solid foundations of the IEC 61508/61511 methodology, the CERN team and its contractors managed to design, develop, test and commission a SIL3 safety system on time. The system uses a successful combination of the latest Siemens redundant safety programmable logic controllers with a traditional hard-wired relay logic loop. The external envelope barriers used in the LHC include personnel and material access devices, which are interlocked door-booths introducing increased automation of individual access control, thus removing the strain from the operators. These devices ensure the inviolability of the controlled zones by users not holding the required credentials. To this end they are equipped with personnel presence detectors, and the access control includes a state-of-the-art biometry check. Building on the LHC experience, new projects targeting the refurbishment of the existing access safety infrastructure in the injector chain have started. This paper summarises the new concepts introduced in the LHC access control and safety systems, discusses the experience gained and outlines the main guiding principles for renewing the personnel protection systems in the LHC injector chain in a homogeneous manner. (authors)

The multi-pulse frequency-shifted technique uses mutually orthogonal short-duration pulses to transmit and receive information in a UWB multiuser communication system. The multiuser system uses the same pulse shape with different frequencies for the reference and data of each user. Different users have different pulse shapes (mutually orthogonal to each other) and different transmit and reference frequencies. At the receiver, the reference pulse is frequency-shifted to match the data pulse, and a correlation scheme followed by a hard-decision block detects the data.
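The detection step at the receiver can be illustrated with a toy baseband model: shift the reference to the data frequency, correlate, and take the sign as the bit decision. Pulse shapes, frequencies, and durations below are invented toy values, not the system's actual parameters.

```python
# Toy illustration of transmitted-reference detection: the reference pulse is
# frequency-shifted to the data frequency, correlated with the received data
# pulse, and the correlation sign gives the bit. All parameters are invented.
import math

N = 200  # samples per pulse

def pulse(freq, phase=0.0):
    return [math.sin(2 * math.pi * freq * n / N + phase) for n in range(N)]

def detect(bit):
    data = pulse(8.0, phase=0.0 if bit else math.pi)  # BPSK data at f_data
    shifted_ref = pulse(8.0)                          # reference shifted from
                                                      # f_ref = 5.0 to f_data
    corr = sum(s * d for s, d in zip(shifted_ref, data))
    return 1 if corr > 0 else 0                       # hard decision

print([detect(b) for b in (1, 0, 1)])  # → [1, 0, 1]
```

With noise added, the correlation sum degrades gracefully, which is why a hard-decision threshold on the correlator output is the natural detector here.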

The Automatic Identification System (AIS) is a maritime equipment to allow an efficient exchange of the navigational data between ships and between ships and shore stations. It utilizes a channel access algorithm which can quickly resolve conflicts without any intervention from control stations. In this paper, a design of channel access algorithm for the AIS is presented. The input/output relationship of each access algorithm module is defined by drawing the state transition diagram, dataflow diagram and flowchart based on the technical standard, ITU-R M.1371. In order to verify the designed channel access algorithm, the simulator was developed using the C/C++ programming language. The results show that the proposed channel access algorithm can properly allocate transmission slots and meet the operational performance requirements specified by the technical standard.
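The core slot-selection idea in such a self-organizing channel access scheme can be sketched simply: pick a random free slot inside a selection interval around the nominal slot. The frame size matches the AIS channel, but the interval width and occupancy map below are invented toy values, and the real rules (including slot timeout and reuse) are specified in ITU-R M.1371.

```python
# Simplified sketch of AIS-style slot selection: a station chooses its next
# transmission slot at random from the free slots inside its selection
# interval around the nominal slot. Interval width and occupancy are toy data.
import random

random.seed(42)
FRAME = 2250  # slots per minute on an AIS channel

def select_slot(nominal, occupied, si_width=150):
    half = si_width // 2
    candidates = [(nominal + off) % FRAME for off in range(-half, half + 1)]
    free = [s for s in candidates if s not in occupied]
    return random.choice(free) if free else None

occupied = {100, 101, 102}  # slots already claimed by other stations
slot = select_slot(nominal=100, occupied=occupied)
print(slot not in occupied, 25 <= slot <= 175)  # → True True
```

Randomizing within the interval is what lets stations resolve conflicts autonomously, without any intervention from shore control stations.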

A concept study was undertaken to evaluate potential multi-megawatt power sources for nuclear electric propulsion. The nominal electric power requirement was set at 15 MWe with an assumed mission profile of 120 days at full power, 60 days in hot standby, and another 120 days of full power, repeated several times for 7 years of service. Two configurations examined were (1) a gas-cooled reactor based on the NERVA Derivative design, operating a closed-cycle Brayton power conversion system; and (2) a molten metal-cooled reactor based on SP-100 technology, driving a boiling potassium Rankine power conversion system. This study considered the relative merits of these two systems, seeking to optimize the specific mass. Conclusions were that either concept appeared capable of reaching the specific mass goal of 3-5 kg/kWe estimated to be needed for this class of mission, though neither could be realized without substantial development in reactor fuels technology, thermal radiator mass and volume efficiency, and power conversion and distribution electronics and systems capable of operating at high temperatures. The gas-Brayton system showed a specific mass advantage (3.17 vs 6.43 kg/kWe for the baseline cases) under the set of assumptions used, and eliminated the need to deal with two-phase working-fluid flows in the microgravity environment of space.

The complexity of the ATLAS experiment motivated the deployment of an integrated Access Control System in order to guarantee safe and optimal access for a large number of users to the various software and hardware resources. Such an integrated system was foreseen since the design of the infrastructure and is now central to the operations model. In order to cope with the ever growing needs of restricting access to all resources used within the experiment, the Roles Based Access Control (RBAC) previously developed has been extended and improved. The paper starts with a short presentation of the RBAC design, implementation and the changes made to the system to allow the management and usage of roles to control access to the vast and diverse set of resources. The paper continues with a detailed description of the integration across all areas of the system: local Linux and Windows nodes in the ATLAS Control Network (ATCN), the Linux application gateways offering remote access inside ATCN, the Windows Terminal Serv...

The ISAIA project was originally proposed in 1999 as a successor to the informal AstroBrowse project. AstroBrowse, which provided a data location service for astronomical archives and catalogs, was a first step toward data system integration and interoperability. The goals of ISAIA were ambitious: '...To develop an interdisciplinary data location and integration service for space science. Building upon existing data services and communications protocols, this service will allow users to transparently query hundreds or thousands of WWW-based resources (catalogs, data, computational resources, bibliographic references, etc.) from a single interface. The service will collect responses from various resources and integrate them in a seamless fashion for display and manipulation by the user.' Funding was approved only for a one-year pilot study, a decision that in retrospect was wise given the rapid changes in information technology in the past few years and the emergence of the Virtual Observatory initiatives in the US and worldwide. Indeed, the ISAIA pilot study was influential in shaping the science goals, system design, metadata standards, and technology choices for the virtual observatory. The ISAIA pilot project also helped to cement working relationships among the NASA data centers, US ground-based observatories, and international data centers. The ISAIA project was formed as a collaborative effort between thirteen institutions that provided data to astronomers, space physicists, and planetary scientists. Among the fruits we ultimately hoped would come from this project was a central site on the Web that any space scientist could use to efficiently locate existing data relevant to a particular scientific question. Furthermore, we hoped that the needed technology would be general enough that smaller, more-focused communities within space science could use the same technologies and standards to provide more specialized services. A major challenge to searching

Command, Control, Communications, Computers, Intelligence, Surveillance and Reconnaissance (C4ISR) tactical battlefield systems must provide the right information and resources to the right individuals at the right time. At the same time, the C4ISR system must enforce access controls to prevent the wrong individuals from obtaining sensitive information, or consuming scarce resources. Because lives, missions and property depend upon them, these access control mechanisms must be effective, reliable, efficient and flexible. The mechanisms employed must suit the nature of the items that are to be protected, as well as the varieties of access policies that must be enforced, and the types of access that will be made to these items. Some access control technologies are inherently centralized, while others are suitable for distributed implementation. The C4ISR architect must select from among the available technologies a combination of mechanisms that eases the burden of policy administration, but is inherently survivable, accurate, resource efficient, and which provides low latency. This paper explores various alternative access enforcement mechanisms, and assesses their effectiveness in managing policy-driven access control within the battlespace.

Quality situation assessment and decision making require access to multiple sources of data and information. Insufficient accessibility to data exists for many large corporations and Government agencies. By utilizing current advances in computer technology, today's situation analysts have a wealth of information at their disposal. There are many potential solutions to the information accessibility problem using today's technology. The United States Department of Energy (US-DOE) faced this problem when dealing with one class of problem in the US. The result of their efforts has been the creation of the Tank Waste Information Network System -- TWINS. The TWINS solution combines many technologies to address problems in several areas such as User Interfaces, Transparent Access to Multiple Data Sources, and Integrated Data Access. Data related to the complex is currently distributed throughout several US-DOE installations. Over time, each installation has adopted its own set of standards for information management. Heterogeneous hardware and software platforms exist both across the complex and within a single installation. Standards for information management vary between US-DOE mission areas within installations. These factors contribute to the complexity of accessing information in a manner that enhances the performance and decision making process of the analysts. This paper presents one approach taken by the DOE to resolve the problem of distributed, heterogeneous, multi-media information management for the HLW Tank complex. The information system architecture developed for the DOE by the TWINS effort is one that is adaptable to other problem domains and uses

This paper proposes an efficient medium access control (MAC) protocol based on multi-frequency time division multiple access (MF-TDMA) for geostationary satellite systems deploying multiple spot-beams and onboard processing, which uses random reservation access with movable boundaries to dynamically request transmission slots and can carry different types of traffic. The simulation results show that the designed MAC protocol can achieve high bandwidth utilization while providing the required quality of service (QoS) for each class of service.
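The movable-boundary idea — shrinking or growing the random-access region of each frame as reservation demand changes — can be sketched as follows. This is an illustrative toy model, not the protocol from the paper; the class name, slot counts, and adjustment rule are all assumptions.

```python
class MfTdmaFrame:
    """Toy MF-TDMA frame allocator with a movable boundary between
    random-access request slots and reserved data slots (illustrative only)."""

    def __init__(self, total_slots=100, min_request_slots=5):
        self.total_slots = total_slots
        self.min_request_slots = min_request_slots
        self.boundary = 20  # slots [0, boundary) are random-access request slots

    def adjust_boundary(self, pending_reservations):
        # Give reserved traffic more slots under load, while always
        # keeping a minimum random-access region for new requests.
        data_slots_needed = min(pending_reservations,
                                self.total_slots - self.min_request_slots)
        self.boundary = self.total_slots - data_slots_needed
        return self.boundary

    def allocate(self, pending_reservations):
        self.adjust_boundary(pending_reservations)
        data_slots = list(range(self.boundary, self.total_slots))
        return data_slots[:pending_reservations]

frame = MfTdmaFrame()
granted = frame.allocate(pending_reservations=30)
print(len(granted), frame.boundary)  # 30 slots granted; boundary moved to 70
```

With light load the boundary moves back up, so idle capacity returns to the random-access region rather than sitting unused in reservations.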

The LAMPF control system data access software offers considerable power and flexibility to application programs through symbolic device naming and an emphasis on hardware independence. This paper discusses optimizations aimed at improving the performance of the data access software while retaining these capabilities. The only aspects of the optimizations visible to the application programs are "vector devices" and "aggregate devices." A vector device accesses a set of hardware-related data items through a single device name. Aggregate devices allow run-time optimization of references to groups of unrelated devices. Optimizations not visible at the application level include careful handling of network message traffic, the sharing of global resources, and storage allocation.
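The benefit of aggregate devices — one network round trip for a whole group of unrelated devices instead of one message per device — can be illustrated with a small sketch. The API, class, and device names here are hypothetical, not the actual LAMPF software.

```python
class DataAccess:
    """Toy model of symbolic device access: an 'aggregate device' groups
    unrelated devices so a single network round trip serves all of them."""

    def __init__(self, backend):
        self.backend = backend   # maps symbolic device name -> value
        self.round_trips = 0     # count of network messages issued

    def read(self, name):
        self.round_trips += 1    # one message per individual device
        return self.backend[name]

    def read_aggregate(self, names):
        self.round_trips += 1    # one message for the whole group
        return {n: self.backend[n] for n in names}

hw = {"QUAD01:FIELD": 1.2, "BPM07:X": -0.3, "VAC12:PRESS": 2e-8}
da = DataAccess(hw)

for n in hw:                              # naive: three round trips
    da.read(n)
batch = da.read_aggregate(list(hw))       # aggregate: one round trip
print(da.round_trips)                     # 4 total: 3 individual + 1 aggregate
```

The same values are returned either way; only the message count changes, which is exactly the kind of saving that matters over a control-system network.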

The invention is a multi-sensor radiation detection system including a self-powered detector and an ion or fission chamber, preferably joined as a unitary structure, for removable insertion into a nuclear reactor. The detector and chamber are connected electrically in parallel, requiring but two conductors extending out of the reactor to external electrical circuitry which includes a load impedance, a voltage source, and switch means. The switch means are employed to alternately connect the detector and chamber either with the load impedance or with the load impedance and the voltage source. In the former orientation, current through the load impedance indicates flux intensity at the self-powered detector and in the latter orientation, the current indicates flux intensity at the detector and fission chamber, though almost all of the current is contributed by the fission chamber. (auth)

The Magnetospheric MultiScale (MMS) mission is an ambitious NASA space science mission in which 4 spacecraft are flown in tight formation about a highly elliptical orbit. Each spacecraft has multiple instruments that measure particle and field compositions in the Earth's magnetosphere. By controlling the members' relative motion, MMS can distinguish temporal and spatial fluctuations in a way that a single spacecraft cannot. To achieve this control, 2 sets of four maneuvers, distributed evenly across the spacecraft, must be performed approximately every 14 days. Performing a single maneuver on an individual spacecraft is usually labor intensive, and the complexity clearly increases with four. As a result, the MMS flight dynamics team turned to the System Manager to put routine or error-prone activities under machine control, freeing the analysts for activities that require human judgment. The System Manager is an expert system that is capable of handling operations activities associated with performing MMS maneuvers. As an expert system, it can work off a known schedule, launching jobs based on a one-time occurrence or on a set reoccurring schedule. It is also able to detect situational changes and use event-driven programming to change schedules, adapt activities, or call for help.

Currently, agents are the focus of intense research in many sub-fields of computer science and artificial intelligence, and are being used in an increasingly wide variety of applications. Many important computing applications such as planning, process control, communication networks and concurrent systems will benefit from using a multi-agent system approach. A multi-agent system is a structure given by an environment together with a set of artificial agents capable of acting on this environment. Multi-agent models are oriented towards interactions, collaborative phenomena, and autonomy. This article presents the applications of multi-agent technology to power systems.

The numerous reforms to the Convention system of the past two decades have unquestionably had an effect on applicants’ means to access justice in the system. It is, however, open to question how these changes should be evaluated: with reference to the individual right to petition, or with reference

Based on an evaluation of the current commercial Radiation Worker Access Control Software Systems, Baltimore Gas and Electric Company has elected to design and develop a site-specific access control and accountability system for the Calvert Cliffs Nuclear Power Plant. The vendor-provided systems allow for radiation worker access control based on training and external exposure records and authorizations. These systems do not afford internal exposure control until after bioassay measurements or maximum permissible concentration-hours are tabulated. The vendor-provided systems allow for data trending for ALARA purposes, but each software package must be modified to meet site-specific requirements. Unlike the commercial systems, the Calvert Cliffs Radiological Controls and Accountability System (RCAS) will provide radiation worker exposure control, both internal and external. The RCAS is designed to fulfill these requirements by integrating the existing Radiation Safety, Dosimetry, and Training databases with a comprehensive radiological surveillance program. Prior to each worker's entry into the Radiological Control Area, his training and qualifications, radiation exposure history and authorizations will be compared with administrative controls, such as radiation work permits and respiratory protection requirements, and with the radiological conditions in the work area. The RCAS, a computer-based applied health physics access control system, is described as it is presently configured for development. The mechanisms for enhancing worker internal and external exposure controls are discussed. Proposed data application to both the Calvert Cliffs ALARA and outage planning programs is included

California Community Colleges, Sacramento. High-Tech Center for the Disabled.

This document provides information on the integration of assistive computer technologies and library automation systems at California Community Colleges in order to ensure access for students with disabilities. Topics covered include planning, upgrading, purchasing, implementing and using these technologies with library systems. As information…

This document presents the Assessment of Deafblind Access to Manual Language Systems (ADAMLS), a resource for educational teams who are responsible for developing appropriate adaptations and strategies for children who are deafblind who are candidates for learning manual language systems. The assessment tool should be used for all children with a…

The architecture design and realization of a data acquisition system for multi-channel CT is described. The article introduces the conversion of analog signals to digital signals, data caching, and data transmission. This data acquisition system can be widely used in systems that require multi-channel, weak-current signal detection. (authors)

In July 2012, protestors cut through security fences and gained access to the Y-12 National Security Complex, which was believed to have a highly reliable, multi-layered security system. This report documents the results of a Laboratory Directed Research and Development (LDRD) project that created a consistent, robust mathematical framework using complex systems analysis algorithms and techniques to better understand the emergent behavior, vulnerabilities and resiliency of multi-layered security systems subject to budget constraints and competing security priorities. Because there are several dimensions to security system performance and a range of attacks that might occur, the framework is multi-objective, allowing a performance frontier to be estimated. This research explicitly uses probability of intruder interruption given detection (PI) as the primary resilience metric. We demonstrate the utility of this framework with both notional and real-world examples of Physical Protection Systems (PPSs) and validate it using a well-established force-on-force simulation tool, Umbra.
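A multi-objective performance frontier of the kind described can be extracted from candidate designs with a simple non-dominance filter over (cost, PI) pairs. This is a generic sketch of the idea, not the report's algorithm; the design names, costs, and PI values are invented for illustration.

```python
def pareto_frontier(designs):
    """Return non-dominated (name, cost, PI) points: lower cost and higher
    probability of intruder interruption (PI) are both preferred."""
    frontier = []
    for name, cost, pi in designs:
        # A design is dominated if some other design is at least as cheap
        # AND at least as effective (and differs in at least one objective).
        dominated = any(c2 <= cost and p2 >= pi and (c2, p2) != (cost, pi)
                        for _, c2, p2 in designs)
        if not dominated:
            frontier.append((name, cost, pi))
    return frontier

designs = [
    ("fence+cameras",        1.0, 0.60),
    ("fence+cameras+guards", 2.5, 0.85),
    ("cameras only",         0.8, 0.40),
    ("gold plated",          5.0, 0.86),
    ("bad mix",              2.6, 0.70),   # dominated by fence+cameras+guards
]
print([d[0] for d in pareto_frontier(designs)])
# ['fence+cameras', 'fence+cameras+guards', 'cameras only', 'gold plated']
```

Under a budget constraint, a decision maker would then pick the frontier point with the highest PI whose cost is still affordable.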

NILE is a multi-disciplinary project building a distributed computing environment for HEP. It provides wide-area, fault-tolerant, integrated access to processing and data resources for collaborators of the CLEO experiment, though the goals and principles are applicable to many domains. NILE has three main objectives: a realistic distributed system architecture design, the design of a robust data model, and a Fast-Track implementation providing a prototype design environment which will also be used by CLEO physicists. This paper focuses on the software and wide-area system architecture design and the computing issues involved in making NILE services highly-available. (author)

Highlights: • Database design needs to be optimized and highly efficient for real-time operation. • It requires a many-to-many mapping between the Employee table and the Doors table. • This mapping typically contains thousands of records and redundant data. • The proposed novel database design reduces the redundancy and provides abstraction. • This design is incorporated with the access control system developed in-house. - Abstract: A Radio Frequency IDentification (RFID) cum biometric based two-level Access Control System (ACS) was designed and developed for providing access to vital areas of nuclear facilities. The system has both hardware [access controller] and software components [server application, the database and the web client software]. The proposed database design enables grouping of the employees based on the hierarchy of the organization and grouping of the doors based on Access Zones (AZ). This design also illustrates the mapping between the Employee Groups (EG) and the AZ. By following this approach in database design, a higher-level view can be presented to the system administrator, abstracting the inner details of the individual entities and doors. This paper describes the novel approach carried out in designing the database of the ACS.
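The core of the grouped design — replacing a huge employee-by-door mapping with one small Employee Group to Access Zone table — can be sketched with an in-memory database. Table, column, and sample names are illustrative assumptions, not the actual ACS schema.

```python
import sqlite3

# Minimal sketch: employees belong to groups (EG), doors belong to access
# zones (AZ), and a small eg_az table replaces a many-to-many employee/door map.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE employee (id INTEGER PRIMARY KEY, name TEXT, eg_id INTEGER);
CREATE TABLE door     (id INTEGER PRIMARY KEY, label TEXT, az_id INTEGER);
CREATE TABLE eg_az    (eg_id INTEGER, az_id INTEGER);  -- group-level mapping
""")
db.executemany("INSERT INTO employee VALUES (?,?,?)",
               [(1, "Asha", 10), (2, "Ravi", 20)])
db.executemany("INSERT INTO door VALUES (?,?,?)",
               [(1, "Lab-A", 100), (2, "Vault", 200)])
db.executemany("INSERT INTO eg_az VALUES (?,?)",
               [(10, 100), (20, 100), (20, 200)])  # only EG 20 reaches the vault

def has_access(emp_id, door_id):
    # Access is granted if the employee's group maps to the door's zone.
    row = db.execute("""
        SELECT 1 FROM employee e
        JOIN eg_az m ON m.eg_id = e.eg_id
        JOIN door d  ON d.az_id = m.az_id
        WHERE e.id = ? AND d.id = ?""", (emp_id, door_id)).fetchone()
    return row is not None

print(has_access(1, 2), has_access(2, 2))  # False True
```

Granting a whole group access to a new zone is then a single row insert in `eg_az`, rather than one row per employee per door.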

The LHC Access Safety System has introduced a number of new concepts into the domain of personnel protection at CERN. These can be grouped into several categories: organisational, architectural and concerning the end-user experience. By anchoring the project on the solid foundations of the IEC 61508/61511 methodology, the CERN team and its contractors managed to design, develop, test and commission on time a SIL3 safety system. The system uses a successful combination of the latest Siemens redundant safety programmable logic controllers with a traditional relay logic hardwired loop. The external envelope barriers used in the LHC include personnel and material access devices, which are interlocked door-booths introducing increased automation of individual access control, thus removing the strain from the operators. These devices ensure the inviolability of the controlled zones by users not holding the required credentials. To this end they are equipped with personnel presence detectors and th...

Cache-enabled base station (BS) densification, denoted as a fog radio access network (F-RAN), is foreseen as a key component of 5G cellular networks. F-RAN enables storing popular files at the network edge (i.e., BS caches), which empowers local communication and alleviates traffic congestion at the core/backhaul network. The hitting probability, which is the probability of successfully serving a popular file request from the network edge, is a fundamental key performance indicator (KPI) for F-RAN. This paper develops a scheduling-aware mathematical framework, based on stochastic geometry, to characterize the hitting probability of F-RAN in a multi-channel environment. To this end, we assess and compare the performance of two caching distribution schemes, namely, uniform caching and Zipf caching. The numerical results show that the commonly used single-channel environment leads to a pessimistic assessment of the hitting probability of F-RAN. Furthermore, the numerical results manifest the superiority of the Zipf caching scheme and quantify the hitting probability gains in terms of the number of channels and cache size.
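The advantage of popularity-aware caching over uniform caching can be seen in a stripped-down model: requests follow a Zipf popularity law, and the cache hit probability is just the popularity mass of the cached files. This sketch is far simpler than the paper's stochastic-geometry framework (no BS geometry, no channels); the library size, cache size, and Zipf exponent are arbitrary assumptions.

```python
import random

def zipf_pmf(n, alpha=1.0):
    """Zipf popularity distribution over n files (rank 1 is most popular)."""
    w = [1.0 / (r ** alpha) for r in range(1, n + 1)]
    s = sum(w)
    return [x / s for x in w]

def hit_prob(cache, pmf):
    # Probability that a request, drawn from the popularity pmf,
    # finds its file in the given cache.
    return sum(pmf[f] for f in cache)

N, C = 100, 10                                    # library size, cache size
pmf = zipf_pmf(N)
zipf_cache = set(range(C))                        # popularity-aware: top-C files
uniform_cache = set(random.sample(range(N), C))   # popularity-blind

p_zipf = hit_prob(zipf_cache, pmf)
p_unif = hit_prob(uniform_cache, pmf)
print(round(p_zipf, 3))   # ~0.565: the top 10 of 100 files carry over half the requests
```

Because the top-C files maximize the cached popularity mass, the popularity-aware cache can never do worse than a uniformly random one in this model, mirroring the superiority of Zipf caching reported in the paper.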

The article focuses on multi-agent systems (MAS) and domains that can benefit from multi-agent technology. In the last few years, the agent based modeling (ABM) community has developed several practical agent based modeling toolkits that enable individuals to develop agent-based applications. The comparison of agent-based modeling toolkits is given. Multi-agent systems are designed to handle changing and dynamic business processes. Any organization with complex and distributed business pro...

The electrocardiogram (ECG) is one of the most important non-invasive tools for cardiac disease diagnosis. Taking advantage of the developed telecommunication infrastructure, several approaches that address the development of telemetry cardiac devices were introduced recently. Telemetry ECG devices allow easy and fast ECG monitoring of patients with suspected cardiac issues. Choosing the right device with the desired working mode, signal quality, and device cost are still the main obstacles to massive usage of these devices. In this paper, we introduce the design, implementation, and validation of a multi-purpose telemetry system for recording, transmission, and interpretation of ECG signals in different recording modes. The system consists of an ECG device, a cloud-based analysis pipeline, and accompanying mobile applications for physicians and patients. The proposed ECG device's mechanical design allows laypersons to easily record post-event short-term ECG signals, using dry electrodes without any preparation. Moreover, patients can use the device to record long-term signals in loop and holter modes, using wet electrodes. In order to overcome the problem of signal quality fluctuation due to using different electrode types and different placements on the subject's chest, a customized ECG signal processing and interpretation pipeline is presented for each working mode. We present the evaluation of the novel short-term recorder design. Recording of an ECG signal was performed for 391 patients using a golden-standard 12-lead ECG and the proposed patient-activated short-term post-event recorder. In the validation phase, a sample of validation signals followed a peer-review process wherein two experts annotated the signals in terms of signal acceptability for diagnosis. We found that 96% of signals allow detecting arrhythmia and other abnormal signal changes. Additionally, we compared and presented the correlation coefficient and the automatic QRS delineation results

We develop a Lattice Boltzmann code for computational fluid-dynamics and optimize it for massively parallel systems based on multi-core processors. Our code describes 2D multi-phase compressible flows. We analyze the performance bottlenecks that we find as we gradually expose a larger fraction of

This paper reviews the most important results on divergent multi-echelon systems. In particular, we concentrate on the interactions between the elements that constitute such a multi-echelon system, in order to determine several service measures (e.g. external customer service level and inventory

Indira Gandhi Centre for Atomic Research houses many laboratories which handle radioactive materials and classified materials. Protection and accounting of men, material and critical facilities are important aspects of nuclear security. An Access Control System (ACS) is used to enhance the protective measures against an elevated threat environment. The access control system hardware consists of hand geometry readers, RFID readers, controllers, electromagnetic door locks, turnstiles, fiber cable laying and termination, etc. The Access Control System controls and monitors the people accessing the secured facilities, and generates events on: 1. Showing of an RFID card, 2. Rotation of a turnstile, 3. Download of valid card numbers, 4. Generation of alarms. Access control system turnstiles are located at the main entrance of a facility and at the entrances of inside laboratories, and door locks are fixed on secured facilities. Events are stored in an SQL Server database. From the events stored in the database, a novel technique is developed to extract events and list the persons in a particular facility, list all entry/exit events on a given day, and list the first-in and last-out entries. This paper discusses the complex multi-level GROUP BY queries and the software developed to extract events from the database, locate persons and generate reports. The software is developed as a web application in ASP.Net and the queries are written in SQL. Users can select the doors and the type of events and generate reports. Reports are generated using the master data stored about employees' RFID cards and the events data stored in tables. Four types of reports are generated: 1. Plant Emergency Report, 2. Locate User Report, 3. Entry - Exit Report, 4. First in Last out Report. To generate a plant emergency report for the whole plant, only events generated at the outer gates have to be considered. To generate a plant emergency report for an inside laboratory, events generated at the entrance gates have to be ignored. (author)
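A "First in - Last out" style report of the kind described reduces to a GROUP BY query over the events table, taking each card holder's earliest entry and latest exit for the day. The sketch below uses an in-memory SQLite database; the schema, door names, and card numbers are illustrative assumptions, not the actual ACS database.

```python
import sqlite3

# Toy events table: one row per card swipe, with direction IN/OUT.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE events
              (card_no TEXT, door TEXT, direction TEXT, ts TEXT)""")
db.executemany("INSERT INTO events VALUES (?,?,?,?)", [
    ("C001", "MainGate", "IN",  "2024-05-01 08:02"),
    ("C001", "Lab-3",    "IN",  "2024-05-01 09:15"),
    ("C001", "MainGate", "OUT", "2024-05-01 17:40"),
    ("C002", "MainGate", "IN",  "2024-05-01 08:45"),
    ("C002", "MainGate", "OUT", "2024-05-01 16:05"),
])

# First-in / last-out per card: restrict to the outer gate (as the plant
# emergency report does) and aggregate entries and exits separately.
rows = db.execute("""
    SELECT card_no,
           MIN(CASE WHEN direction='IN'  THEN ts END) AS first_in,
           MAX(CASE WHEN direction='OUT' THEN ts END) AS last_out
    FROM events
    WHERE door = 'MainGate' AND ts LIKE '2024-05-01%'
    GROUP BY card_no
    ORDER BY card_no""").fetchall()
for r in rows:
    print(r)
# ('C001', '2024-05-01 08:02', '2024-05-01 17:40')
# ('C002', '2024-05-01 08:45', '2024-05-01 16:05')
```

The `WHERE door = ...` filter mirrors the report rules in the abstract: the whole-plant emergency report considers only outer-gate events, while the inner-laboratory variant would filter on different doors.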

Advances in high-throughput technologies have enabled extensive generation of multi-level omics data. These data are crucial for systems biology research, though they are complex, heterogeneous, highly dynamic, incomplete and distributed among public databases. This leads to difficulties in data accessibility and often results in errors when data are merged and integrated from varied resources. Therefore, integration and management of systems biological data remain very challenging. To overcome this, we designed and developed a dedicated database system that addresses the vital issues in data management and thereby facilitates data integration, modeling and analysis in systems biology within a single database. In addition, a yeast data repository was implemented as an integrated database environment which is operated by the database system. Two applications were implemented to demonstrate the extensibility and utilization of the system. Both illustrate how the user can access the database via the web query function and implemented scripts. These scripts are specific to two sample cases: 1) detecting the pheromone pathway in protein interaction networks; and 2) finding metabolic reactions regulated by Snf1 kinase. In this study we present the design of a database system which offers an extensible environment to efficiently capture the majority of biological entities and relations encountered in systems biology. Critical functions and control processes were designed and implemented to ensure consistent, efficient, secure and reliable transactions. The two sample cases on the integrated yeast data clearly demonstrate the value of a single database environment for systems biology research.

With the rapid development of high-speed networks, such as the Ethernet Passive Optical Network (EPON), traffic patterns in access networks have evolved from traditional text-oriented services to mixed text-, voice- and video-based services, leading to the so-called "Triple Play". For supporting IPTV service over the EPON access network infrastructure, in this article we propose a novel IPTV program multiplex access system for EPON, which enables multiple IPTV program source servers to seamlessly connect to the IPTV service access port of the optical line terminal (OLT). There are two multiplex schemes for implementing the program multiplexing, namely a static multiplex scheme and a dynamic multiplex scheme. The static multiplex scheme multiplexes all the IPTV programs and forwards them to the OLT, regardless of the needs of end-users, while the dynamic multiplex scheme dynamically multiplexes and forwards IPTV programs according to what the end-users actually demand, so programs watched by no end-user are not multiplexed. Comparing these two schemes, reduced EPON traffic can be achieved by the dynamic multiplex scheme, especially when most end-users are watching the same few IPTV programs. Both schemes are implemented in our system, with their hardware and software designs described.
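The difference between the two schemes is essentially a set computation over end-user demands: the static scheme forwards everything, while the dynamic scheme forwards only the union of what is actually being watched. A minimal sketch, with invented channel and user names (the real system does this in hardware/software on the OLT path):

```python
def static_multiplex(all_programs, demands):
    """Static scheme: every program is multiplexed toward the OLT."""
    return set(all_programs)

def dynamic_multiplex(all_programs, demands):
    """Dynamic scheme: forward only programs at least one end-user watches."""
    watched = set()
    for user_programs in demands.values():
        watched.update(user_programs)
    return watched & set(all_programs)

programs = {f"ch{i}" for i in range(1, 101)}   # 100 IPTV channels available
demands = {                                    # most users watch the same few
    "user_a": {"ch1"}, "user_b": {"ch1", "ch2"}, "user_c": {"ch2"},
}
print(len(static_multiplex(programs, demands)),
      len(dynamic_multiplex(programs, demands)))   # 100 vs 2 streams on EPON
```

This also shows why the saving is largest when viewing is concentrated: three users watching two channels cost two streams, not three and not one hundred.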

This monograph presents new algorithms for formation control of multi-agent systems (MAS) based on principles of continuum mechanics. Beginning with an overview of traditional methods, the author then introduces an innovative new approach whereby agents of an MAS are considered as particles in a continuum evolving in ℝⁿ whose desired configuration is required to satisfy an admissible deformation function. The necessary theory and its validation on a mobile-agent-based swarm test bed are considered for two primary tasks: homogeneous transformation of the MAS and deployment of a random distribution of agents on a desired configuration. The framework for this model is based on homogeneous transformations for the evolution of an MAS under no inter-agent communication, local inter-agent communication, and intelligent perception by agents. Different communication protocols for MAS evolution, the robustness of tracking of a desired motion by an MAS evolving in ℝⁿ, and the effect of communication delays in an MAS...

The emergence of Machine-to-Machine (M2M) communication requires new Medium Access Control (MAC) schemes and physical (PHY) layer concepts to support a massive number of access requests. The concept of coded random access, introduced recently, greatly outperforms other random access methods. In this work, we combine coded random access with compressed sensing based multi-user detection (CS-MUD) on the PHY layer and show very promising results for the resulting protocol.
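The gain of coded random access over classical slotted ALOHA comes from sending multiple replicas per packet and iteratively cancelling decoded users' copies. The toy simulation below captures that decoding loop on the MAC layer only; it ignores the CS-MUD PHY component entirely, and all parameters (users, slots, replica count) are arbitrary assumptions.

```python
import random

def coded_aloha_frame(n_users, n_slots, replicas=2, rng=random):
    """One frame of coded (framed slotted) random access: every user transmits
    `replicas` copies of its packet in randomly chosen slots. The receiver
    repeatedly decodes 'singleton' slots (one undecoded user) and cancels that
    user's other copies -- successive interference cancellation (SIC)."""
    slots = [set() for _ in range(n_slots)]
    for u in range(n_users):
        for s in rng.sample(range(n_slots), replicas):
            slots[s].add(u)
    decoded, progress = set(), True
    while progress:
        progress = False
        for s in slots:
            live = s - decoded
            if len(live) == 1:          # singleton slot: decodable
                decoded |= live
                progress = True
    return len(decoded)

random.seed(1)
ok = coded_aloha_frame(n_users=50, n_slots=100)
print(ok, "of 50 users decoded")
```

Plain slotted ALOHA would only recover users whose single transmission happened to land in a collision-free slot; the SIC loop here keeps turning collided slots into singletons, which is where the throughput gain comes from.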

This paper reviews the main features of the new PS Personnel Protection System (PSPSS) as well as the main milestones for its deployment during the Long Shutdown of 2013-2014. Access conditions in the PS, SPS and LHC complexes during this period shall be described as well as the upgrades and improvements that are under preparation. (authors)

PubMed contains many articles in languages other than English, but it is difficult to find them using the English version of the Medical Subject Headings (MeSH) Thesaurus. The aim of this work is to propose a tool allowing access to a PubMed subset in one language, and to evaluate its performance. Translations of MeSH were enriched and gathered in the information system. PubMed subsets in the main European languages were also added to our database, using a dedicated parser. The CISMeF generic semantic search engine was evaluated on the response time for simple queries. MeSH descriptors are currently available in 11 languages in the information system. All 654,000 PubMed citations in French were integrated into the CISMeF database. None of the response times exceed the threshold defined for usability (2 seconds). It is now possible to freely access biomedical literature in French using a tool in French; health professionals and lay people with limited English proficiency may find it useful. It will be extended to several other European languages: German, Spanish, Norwegian and Portuguese.

The ARAC Client System allows users (such as emergency managers and first responders) with commonly available desktop and laptop computers to utilize the central ARAC system over the Internet or any other communications link using Internet protocols. Providing cost-effective fast access to the central ARAC system greatly expands the availability of the ARAC capability. The ARAC Client system consists of (1) local client applications running on the remote user's computer, and (2) "site servers" that provide secure access to selected central ARAC system capabilities and run on a scalable number of dedicated workstations residing at the central facility. The remote client applications allow users to describe a real or potential chem-bio event, electronically send this information to the central ARAC system, which performs model calculations, and quickly receive and visualize the resulting graphical products. The site servers will support simultaneous access to ARAC capabilities by multiple users. The ARAC Client system is based on object-oriented client/server and distributed computing technologies using CORBA and Java, and consists of a large number of interacting components

This report summarizes Department of Energy (DOE) efforts to investigate various container systems for handling, transporting, storing, and disposing of spent nuclear fuel (SNF) assemblies in the Civilian Radioactive Waste Management System (CRWMS). The primary goal of DOE's investigations was to select a container technology that could handle the vast majority of commercial SNF at a reasonable cost, while ensuring the safety of the public and protecting the environment. Several alternative cask and canister concepts were evaluated for SNF assembly packaging to determine the most suitable concept. Of these alternatives, the multi-purpose canister (MPC) system was determined to be the most suitable. Based on the results of these evaluations, the decision was made to proceed with design and certification of the MPC system. A decision to fabricate and deploy MPCs will be made after further studies and preparation of an environmental impact statement

Traditional stand-alone computer-assisted surgery (CAS) systems impede ubiquitous and simultaneous access by multiple users. With advances in computing and networking technologies, ubiquitous access to CAS systems becomes possible and promising. Based on our preliminary work, CASMIL, a stand-alone CAS server developed at Wayne State University, we propose a novel mobile CAS system, UbiCAS, which allows surgeons to retrieve, review and interpret multimodal medical images, and to perform some critical neurosurgical procedures on heterogeneous devices from anywhere at any time. Furthermore, various optimization techniques, including caching, prefetching, a pseudo-streaming model, and compression, are used to guarantee the QoS of the UbiCAS system. UbiCAS enables doctors at remote locations to actively participate in remote surgeries and to share patient information in real time before, during, and after the surgery.

The Multi-Modal Traveler Information System (MMTIS) project involves a large number of Intelligent Transportation System (ITS) related tasks. It involves research of all ITS initiatives in the Gary-Chicago-Milwaukee (GCM) Corridor which are currently...

A low-NOx emission combustor concept has been developed for NASA's Environmentally Responsible Aviation (ERA) program to meet N+2 emissions goals for a 70,000 lb thrust engine application. These goals include 75 percent reduction of LTO NOx from CAEP6 standards without increasing CO, UHC, or smoke from that of the current state of the art. An additional key factor in this work is to improve lean combustion stability over that of previous work performed on similar technology in the early 2000s. The purpose of this paper is to present the final report for the NASA contract. This work included the design, analysis, and test of a multi-point combustion system. All design work was based on the results of Computational Fluid Dynamics modeling, with the end results tested on a medium pressure combustion rig at the UC and a medium pressure combustion rig at GRC. The theories behind the designs, results of analysis, and experimental test data will be discussed in this report. The combustion system consists of five radially staged rows of injectors, where ten small-scale injectors are used in place of a single traditional nozzle. Major accomplishments of the current work include the design of a Multipoint Lean Direct Injection (MLDI) array and associated air blast and pilot fuel injectors, which is expected to meet or exceed the goal of a 75 percent reduction in LTO NOx from CAEP6 standards. This design incorporates a reduced number of injectors over previous multipoint designs, simplified and lightweight components, and a very compact combustor section. Additional outcomes of the program are validation that the design of these combustion systems can be aided by the use of Computational Fluid Dynamics to predict and reduce emissions. Furthermore, the staging of fuel through the individually controlled, radially staged injector rows successfully demonstrated improved low-power operability as well as improvements in emissions over previous multipoint designs. Additional comparison

BACKGROUND: Laparoscopic colorectal surgery results in less post-operative pain, faster recovery, shorter length of stay and reduced morbidity compared with open procedures. Less or minimally invasive techniques have been developed to further minimise surgical trauma and to decrease the size and number of incisions. This study describes the safety and feasibility of using an umbilical multi-instrument access (MIA) port (Olympus TriPort+) device with the placement of just one 12-mm suprapubic trocar in laparoscopic (double-port) abdominoperineal resections (APRs) in rectal cancer patients. PATIENTS AND METHODS: The study included 20 patients undergoing double-port APRs for rectal cancer between June 2011 and August 2013. Preoperative data were gathered in a prospective database, and post-operative data were collected retrospectively. RESULTS: The 20 patients (30% female) had a median age of 67 years (range 46-80 years), and their median body mass index (BMI) was 26 kg/m2 (range 20-31 kg/m2). An additional third trocar was placed in 2 patients. No laparoscopic procedures were converted to an open procedure. Median operating time was 195 min (range 115-306 min). A radical resection (R0 resection) was achieved in all patients, with a median of 14 lymph nodes harvested. Median length of stay was 8 days (range 5-43 days). CONCLUSION: Laparoscopic APR using a MIA trocar is a feasible and safe procedure. A MIA port might be of benefit as an extra option in the toolbox of the laparoscopic surgeon to further minimise surgical trauma. PMID:27279397

Sudden Unexpected Death in Epilepsy (SUDEP) is the leading mode of epilepsy-related death and is most common in patients with intractable, frequent, and continuing seizures. A statistically significant cohort of patients for SUDEP study requires meticulous, prospective follow-up of a large population that is at an elevated risk, best represented by the Epilepsy Monitoring Unit (EMU) patient population. Multiple EMUs need to collaborate and share data to build a larger cohort of potential SUDEP patients using a state-of-the-art informatics infrastructure. To address the challenges of data integration and data access from multiple EMUs, we developed the Multi-Modality Epilepsy Data Capture and Integration System (MEDCIS), which combines retrospective clinical free-text processing using NLP, prospective structured data capture using an ontology-driven interface, and interfaces for cohort search and signal visualization, all in a single integrated environment. A dedicated Epilepsy and Seizure Ontology (EpSO) has been used to streamline the user interfaces, enhance usability, and enable mappings across distributed databases so that federated queries can be executed. MEDCIS contained 936 patient data sets from the EMUs of University Hospitals Case Medical Center (UH CMC) in Cleveland and Northwestern Memorial Hospital (NMH) in Chicago. Patients from UH CMC and NMH were stored in different databases and then federated through MEDCIS using EpSO and our mapping module. More than 77 GB of multi-modal signal data were processed using the Cloudwave pipeline and made available for rendering through the web interface. About 74% of the 40 open clinical questions of interest were answerable accurately using the EpSO-driven VISual AGgregator and Explorer (VISAGE) interface. Questions not directly answerable were due either to their inherent computational complexity, the unavailability of primary information, or the scope of concept that has been formulated in the existing Ep
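The ontology-based mapping that enables federated queries across sites can be sketched roughly as follows. This is a toy Python illustration only: the site names echo the abstract, but all field names, mapping entries, and records are hypothetical, not MEDCIS's actual schema.

```python
# Hypothetical sketch: ontology-driven federation across two EMU databases.
# Site-local field names are rewritten into shared (EpSO-style) concept
# names so a single query can run against both databases.

SITE_MAPPINGS = {
    "uh_cmc": {"sz_type": "seizure_type", "dx": "epilepsy_diagnosis"},
    "nmh":    {"seizureClass": "seizure_type", "diagnosis": "epilepsy_diagnosis"},
}

def to_common(site, record):
    """Rewrite a site-local record into the shared concept vocabulary."""
    mapping = SITE_MAPPINGS[site]
    return {mapping.get(k, k): v for k, v in record.items()}

def federated_query(databases, concept, value):
    """Return (site, record) pairs whose mapped concept matches value."""
    hits = []
    for site, records in databases.items():
        for rec in records:
            common = to_common(site, rec)
            if common.get(concept) == value:
                hits.append((site, common))
    return hits

databases = {
    "uh_cmc": [{"sz_type": "focal", "dx": "yes"}],
    "nmh":    [{"seizureClass": "generalized", "diagnosis": "yes"},
               {"seizureClass": "focal", "diagnosis": "no"}],
}
print(federated_query(databases, "seizure_type", "focal"))  # hits at both sites
```

The point of the sketch is that the query is phrased once, against the shared vocabulary, and each site's mapping module bridges its local schema.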

Effective computational simulation procedures are described for modeling the inherent multi-disciplinary interactions for determining the true response of propulsion systems. Results are presented for propulsion system responses including multi-discipline coupling effects via (1) coupled multi-discipline tailoring, (2) an integrated system of multidisciplinary simulators, (3) coupled material-behavior/fabrication-process tailoring, (4) sensitivities using a probabilistic simulator, and (5) coupled materials/structures/fracture/probabilistic behavior simulator. The results show that the best designs can be determined if the analysis/tailoring methods account for the multi-disciplinary coupling effects. The coupling across disciplines can be used to develop an integrated interactive multi-discipline numerical propulsion system simulator.

Some time ago it was the lack of public access to medical research data that really stirred the issue and gave impetus to legislation and a new publishing model that puts taxpayer-funded medical research in the hands of those who fund it. In today's age, global climate change has become the biggest socio-economic challenge, and the same argument resonates: climate affects us all, and the publicly funded science quantifying it should be freely accessible to all stakeholders beyond academic research. Over the last few years the 'Open Access' movement to remove as far as possible subscription and other on-campus barriers to academic research has rapidly gathered pace, but despite significant progress, the climate system sciences are not among the leaders in providing full access to their publications and data. Beyond the ethical argument, there are proven and tangible benefits for the next generation of climate researchers in adapting the way their output is published. Through the means provided by 'open access', both data and ideas can gain more visibility, use and citations for the authors, and also result in a more rapid exchange of knowledge and ideas, and ultimately progress towards a sought solution. The presentation will aim to stimulate discussion and seek progress on the following questions: Should free access to climate research (and data) be mandatory? What are the career benefits of using 'open access' for young scientists? What means and methods should, or could, be incorporated into current European graduate training programmes in climate research, and what are possible ways forward?

Access to online information sources of aerospace, scientific, and engineering data, a mission focus for NASA's Scientific and Technical Information Program, has always been limited by factors such as telecommunications, query language syntax, lack of standardization in the information, and the lack of adequate tools to assist in searching. Today, the NASA STI Program's NASA Access Mechanism (NAM) prototype offers a solution to these problems by providing the user with a set of tools that provide a graphical interface to remote, heterogeneous, and distributed information in a manner adaptable to both casual and expert users. Additionally, the NAM provides access to many Internet-based services such as Electronic Mail, the Wide Area Information Servers system, Peer Locating tools, and electronic bulletin boards.

Legacy buffer cache management schemes for multimedia servers are grounded in the assumption that the application accesses the multimedia file sequentially. However, the user access pattern may not be sequential in some circumstances; for example, in a distance learning application the user may exploit the VCR-like functions (rewind and play) of the system and access particular segments of video repeatedly in the middle of sequential playback. Such a looping reference can cause a significant performance degradation of interval-based caching algorithms, and thus an appropriate buffer cache management scheme is required to deliver desirable performance even under a workload that exhibits looping reference behavior. We propose the Adaptive Buffer cache Management (ABM) scheme, which intelligently adapts to the file access characteristics. For each opened file, ABM applies either LRU replacement or interval-based caching depending on the Looping Reference Indicator, which indicates how strong the temporally localized access pattern is. According to our experiments, ABM exhibits a better buffer cache miss ratio than interval-based caching or LRU, especially when the workload exhibits not only sequential but also looping reference properties.
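The policy-switching idea can be sketched in a few lines of Python. This is a minimal illustration with assumed details: the paper's actual Looping Reference Indicator and threshold are not specified here, so a simple backward-jump ratio stands in for them.

```python
# Minimal sketch of the ABM idea (details assumed, not from the paper):
# per open file, estimate how often accesses jump back to an earlier
# offset (a looping reference), then choose the replacement policy.

LOOP_THRESHOLD = 0.3   # illustrative cutoff

class FilePolicyChooser:
    """Tracks one file's access pattern and picks LRU vs interval caching."""

    def __init__(self):
        self.prev = None
        self.backward = 0    # accesses that jumped back to an earlier offset
        self.total = 0

    def access(self, offset):
        if self.prev is not None:
            self.total += 1
            if offset < self.prev:       # rewind-and-replay behaviour
                self.backward += 1
        self.prev = offset

    def loop_indicator(self):
        return self.backward / self.total if self.total else 0.0

    def policy(self):
        # Strong looping means temporal locality, where LRU beats
        # interval-based caching; mostly sequential streams do not.
        return "LRU" if self.loop_indicator() > LOOP_THRESHOLD else "interval"

seq = FilePolicyChooser()
for off in range(10):                      # purely sequential playback
    seq.access(off)

loop = FilePolicyChooser()
for off in [0, 1, 2, 1, 2, 1, 2, 1, 2]:    # repeated rewind of one segment
    loop.access(off)

print(seq.policy(), loop.policy())         # → interval LRU
```

The sequential stream keeps interval-based caching, while the rewind-heavy stream crosses the threshold and switches to LRU, which is the adaptive behaviour the abstract describes.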

This study aims to design and test multi-side USV forms. The system excels at maneuvering in x-y-z coordinates. The disadvantage of a single-side USV is that it is very difficult to maneuver toward highly dynamic targets, while a multi-side system maneuvers easily in x-y-z coordinates. In addition to security and defense purposes, a multi-side system is also suited to maritime intelligence and surveillance. In this case, an electric ducted fan is combined with the multi-side system so that the vehicle can still operate even in an inverted condition. Multi-side USV experiments have been carried out with good results. The USV in this study was designed to use two propulsion units.

A mathematical model (scheme) for the multi-level comparison of economic systems, characterized by a system of indices, is worked out. In the model, indicators from peer review and from forecasting of the economic system under consideration can be used. The model can take into account uncertainty in the estimated parameter values or expert estimations. It uses a multi-criteria approach based on Pareto solutions.
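The Pareto-based multi-criteria selection can be illustrated with a short sketch. The systems and their criterion scores below are hypothetical, and all criteria are assumed to be maximized.

```python
# Illustrative sketch (names and scores assumed): keep only the
# Pareto-optimal economic systems under multiple criteria.

def dominates(a, b):
    """a dominates b if a is >= on all criteria and > on at least one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(systems):
    """systems: dict of name -> tuple of criterion scores (higher is better)."""
    return {n: s for n, s in systems.items()
            if not any(dominates(o, s) for m, o in systems.items() if m != n)}

systems = {"A": (3, 5), "B": (4, 4), "C": (2, 2), "D": (5, 1)}
print(sorted(pareto_front(systems)))   # → ['A', 'B', 'D']  (C is dominated)
```

No single "best" system exists here; A, B, and D are incomparable trade-offs, which is exactly why the model presents a Pareto set rather than one ranking.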

Water resources systems with reservoirs are expected to be sensitive to climate change. Assessment studies that analyze the impact of climate change on the performance of reservoirs can be divided into two groups: (1) studies that simulate operation under projected inflows with the current set of operational rules; because the rules are not adapted, the future performance of these reservoirs can be underestimated and the impact overestimated; and (2) studies that optimize the operational rules to best adapt the system to the projected conditions before assessing the impact. The latter allows future performance to be estimated more realistically, and adaptation strategies based on new operation rules are available if required. Multi-purpose reservoirs serve various, often conflicting functions. If all functions cannot be served simultaneously at a maximum level, an effective compromise between the multiple objectives of reservoir operation has to be provided. Yet under climate change the historically preferred compromise may no longer be the most suitable compromise in the future. Therefore a multi-objective climate change impact assessment approach for multi-purpose multi-reservoir systems is proposed in this study. Projected inflows are provided in a first step using a physically based rainfall-runoff model. In a second step, a time series model is applied to generate long-term inflow time series. Finally, the long-term inflow series are used as driving variables for a simulation-based multi-objective optimization of the reservoir system in order to derive optimal operation rules. As a result, the adapted Pareto-optimal set of diverse best-compromise solutions can be presented to the decision maker to assist in assessing climate change adaptation measures with respect to the future performance of the multi-purpose reservoir system. The approach is tested on a multi-purpose multi-reservoir system in a mountainous catchment in Germany. A

Jade is a new distributed file system that provides a uniform way to name and access files in an internet environment. It makes two important contributions. First, Jade is a logical system that integrates a heterogeneous collection of existing file systems, where heterogeneous means that the underlying file systems support different file access protocols. Jade is designed under the restriction that the underlying file system may not be modified. Second, rather than providing a global name space, Jade permits each user to define a private name space. These private name spaces support two novel features: they allow multiple file systems to be mounted under one directory, and they allow one logical name space to mount other logical name spaces. A prototype of the Jade File System was implemented on Sun Workstations running Unix. It consists of interfaces to the Unix file system, the Sun Network File System, the Andrew File System, and FTP. This paper motivates Jade's design, highlights several aspects of its implementation, and illustrates applications that can take advantage of its features.
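The private name space idea, with several file systems mounted under one directory, can be sketched as a toy model. This is not Jade's actual implementation; in particular, longest-matching-prefix resolution is an assumption made here for illustration.

```python
# Toy sketch of a per-user private name space in the spirit of Jade:
# a mount table maps logical prefixes to underlying file systems, and
# several backends may be mounted under one directory.

class NameSpace:
    def __init__(self):
        self.mounts = []   # (logical_prefix, backend) pairs

    def mount(self, prefix, backend):
        self.mounts.append((prefix, backend))
        # Longest prefix wins, so keep mounts sorted by length, descending.
        self.mounts.sort(key=lambda m: len(m[0]), reverse=True)

    def resolve(self, path):
        """Return (backend, remainder) for the longest matching mount."""
        for prefix, backend in self.mounts:
            if path.startswith(prefix):
                return backend, path[len(prefix):]
        raise FileNotFoundError(path)

ns = NameSpace()
ns.mount("/home", "nfs")         # Sun NFS
ns.mount("/home/papers", "afs")  # Andrew File System under the same directory
ns.mount("/pub", "ftp")          # FTP-backed subtree

print(ns.resolve("/home/papers/jade.ps"))  # → ('afs', '/jade.ps')
print(ns.resolve("/home/notes.txt"))       # → ('nfs', '/notes.txt')
```

Because the table is private to the user, two users can resolve the same logical path to entirely different underlying file systems, which is the second contribution the abstract highlights.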

An apparatus and methods for a magnetic field positioning system use a fundamentally different, and advantageous, signal structure and multiple access method, known as Code Division Multiple Access (CDMA). This signal architecture, when combined with processing methods, leads to advantages over the existing technologies, especially when applied to a system with a large number of magnetic field generators (beacons). Beacons at known positions generate coded magnetic fields, and a magnetic sensor measures a sum field and decomposes it into component fields to determine the sensor position and orientation. The apparatus and methods can have a large `building-sized` coverage area. The system allows for numerous beacons to be distributed throughout an area at a number of different locations. A method to estimate position and attitude, with no prior knowledge, uses dipole fields produced by these beacons in different locations.
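The decomposition of a summed field into per-beacon components can be illustrated with orthogonal spreading codes. This is a simplified one-axis sketch with made-up amplitudes; a real system measures vector fields and goes on to estimate position and attitude from the recovered components.

```python
# Hedged sketch of the CDMA idea: each beacon modulates its field with an
# orthogonal spreading code; correlating the summed measurement with each
# code recovers that beacon's field component at the sensor.

# Orthogonal codes (rows of a 4x4 Hadamard matrix).
codes = [
    [ 1,  1,  1,  1],
    [ 1, -1,  1, -1],
    [ 1,  1, -1, -1],
    [ 1, -1, -1,  1],
]

true_amplitudes = [0.5, 2.0, 0.0, 1.25]   # per-beacon field strengths (assumed)

# The sensor observes only the sum of the coded fields, sample by sample.
n = len(codes[0])
measured = [sum(a * c[k] for a, c in zip(true_amplitudes, codes))
            for k in range(n)]

# Decompose: correlate with each code and normalize by the code length.
recovered = [sum(m * c[k] for k, m in enumerate(measured)) / n for c in codes]
print(recovered)   # → [0.5, 2.0, 0.0, 1.25]
```

Orthogonality is what lets many beacons share the same volume simultaneously: each correlation cancels every other beacon's contribution exactly, which is the advantage over time-multiplexed beacon schemes.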

In this article we give an overview of the literature on multi-component maintenance optimization. We focus on work appearing since the 1991 survey "A survey of maintenance models for multi-unit systems" by Cho and Parlar. This paper builds on the review article by Dekker et al.

The physical layer of 5G cellular communications systems is designed to achieve better flexibility in an effort to support diverse services and user requirements. OFDM waveform parameters are enriched with flexible multi-numerology structures. This paper describes the differences between Long Term Evolution (LTE) systems and new radio (NR) from the flexibility perspective. Research opportunities for multi-numerology systems are presented in a structured manner. Finally, inter-numerology inter...

A multi-model approach for system diagnosis is presented in this paper. The relation to fault diagnosis as well as performance validation is considered. The approach is based on testing a number of pre-described models and finding which one is the best. It is an active approach, i.e. an auxiliary input is applied to the system. The multi-model approach is applied to a wind turbine system.

Multi-body systems are described from a system theory point of view. Various system theory concepts and research topics which have applicability to this class of systems are identified and briefly described. The subject of multi-body dynamics is presented in a vector space setting and is related to system theory concepts. (Author)

This paper considers the optimal element sequencing in a linear multi-state multiple sliding window system that consists of n linearly ordered multi-state elements. Each multi-state element can have different states, from complete failure up to perfect functioning, and a performance rate is associated with each state. A failure of type i occurs if, for any i (1 ≤ i ≤ I), the cumulative performance of any r_i consecutive elements is lower than w_i. The element sequence strongly affects the probability of each type of system failure: the sequence that minimizes the probability of one type of failure can yield a high probability of other types of failures. Therefore the optimization problem for the multiple sliding window system is essentially multi-objective. The paper formulates and solves the multi-objective optimization problem for multiple sliding window systems. A multi-objective Genetic Algorithm is used as the optimization engine. Illustrative examples are presented.
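The failure condition itself can be checked directly with a short sketch. The numbers are illustrative; the paper's probabilistic evaluation over element state distributions and its genetic-algorithm search are not reproduced here.

```python
# Illustrative check of the condition described above: a failure of type i
# occurs when the cumulative performance of any r_i consecutive elements
# drops below w_i. Performance rates below are made up for demonstration.

def fails(perf, r, w):
    """True if any window of r consecutive elements sums below w."""
    return any(sum(perf[j:j + r]) < w for j in range(len(perf) - r + 1))

perf = [3, 1, 0, 2, 4]            # current performance rates of the elements
print(fails(perf, r=2, w=2))      # window (1, 0) sums to 1 < 2 → True
print(fails(perf, r=3, w=3))      # smallest 3-window sum is 3 → False
```

Reordering `perf` changes both answers, which is precisely why the element sequence is the decision variable, and why minimizing one failure type can worsen another.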

Addressing the open problem of engineering normative open systems using the multi-agent paradigm, normative open systems are explained as systems in which heterogeneous and autonomous entities and institutions coexist in a complex social and legal framework that can evolve to address the different and often conflicting objectives of the many stakeholders involved. Presenting a software engineering approach which covers both the analysis and design of these kinds of systems, and which deals with the open issues in the area, ROMAS (Regulated Open Multi-Agent Systems) defines a specific multi-agent architecture, meta-model, methodology and CASE tool. This CASE tool is based on Model-Driven technology and integrates the graphical design with the formal verification of some properties of these systems by means of model checking techniques. Utilizing tables to enhance reader insights into the most important requirements for designing normative open multi-agent systems, the book also provides a detailed and easy t...

The US Department of Energy Office of Civilian Radioactive Waste Management is developing a Multi-Purpose Canister system to promote compatibility between the waste program elements of storage, transportation, and disposal. The development of a Multi-Purpose Canister system requires meeting various regulatory requirements, which are set forth in environmental and Nuclear Regulatory Commission (NRC) regulations. This paper discusses the more significant regulatory issues that must be addressed in the development of a Multi-Purpose Canister system by the Department of Energy.

A battery-operated, microcomputer-controlled monitoring device linked with a cordless telephone has been developed for remote measurements. This environmental sensor is self-contained and collects and processes data according to the information sent to its on-board computer system. An RCA model 1805 microprocessor forms the basic controller with a program encoded in memory for data acquisition and analysis. Signals from analog sensing devices used to monitor the environment are converted into digital signals and stored in random access memory of the microcomputer. This remote sensing system is linked to the laboratory by means of a cordless telephone whose base unit is connected to regular telephone lines. This offshore sensing system is simply accessed by a phone call originating from a computer terminal in the laboratory. Data acquisition is initiated upon request: Information continues to be processed and stored until the computer is reprogrammed by another phone call request. Information obtained may be recalled by a phone call after the desired environmental measurements are finished or while they are in progress. Data sampling parameters may be reset at any time, including in the middle of a measurement cycle. The range of the system is limited only by existing telephone grid systems and by the transmission characteristics of the cordless phone used as a communications link. This use of a cordless telephone, coupled with the on-board computer system, may be applied to other field studies requiring data transfer between an on-site analytical system and the laboratory.

Authorization systems today are increasingly complex. They span domains of administration, rely on many different authentication sources, and manage permissions that can be as complex as the system itself. Worse still, while there are many standards that define authentication mechanisms, the standards that address authorization are less well defined and tend to work only within homogeneous systems. This paper presents XACML, a standard access control language, as one component of a distributed and inter-operable authorization framework. Several emerging systems which incorporate XACML are discussed. These discussions illustrate how authorization can be deployed in distributed, decentralized systems. Finally, some new and future topics are presented to show where this work is heading and how it will help connect the general components of an authorization system.
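The attribute-matching core of such a policy language can be sketched in a few lines. This is a toy evaluator illustrating the idea only; it is not XACML's XML syntax, and it replaces XACML's rule-combining algorithms with an assumed first-match rule.

```python
# Toy attribute-based access control evaluator, in the spirit of a policy
# language such as XACML (illustrative; not the actual standard's syntax).
# Each rule matches request attributes and yields Permit or Deny.

RULES = [
    {"effect": "Permit", "match": {"role": "doctor", "action": "read"}},
    {"effect": "Deny",   "match": {"action": "delete"}},
]

def evaluate(request, rules=RULES, default="Deny"):
    """First matching rule wins; unmatched requests fall to deny-by-default."""
    for rule in rules:
        if all(request.get(k) == v for k, v in rule["match"].items()):
            return rule["effect"]
    return default

print(evaluate({"role": "doctor", "action": "read"}))    # → Permit
print(evaluate({"role": "nurse",  "action": "delete"}))  # → Deny
print(evaluate({"role": "nurse",  "action": "read"}))    # → Deny (default)
```

Because decisions depend only on attributes rather than on local user databases, the same policy can be evaluated consistently across the distributed, decentralized systems the paper describes.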

Consider the problem of maximizing the total power produced by a wind farm. Due to aerodynamic interactions between wind turbines, each turbine maximizing its individual power---as is the case in present-day wind farms---does not lead to optimal farm-level power capture. Further, there are no good models to capture the said aerodynamic interactions, rendering model based optimization techniques ineffective. Thus, model-free distributed algorithms are needed that help turbines adapt their power production on-line so as to maximize farm-level power capture. Motivated by such problems, the main focus of this dissertation is a distributed model-free optimization problem in the context of multi-agent systems. The set-up comprises of a fixed number of agents, each of which can pick an action and observe the value of its individual utility function. An individual's utility function may depend on the collective action taken by all agents. The exact functional form (or model) of the agent utility functions, however, are unknown; an agent can only measure the numeric value of its utility. The objective of the multi-agent system is to optimize the welfare function (i.e. sum of the individual utility functions). Such a collaborative task requires communications between agents and we allow for the possibility of such inter-agent communications. We also pay attention to the role played by the pattern of such information exchange on certain aspects of performance. We develop two algorithms to solve this problem. The first one, engineered Interactive Trial and Error Learning (eITEL) algorithm, is based on a line of work in the Learning in Games literature and applies when agent actions are drawn from finite sets. While in a model-free setting, we introduce a novel qualitative graph-theoretic framework to encode known directed interactions of the form "which agents' action affect which others' payoff" (interaction graph). We encode explicit inter-agent communications in a directed
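The model-free set-up can be illustrated with a deliberately simple sketch. This is not the eITEL algorithm: the welfare function below is a made-up stand-in for the unmeasurable aerodynamic coupling, and a greedy accept-if-better rule replaces the learning dynamics; agents query their utilities only as black boxes.

```python
# Minimal model-free sketch of the multi-agent problem (illustrative only):
# one agent at a time perturbs its action at random, and the joint change
# is kept only if the measured welfare (sum of utilities) improves.

import random

def welfare(actions):
    # Black-box stand-in: each agent's utility is coupled to its
    # neighbour's action, loosely analogous to wake interactions.
    n = len(actions)
    return sum(actions[i] * (1.0 - 0.5 * actions[(i + 1) % n])
               for i in range(n))

def trial_and_error(n_agents=4, steps=2000, seed=0):
    rng = random.Random(seed)
    actions = [0.5] * n_agents
    best = welfare(actions)
    for _ in range(steps):
        i = rng.randrange(n_agents)                  # one agent experiments
        trial = actions[:]
        trial[i] = min(1.0, max(0.0, trial[i] + rng.uniform(-0.1, 0.1)))
        value = welfare(trial)                       # measured, not modelled
        if value > best:                             # keep improvements only
            actions, best = trial, value
    return actions, best

actions, best = trial_and_error()
print(round(best, 3))
```

Note that no agent ever sees the functional form of `welfare`; it only observes numeric values after acting, which is the defining constraint of the model-free setting the dissertation studies.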

Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (CRM), (2) a regional-scale model, the NASA unified Weather Research and Forecasting model (WRF), and (3) a coupled CRM-GCM (general circulation model), known as the Goddard Multi-scale Modeling Framework or MMF. The same cloud-microphysical processes, long- and short-wave radiative transfer, and land-surface processes are applied in all of the models to study explicit cloud-radiation and cloud-surface interactive processes in this multi-scale modeling system. This modeling system has been coupled with a multi-satellite simulator for comparison and validation with NASA high-resolution satellite data.

This paper reviews the development and presents some applications of the multi-scale modeling system, including results from using the multi-scale modeling system to study the interactions between clouds, precipitation, and aerosols. In addition, use of the multi-satellite simulator to identify the strengths and weaknesses of the model-simulated precipitation processes will be discussed as well as future model developments and applications.

Murine models are used extensively in biological and translational research. For many of these studies it is necessary to access the vasculature for the injection of biologically active agents. Among the possible methods for accessing the mouse vasculature, tail vein injections are a routine but critical step for many experimental protocols. To perform successful tail vein injections, a high skill set and experience are required, leaving most scientists ill-suited to perform this task. This can lead to high variability between injections, which can impact experimental results. To allow more scientists to perform tail vein injections and to decrease the variability between injections, a vascular access system (VAS) that semi-automatically inserts a needle into the tail vein of a mouse was developed. The VAS uses near-infrared light, image processing techniques, computer-controlled motors, and a pressure feedback system to insert the needle and to validate its proper placement within the vein. The VAS was tested by injecting a commonly used radiolabeled probe (FDG) into the tail veins of five mice. These mice were then imaged using micro-positron emission tomography to measure the percentage of the injected probe remaining in the tail. These studies showed that, on average, the VAS leaves 3.4% of the injected probe in the tail. With these preliminary results, the VAS demonstrates the potential for improving the accuracy of tail vein injections in mice.

Currently, multi-terminal DC (MTDC) systems have more stations, each of which needs operators to monitor and control the devices; this implies heavy operation and maintenance effort, low efficiency, and low reliability. Most importantly, a multi-terminal DC system has a complex control mode: if one station has a problem, control of the whole system can be affected. Based on research into the characteristics of multi-terminal VSC-MTDC systems, this paper presents an implementation of a multi-terminal DC Supervisory Control and Data Acquisition (SCADA) system that supports networking, integration, and intelligent operation. A master control system is added in each station to communicate with the other stations and to send current and DC voltage values to the pole control system of each station. Based on the practical application and information feedback in the China Southern Power Grid research center VSC-MTDC project, the system achieves higher efficiency and saves converter station maintenance cost, improving the level of intelligence and the overall effect. Moreover, thanks to the master control system, a multi-terminal hierarchical coordination control strategy is formed, making the control and protection system more efficient and reliable.

The EC policy for research in the new millennium supports the development of European-scale research infrastructures. In this perspective, existing research infrastructures are going to be integrated with the objective of increasing their accessibility and enhancing the usability of their multidisciplinary data. Building up integrated Earth Science infrastructures in Europe is the mission of the Implementation Phase (IP) of the European Plate Observing System (EPOS) project (2015-2019). The integration of European multi-scale laboratories (analytical, experimental petrology and volcanology, magnetic, and analogue laboratories) plays a key role in this context and represents a specific task of EPOS IP. Within EPOS IP work package 16 (WP16), European geoscience multi-scale laboratories are to be linked, merging local infrastructures into a coherent and collaborative network. In particular, EPOS IP WP16 task 4, "Data services", aims to standardize data and data products, both already existing and newly produced by the participating laboratories, and to make them available through a new digital platform. The following data and repositories have been selected for this purpose: 1) analytical and property data a) on volcanic ash from explosive eruptions, of interest to the aviation industry, meteorological and government institutes; b) on magmas in the context of eruption and lava flow hazard evaluation; and c) on rock systems of key importance in mineral exploration and mining operations; 2) experimental data describing: a) rock and fault properties of importance for modelling and forecasting natural and induced subsidence, seismicity and associated hazards; b) rock and fault properties relevant for modelling the containment capacity of rock systems for CO2, energy sources and wastes; c) crustal and upper mantle rheology as needed for modelling sedimentary basin formation and crustal stress distributions; d) the composition, porosity, permeability, and

variations and dynamics, and energy system analysis, which fails to consider process integration synergies in local systems. The primary objective of the thesis is to derive a methodology for linking process design practices with energy system analysis, enabling coherent and holistic design optimization of flexible multi-generation systems. In addition, the case study results emphasize the importance of considering flexible operation, systematic process integration, and systematic assessment of uncertainties in the design optimization. It is recommended that future research focus on assessing system impacts from flexible multi-generation systems and performance improvements from storage options.

This paper presents an approach for accessibility categorization in areas where there is no extensive data available to run the conventional analysis. The presented method has the advantages of the lower data requirements and the utility of the results. Three benchmarks were selected to evaluate transit accessibility. The first one (transit coverage) investigates the spread of the service, and it is used to assess the percentage of people in a district that can access the service within a com...

National Aeronautics and Space Administration — Stellarray proposes the development of a highly novel Multi-Purpose X-ray Source and System (MPXS), for use on flight missions, space stations, planetary excursions...

National Aeronautics and Space Administration — The proposed Multi-Purpose X-ray Source and System (MPXS) can be used on flight missions, space stations, planetary excursions and planetary or asteroid bases, to...

In this paper, we concentrate on aspects related to the synthesis of distributed embedded systems consisting of programmable processors and application-specific hardware components. The approach is based on an abstract graph representation that captures, at process level, both dataflow and the flow of control. Our goal is to derive a worst case delay by which the system completes execution, such that this delay is as small as possible; to generate a logically and temporally deterministic schedule; and to optimize parameters of the communication protocol such that this delay is guaranteed. The approach generates an efficient bus access scheme as well as the schedule tables for activation of processes and communications.

The Electronic Health Record (EHR) is the core element of any e-health system, which aims at improving the quality and efficiency of healthcare through the use of information and communication technologies. The sensitivity of the data contained in the health record poses a great challenge to security. In this paper we propose a security architecture for EHR systems that conform to IHE profiles. In this architecture we tackle the problems of access control and privacy. Furthermore, a prototypical implementation of the proposed model is presented.

In this paper, a multi-agent system based wide area protection scheme is proposed in order to prevent long-term voltage instability induced cascading events. The distributed relays and controllers work as device agents which not only execute their normal functions automatically but also coordinate with one another. Simulation studies verify the effectiveness of the proposed protection strategy. The simulation results indicate that the proposed multi-agent control system can effectively coordinate the distributed relays and controllers to prevent long-term voltage instability induced cascading events.

The present work discusses the possibilities offered by the evolution of Information and Communication Technologies with the aim of designing a system to dynamically obtain knowledge of accessibility issues in urban environments. The system uses technology to analyse the urban user experience and movement accessibility, enabling accurate identification of urban barriers and monitoring of their status over time. Therefore, the main purpose of the system is to meet the real needs and requirements of people with movement disabilities. The information obtained can be provided as a support service for decision-making by city governments, institutions, researchers, professionals and other members of society in general to improve the liveability and quality of life of citizens. The proposed system is a means of social awareness that makes the most vulnerable groups of citizens visible by involving them as active participants. To implement the system, the latest communication and positioning technologies for smart sensing have been used, as well as the cloud computing paradigm. Finally, to validate the proposal, a case study is presented using the university environment as a pre-deployment step for urban environments.

Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (the Goddard Cumulus Ensemble model, GCE), (2) a regional-scale model (the NASA-unified Weather Research and Forecasting model, WRF), (3) a coupled CRM and global model (the Goddard Multi-scale Modeling Framework, MMF), and (4) a land modeling system. The same microphysical processes, long- and short-wave radiative transfer, land processes, and explicit cloud-radiation and cloud-land surface interactive processes are applied across this multi-scale modeling system. The modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of the cloud and precipitation processes simulated by the model. This talk reviews the development and applications of the multi-scale modeling system, with particular focus on the microphysics development and its performance within the system.

We describe the first DNA-based storage architecture that enables random access to data blocks and rewriting of information stored at arbitrary locations within the blocks. The newly developed architecture overcomes drawbacks of existing read-only methods that require decoding the whole file in order to read one data fragment. Our system is based on new constrained coding techniques and accompanying DNA editing methods that ensure data reliability, specificity and sensitivity of access, and at the same time provide exceptionally high data storage capacity. As a proof of concept, we encoded parts of the Wikipedia pages of six universities in the USA, and selected and edited parts of the text written in DNA corresponding to three of these schools. The results suggest that DNA is a versatile medium suitable for both ultrahigh-density archival and rewritable storage applications.
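
As a naive sketch of address-based random access, one can prefix each data block with an address and map bit pairs to nucleotides; a real system, as the abstract notes, would additionally apply constrained coding (e.g. to avoid homopolymer runs), which this illustrative mapping ignores. All names here are assumptions, not the paper's scheme:

```python
# Two bits per nucleotide; 8-bit address prefix per strand (illustrative only).
B2N = {"00": "A", "01": "C", "10": "G", "11": "T"}
N2B = {v: k for k, v in B2N.items()}

def encode_block(address, data):
    # Serialize address byte + payload bytes to a bit string, then to bases.
    bits = format(address, "08b") + "".join(format(b, "08b") for b in data)
    return "".join(B2N[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode_block(strand):
    # Recover the address and payload; the address allows selective readout
    # of one block without decoding the whole archive.
    bits = "".join(N2B[n] for n in strand)
    address = int(bits[:8], 2)
    data = bytes(int(bits[i:i + 8], 2) for i in range(8, len(bits), 8))
    return address, data
```

Random access then amounts to amplifying or sequencing only strands whose prefix matches the requested address.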

With the development of computer science and smart health-care technology, there is a trend for patients to receive medical care at home. Given the enormous number of users in a Smart Health-care System, access control is an important issue. Traditional access control models (discretionary access control, mandatory access control, and role-based access control) do not properly reflect the characteristics of a Smart Health-care System. This paper proposes an advanced access control model for...

More than 600 multi-planet systems are known. The vast majority of these systems have been discovered by NASA's Kepler spacecraft, but dozens were found using the Doppler technique, the first multi-exoplanet system was identified through pulsar timing, and the most massive system was found using imaging. More than one-third of the 4000+ planet candidates found by NASA's Kepler spacecraft are associated with target stars that have more than one planet candidate, and the large number of such Kepler "multis" tells us that flat multi-planet systems like our Solar System are common. Virtually all Kepler candidate multis are stable, as tested by numerical integrations that assume a physically motivated mass-radius relationship. Statistical studies performed on these candidate systems reveal a great deal about the architecture of planetary systems, including the typical spacing and flatness of orbits. The characteristics of several of the most interesting confirmed multi-exoplanet systems will also be discussed. HR 8799's four massive planets orbit tens of AU from their host star and travel on nearly circular orbits. PSR B1257+12 has three much smaller planets orbiting close to a neutron star. Both represent extremes and show that planet formation is a robust process that produces a diversity of outcomes. Although both exomoons and Trojan (triangular Lagrange point) planets have been searched for, neither has yet been found.

optimization method which is very robust and widely used to solve problems usually difficult to handle by traditional methods. Genetic algorithms (GAs) have been used in previous research and proved to be efficient in optimizing heat exchanger networks (HEN) (Dipama et al., 2008). HENs have therefore been synthesized to recover the maximum heat in an industrial process. The optimization problem formulated in the context of this work consists of a single objective, namely the maximization of energy recovery. The optimization algorithm developed in this thesis extends the ability of GAs by taking several objectives into account simultaneously. This algorithm innovates in the way optimal solutions are found, by partitioning the solution space into parallel grids called "watching corridors". These corridors specify areas (the observation corridors) in which the most promising feasible solutions are found and used to guide the search towards optimal solutions. A measure of the progress of the search is incorporated into the optimization algorithm to make it self-adaptive, through the use of appropriate genetic operators at each stage of the optimization process. The proposed method allows fast convergence and ensures a diversity of solutions. Moreover, it gives the algorithm the ability to overcome difficulties associated with optimization problems that have complex Pareto front landscapes (e.g., discontinuity, disjunction, etc.). The multi-objective optimization algorithm was first validated using numerical test problems from the literature as well as energy system optimization problems. Finally, it was applied to the optimization of the secondary loop of the Gentilly-2 nuclear power plant, and a set of solutions was found which allows the power plant to operate in optimal conditions. (Abstract shortened by UMI.)
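
The "watching corridors" partitioning itself is specific to the thesis, but the underlying multi-objective selection step of keeping only non-dominated solutions can be sketched minimally (two objectives, both maximized; the function name is illustrative):

```python
def pareto_front(points):
    # Keep points not dominated by any other point (maximizing both objectives).
    # q dominates p when q is at least as good in every objective.
    def dominated(p):
        return any(q != p and q[0] >= p[0] and q[1] >= p[1] for q in points)
    return [p for p in points if not dominated(p)]
```

The corridor technique would further partition this front into parallel grid bands to guide where the GA concentrates its search.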

Bi-partite entanglement in multi-qubit systems cannot be shared freely. The rules of quantum mechanics impose bounds on how multi-qubit systems can be correlated. In this paper, we utilize the concept of entangled graphs with weighted edges in order to analyse pure quantum states of multi-qubit systems. Here qubits are represented by vertices of the graph, while the presence of bi-partite entanglement is represented by an edge between the corresponding vertices. The weight of each edge is defined to be the entanglement between the two qubits connected by the edge, as measured by the concurrence. We prove that each entangled graph with entanglement bounded by a specific value of the concurrence can be represented by a pure multi-qubit state. In addition, we present a logic network with O(N^2) elementary gates that can be used for preparation of the weighted entangled graphs of N qubits.
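
The edge weights above are pairwise concurrences, which can be computed numerically with the standard Wootters formula applied to the two-qubit reduced state. The sketch below assumes a flat state vector with qubit 0 as the most significant index; it is a generic illustration, not the paper's construction:

```python
import numpy as np

def concurrence(rho):
    # Wootters concurrence of a two-qubit density matrix rho:
    # C = max(0, l1 - l2 - l3 - l4), with l_i the sorted square roots of the
    # eigenvalues of rho * (sy x sy) * conj(rho) * (sy x sy).
    sy = np.array([[0, -1j], [1j, 0]])
    Y = np.kron(sy, sy)
    R = rho @ Y @ rho.conj() @ Y
    lam = np.sort(np.sqrt(np.abs(np.linalg.eigvals(R))))[::-1]
    return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

def pairwise_concurrence(state, i, j, n):
    # Reduce an n-qubit pure state to the density matrix of qubits i and j,
    # then evaluate the concurrence (the weight of edge (i, j) in the graph).
    psi = np.asarray(state).reshape([2] * n)
    axes = [i, j] + [k for k in range(n) if k not in (i, j)]
    psi = np.transpose(psi, axes).reshape(4, -1)
    rho = psi @ psi.conj().T
    return concurrence(rho)
```

For the three-qubit W state every edge carries weight 2/3, while a Bell pair gives the maximal weight 1.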

In this paper we consider vulnerable systems which can have different states corresponding to different combinations of available elements composing the system. Each state can be characterized by a performance rate, which is the quantitative measure of a system's ability to perform its task. Both the impact of external factors (stress) and internal causes (failures) affect system survivability, which is determined as the probability of meeting a given demand. In order to increase the survivability of the system, multi-level protection is applied to its subsystems. This means that a subsystem and its inner level of protection are in turn protected by the protection of an outer level; this double-protected subsystem has its own outer protection, and so forth. In such systems, the protected subsystems can be destroyed only if all of the levels of their protection are destroyed, and each level of protection can be destroyed only if all of the outer levels of protection are destroyed. We formulate the problem of finding the structure of a series-parallel multi-state system (including the choice of system elements, the structure of the multi-level protection and the protection methods) that achieves a desired level of system survivability at minimal cost. An algorithm based on the universal generating function method is used for determination of the system survivability. A multi-processor version of a genetic algorithm is used as the optimization tool to solve the structure optimization problem. An application example is presented to illustrate the procedure.
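
The universal generating function (u-function) machinery can be sketched minimally by representing each element as a map from performance rate to probability; the series/parallel composition operators below are the textbook ones (min and sum), not the paper's implementation:

```python
def ugf_compose(u1, u2, op):
    # Combine two u-functions {performance: probability} via a structure
    # operator: sum for parallel elements, min for elements in series.
    out = {}
    for g1, p1 in u1.items():
        for g2, p2 in u2.items():
            g = op(g1, g2)
            out[g] = out.get(g, 0.0) + p1 * p2
    return out

def survivability(u, demand):
    # Probability that the (sub)system performance meets the demand.
    return sum(p for g, p in u.items() if g >= demand)
```

For example, two parallel elements each delivering 5 units with probability 0.9 are composed with `op=sum`, and the result is composed with a series element via `op=min`.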

Based on the multiDEVS formalism, we introduce multiPDEVS, a parallel and nonmodular formalism for discrete event system specification. This formalism provides the combined advantages of the PDEVS and multiDEVS approaches, such as excellent simulation capabilities for simultaneously scheduled events and components able to influence each other using exclusively their state transitions. We next show the soundness of the formalism by giving a construction showing that any multiPDEVS model is equivalent to a PDEVS atomic model. We then present the associated simulation procedure, usually called an abstract simulator. Since the formalism is well adapted to expressing cellular automata, we finally compare an implementation of the multiPDEVS formalism with a more classical Cell-DEVS implementation through a fire spread application.

This book is unique in presenting channels, techniques and standards for the next generation of MIMO wireless networks. Through a unified framework, it emphasizes how propagation mechanisms impact the system performance under realistic power constraints. Combining a solid mathematical analysis with a physical and intuitive approach to space-time signal processing, the book progressively derives innovative designs for space-time coding and precoding as well as multi-user and multi-cell techniques, taking into consideration that MIMO channels are often far from ideal. Reflecting developments

This paper introduces a multi-channel and high-speed time recorder system, originally designed for quantum cryptography experiments. The novelty of the system is that all the hardware logic is performed by a single FPGA. The system achieves several desirable features, such as simplicity, high resolution and high processing speed. (authors)

We study algorithmic problems in multi-stage open shop processing systems that are centered around reachability and deadlock detection questions. We characterize safe and unsafe system states. We show that it is easy to recognize system states that can be reached from the initial state (where the

The U.S. Geological Survey (USGS) Earth Resources Observation and Science (EROS) Center has implemented a number of updates to its suite of remotely sensed products and distribution systems. These changes will greatly expand the availability, accessibility, and usability of the image products from USGS. As of late 2017, several new datasets are available for public download at no charge from the USGS/EROS Center. These products include Multispectral Instrument (MSI) Level-1C data from the Sentinel-2B satellite, which was launched in March 2017. Along with Sentinel-2A, the Sentinel-2B images are now being distributed through USGS systems as part of a collaborative effort with the European Space Agency (ESA). The Sentinel-2 imagery is highly complementary to multispectral data collected by the USGS Landsat 7 and 8 satellites. With these two missions operating together, the potential local revisit interval can be reduced to 2-4 days. Another product addition is Resourcesat-2 data acquired over the United States by the Indian Space Research Organisation (ISRO). The Resourcesat-2 products from USGS consist of Advanced Wide Field Sensor (AWiFS) and Linear Imaging Self-Scanning Sensor Three (LISS-3) images acquired from August 2016 to the present. In an effort to maximize future Landsat data interoperability, including time series analysis of the 45+ year archive, the reprocessing of Collection 1 for all historical Landsat Level 1 products is nearly complete. The USGS is now working on the operational release of higher-level science products to support analysis of the Landsat archive at the pixel level. Major upgrades were also completed in 2017 for several USGS data discovery and access systems, including the LandsatLook Viewer (https://landsatlook.usgs.gov/) and the GloVis Tool (https://glovis.usgs.gov/). Other options are now being developed to further enhance data access and the overall user experience. These future options will be discussed and community feedback will be encouraged.

Nuclear power plants are complex multi-variable, dynamically interactive systems which employ many facets of systems and control theory in their analysis and design. Whole-plant mathematical models must be developed and validated; in addition to their obvious role in control system synthesis and design, they are also widely used for operational constraint and plant malfunction analysis. The need for and scope of an integrated power plant control system are discussed and, as a specific example, the design of an integrated feedwater regulator is reviewed. The multi-variable frequency response analysis employed in the design is described in detail. (author)

In this thesis, the author explores multi-source management problems for remote sensing data. The main idea is to use the mosaic dataset model, together with ways of integrating the display of imagery and its interpretation. Based on ArcGIS and the IMINT feature knowledge platform, the author used C# and other programming tools for development, so as to design and implement the function modules of a multi-source remote sensing data management system that is able to manage multi-source remote sensing data simply, conveniently and efficiently. (authors)

MARS (Multi-dimensional Analysis of Reactor Safety) code is being developed by KAERI for the realistic thermal-hydraulic simulation of light water reactor system transients. MARS 1.4 has been developed as a final version of basic code frame for the multi-dimensional analysis of system thermal-hydraulics. Since MARS 1.3, MARS 1.4 has been improved to have the enhanced code capability and user friendliness through the unification of input/output features, code models and code functions, and through the code modernization. Further improvements of thermal-hydraulic models, numerical method and user friendliness are being carried out for the enhanced code accuracy. As a multi-purpose safety analysis code system, a coupled analysis system, MARS/MASTER/CONTEMPT, has been developed using multiple DLL (Dynamic Link Library) techniques of Windows system. This code system enables the coupled, that is, more realistic analysis of multi-dimensional thermal-hydraulics (MARS 2.0), three-dimensional core kinetics (MASTER) and containment thermal-hydraulics (CONTEMPT). This paper discusses the MARS development program, and the developmental progress of the MARS 1.4 and the MARS/MASTER/CONTEMPT focusing on major features of the codes and their verification. It also discusses thermal hydraulic models and new code features under development. (author)

This paper describes an adaptive fuzzy sliding-mode control algorithm for controlling unknown or uncertain, multi-input multi-output (MIMO), possibly chaotic, dynamical systems. The control approach encompasses a fuzzy system and a robust controller. The fuzzy system is designed to mimic an ideal sliding-mode controller, and the robust controller compensates for the difference between the fuzzy controller and the ideal one. The parameters of the fuzzy system, as well as the uncertainty bound of the robust controller, are tuned adaptively. The adaptive laws are derived in the Lyapunov sense to guarantee the asymptotic stability and tracking of the controlled system. The effectiveness of the proposed method is shown by applying it to some well-known chaotic systems.
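
The non-adaptive core of such a controller, a sliding-mode law with a boundary layer applied to a double integrator, can be sketched as follows. The gains `lam`, `k` and `eps` are arbitrary illustrative choices, and the paper's fuzzy approximator and adaptive laws are omitted:

```python
import numpy as np

def smc_control(e, de, lam=2.0, k=5.0, eps=0.1):
    # Sliding surface s = de + lam*e; on the surface the error decays as
    # exp(-lam*t). The saturated switching term reduces chattering.
    s = de + lam * e
    return -k * np.clip(s / eps, -1.0, 1.0)

def simulate(x0=1.0, dx0=0.0, dt=0.01, steps=2000):
    # Double integrator x'' = u regulated to the origin (Euler integration).
    x, dx = x0, dx0
    for _ in range(steps):
        u = smc_control(x, dx)
        dx += u * dt
        x += dx * dt
    return x
```

After the reaching phase the state slides along s = 0 and the position converges exponentially to zero.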

This work shows a way to combine thermodynamic calculations and experiments in order to get useful information on the constitution of metal/non-metal systems. Many data from the literature are critically evaluated and used as a basis for experiments and calculations. The following multi-component systems are treated: 1. multi-component systems of 'ceramic' materials with partially metallic bonding (carbides, nitrides, oxides, borides, carbonitrides, borocarbides, oxinitrides of the 4th-8th transition group metals); 2. multi-component systems of non-metallic materials with dominant covalent bonding (SiC, Si3N4, SiB6, BN, Al4C3, Be2C); 3. multi-component systems of non-metallic materials with dominant heteropolar bonding (Al2O3, TiO2, BeO, SiO2, ZrO2). The interactions between 1. and 2., 2. and 3., and 1. and 3. are also considered. The latest commercially available programmes for the calculation of thermodynamic equilibria and phase diagrams are evaluated and compared with regard to their facilities and limits. New phase diagrams are presented for many previously unknown multi-component systems; partly known systems are completed on the basis of selected thermodynamic data. The calculations are verified by experimental investigations (metallurgical and powder technology methods). Altogether 690 systems are evaluated, 126 are calculated for the first time and 52 systems are experimentally verified. New data for 60 ternary phases are elaborated by estimating the limits for the Gibbs energy values. A synthesis of critical evaluation of the literature, calculations and experiments leads to important new information about equilibria and reaction behaviour in multi-component systems. This information is necessary to develop new stable and metastable materials. (orig./MM)

In this article the theory of multibunch feedback systems is developed in a rigorous way, including the fact that the elements of feedback systems are localized in the ring. The results of the theory, which can be used for any strength of the systems, form the basis for the multibunch feedback systems for PETRA and HERA, already tested successfully in PETRA. (orig.)

The internet has provided us with a high-bandwidth, low-latency, globally connected network in which to rapidly share real-time data from sensors, reports, and imagery. This data has also become ever easier to obtain, consume and analyze. Another aspect of the internet has been the increased approachability of complex systems through lightweight interfaces, with additional complex services able to provide more advanced connections into data services. These analyses and discussions have primarily been siloed within single domains, or kept out of the reach of amateur scientists and interested citizens. However, through more open access to analytical tools and data, experts can collaborate with citizens to gather information, provide interfaces for experimenting and querying results, and help make improved insights and feedback for further investigation. For example, farmers in Uganda are able to use their mobile phones to query, analyze, and be alerted to banana crop disease based on agricultural and climatological data. In the U.S., local groups use online social media sharing sites to gather data on storm-water runoff and stream siltation in order to alert wardens and environmental agencies. This talk will present various web-based geospatial visualization and analysis techniques and tools, such as Google Earth and GeoCommons, that have emerged to provide for collaboration between experts of various domains as well as between experts, government, and citizen scientists. Through increased communication and the sharing of data and tools, it is possible to gain broad insight and to develop joint, working solutions to a variety of difficult scientific and policy related questions.

In a multi-agent system where each agent has a different goal (or even where a team of agents shares the same goal), agents must be able to resolve conflicts arising in the process of achieving their goals. Many researchers have presented methods for conflict resolution, e.g., reinforcement learning (RL), but conventional RL requires a large computation cost because every agent must learn; at the same time, the overlap of actions selected by each agent results in local conflict. Therefore, in this paper we propose a novel method to solve these problems. In order to deal with conflict within a multi-agent system, the concept of a potential field function based action selection priority level (ASPL) is introduced. In this method, all the environmental factors that may influence the priority are computed with the potential field function, so the priority for access to a local resource can be decided rapidly. By avoiding the complex coordination mechanisms used in general multi-agent systems, conflicts in the multi-agent system are settled more efficiently. Our system consists of an RL with ASPL module and a generalized rules module. Using ASPL, the RL module chooses a proper cooperative behavior, and the generalized rules module can accelerate the learning process. By applying the proposed method to robot soccer, the learning process is accelerated. The results of simulation and real experiments indicate the effectiveness of the method.
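
A potential-field priority of this flavor can be sketched as an attractive term toward the contested resource minus a repulsive crowding term from nearby agents; the function and weight names below are illustrative assumptions, not the paper's ASPL definition:

```python
import math

def aspl_priority(agent, resource, others, w_attract=1.0, w_crowd=0.5):
    # Attractive potential grows as the agent nears the resource; a repulsive
    # crowding term lowers priority when other agents are already close by,
    # so overlapping action choices (local conflicts) become less likely.
    attract = w_attract / (1.0 + math.dist(agent, resource))
    crowd = sum(1.0 / (1.0 + math.dist(o, resource)) for o in others)
    return attract - w_crowd * crowd
```

Each agent evaluates this scalar locally, and the agent with the highest priority accesses the resource, avoiding an explicit negotiation round.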

Transit ridership may be sensitive to fares, travel times, waiting times, and access times, among other factors. Thus, : elastic demands are considered in formulations for maximizing the system welfare for conventional and flexible bus : services. Tw...

Abstract Background: This paper reports a simple 2-D system for electrical impedance tomography (EIT) which works efficiently and is low cost. The system was developed at the Sharif University of Technology (Tehran, Iran) for the author's MSc project. Methods: The EIT system consists of a PC in which an I/O card is installed, with an external current generator, a multiplexer, a power supply and a phantom with an array of electrodes. The measurement system provides 12-bit accuracy and, hence, suitable data acquisition software has been prepared accordingly. The synchronous phase detection method has been implemented for voltage measurement. Different methods of image reconstruction have been used with this instrument to generate electrical conductivity images. Results: The results of simulation and real measurements with the system are presented. The reconstruction programs were written in MATLAB and the data acquisition software in C++. The system has been tested in both static and dynamic mode in a 2-D domain. Better results were produced in the dynamic mode of operation, due to the cancellation of errors. Conclusion: In the spirit of open access publication, the design details of this simple EIT system are made available here.
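
Synchronous phase detection amounts to a software lock-in: multiply the measured voltage by quadrature references at the excitation frequency and low-pass filter. A minimal sketch, using a simple mean as the low-pass stage (parameter names illustrative, not the paper's implementation):

```python
import numpy as np

def lockin_amplitude(signal, f, fs):
    # Demodulate with quadrature references at frequency f; the mean acts as
    # a low-pass filter and is exact over an integer number of periods,
    # rejecting noise and interference at other frequencies.
    t = np.arange(len(signal)) / fs
    i = np.mean(signal * np.cos(2 * np.pi * f * t))
    q = np.mean(signal * np.sin(2 * np.pi * f * t))
    return 2.0 * np.hypot(i, q)
```

The phase of the measured voltage relative to the injected current is likewise available as `atan2(i, q)`.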

Discrepancies in socioeconomic factors have been associated with higher rates of perforated appendicitis. As an equal-access health care system theoretically removes these barriers, we aimed to determine if remaining differences in demographics, education, and pay result in disparate outcomes in the rate of perforated appendicitis. All patients undergoing appendectomy for acute appendicitis (November 2004-October 2009) at a tertiary care equal access institution were categorized by demographics and perioperative data. Rank of the sponsor was used as a surrogate for economic status. A multivariate logistic regression model was performed to determine patient and clinical characteristics associated with perforated appendicitis. A total of 680 patients (mean age 30±16 y; 37% female) were included. The majority were Caucasian (56.4% [n=384]; African Americans 5.6% [n=38]; Asians 1.9% [n=13]; and other 48.9% [n=245]) and enlisted (87.2%). Overall, 6.4% presented with perforation, with rates of 6.6%, 5.8%, and 6.7% (P=0.96) for officers, enlisted soldiers, and contractors, respectively. There was no difference in perforation when stratified by junior or senior status for either officers or enlisted (9.3% junior versus 4.40% senior officers, P=0.273; 6.60% junior versus 5.50% senior enlisted, P=0.369). On multivariate analysis, parameters such as leukocytosis and temperature, as well as race and rank were not associated with perforation (P=0.7). Only age had a correlation, with individuals aged 66-75 y having higher perforation rates (odds ratio, 1.04; 95% confidence interval, 1.02-1.05; P<0.001). In an equal-access health care system, older age, not socioeconomic factors, correlated with increased appendiceal perforation rates. Published by Elsevier Inc.

Specifying the performance of audio amplifiers is typically done by playing sine waves into a purely ohmic load. However, real loudspeaker impedances are not purely ohmic but are characterised by the loudspeaker's electrical, mechanical and acoustical properties. Therefore, a loudspeaker emulator capable of adjusting its impedance to that of a given loudspeaker is desired for measurement purposes. An adjustable RLC-based emulator is implemented with switch-controlled capacitors, air-gap-controlled inductors and potentiometers. Calculations and experimental results are compared and show that it is possible to emulate the infinite baffle, closed box and multi-resonant vented box loudspeaker impedances by tuning the component values in the proposed circuit. Future work is outlined, and it is suggested that the proposed impedance emulator be used as part of a control circuit in a switch-mode based...

Access control in multi-domain environments is an important question in building coalitions between domains. Based on the RBAC model and the concept of secure domains, role delegation and role mapping are proposed, which support third-party authorization. A distributed RBAC model is then presented. Finally, implementation issues are discussed.
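
A toy sketch of cross-domain role mapping is shown below. All role and permission names are hypothetical, and real inter-domain RBAC would also enforce delegation constraints that are omitted here:

```python
def can_access(user_roles, role_perms, perm, role_map=None):
    # role_map translates roles granted in a foreign domain into this
    # domain's local roles; access is granted if any effective role
    # carries the requested permission.
    effective = set(user_roles)
    if role_map:
        effective |= {role_map[r] for r in user_roles if r in role_map}
    return any(perm in role_perms.get(r, set()) for r in effective)
```

Without an agreed mapping, a foreign role confers no local permissions, which is the safe default for coalition building.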

When a new user accesses a CDMA system, the load changes drastically; therefore, advanced outer loop power control (OLPC) technology has to be adopted to adjust the target signal-to-interference ratio (SIR) and improve system performance. Existing problems with DS-CDMA outer loop power control for multi-service operation are introduced and the power control theoretical model is analyzed. System simulation is adopted to obtain the theoretical performance and parameter optimization of the power control algorithm. The OLPC algorithm is improved and performance comparisons between the old algorithm and the improved algorithm are given. The results show the good performance of the improved OLPC algorithm and prove the validity of the improved method for multi-service operation.
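
The classic "jump" (sawtooth) outer-loop update, one widely used OLPC rule though not necessarily the exact algorithm improved in this work, adjusts the target SIR once per received block:

```python
def olpc_update(target_sir_db, block_error, step_down=0.05, target_bler=0.01):
    # Raise the target sharply on a block error, lower it slowly otherwise.
    # The step ratio is chosen so that, at equilibrium, errors occur at
    # roughly the target block error rate (BLER).
    step_up = step_down * (1.0 - target_bler) / target_bler
    return target_sir_db + (step_up if block_error else -step_down)
```

With `target_bler=0.01`, one error followed by 99 error-free blocks leaves the target unchanged on average, which is exactly the 1% operating point.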

The next generation of surveillance and multimedia systems will be increasingly deployed as wireless sensor networks in order to monitor parks and public places and for business usage. The convergence of data and telecommunication over IP-based networks has paved the way for wireless networks, and functions are becoming more intertwined by the compelling force of innovation and technology. For example, many closed-circuit TV premises surveillance systems now rely on transmitting their images and da...

National Oceanic and Atmospheric Administration, Department of Commerce — These data are the result of a multi-beam echosounder survey conducted in the OHAPC by the M/V Liberty Star in October 2002. Two forms of data are available: 1)...

A recent study by the National Telecommunications and Information Administration (NTIA) has concluded that the 21st century will be the age of information in which the telecommunication infrastructure will be vital to the social and economic well being of society. To meet the challenge of the coming age, JPL has been performing studies on a personal access satellite system (PASS) for the 21st century. The PASS study can be traced back to a study in which the technical feasibility and potential applications of a high frequency, low data rate satellite system were identified using small fixed terminals. Herein, the PASS concept is described along with the strawman design. Then the key challenges are identified along with possible solutions. Finally, the plan for the future is summarized from the key results.

The use of proxy caches has been extensively studied in the HEP environment for efficient access of database data and showed significant performance gains with only very moderate operational effort at higher grid tiers (T2, T3). In this contribution we propose to apply the same concept to the area of file access and analyse the possible performance gains, the operational impact on site services and the applicability to different HEP use cases. Based on proof-of-concept studies with a modified XROOT proxy server, we review the cache efficiency and overheads for access patterns of typical ROOT-based analysis programs. We conclude with a discussion of the potential role of this new component at the different tiers of a distributed computing grid.
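The cache-efficiency question can be illustrated with a toy least-recently-used file cache; the capacity and file names below are hypothetical, and a real XROOT proxy caches blocks and byte ranges rather than whole file names.

```python
from collections import OrderedDict

class LRUFileCache:
    """Toy proxy cache: fixed capacity, least-recently-used eviction."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.files = OrderedDict()
        self.hits = 0
        self.requests = 0

    def access(self, name):
        self.requests += 1
        if name in self.files:
            self.files.move_to_end(name)    # refresh recency on a hit
            self.hits += 1
            return True                     # served from the cache
        if len(self.files) >= self.capacity:
            self.files.popitem(last=False)  # evict least recently used
        self.files[name] = True
        return False                        # fetched from remote storage

    def hit_rate(self):
        return self.hits / self.requests
```

For a repeated ROOT-analysis pattern (the same input files re-read pass after pass), such a cache converges to a high hit rate once the working set fits within `capacity`.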

Government actors have an important role to play in creating healthy public policies and supportive environments to facilitate access to safe, affordable, nutritious food. The purpose of this research was to examine Waterloo Region (Ontario, Canada) as a case study for "what works" with respect to facilitating access to healthy, local food through regional food system policy making. Policy and planning approaches were explored through multi-sectoral perspectives of: (a) the development and adoption of food policies as part of the comprehensive planning process; (b) barriers to food system planning; and (c) the role and motivation of the Region's public health and planning departments in food system policy making. Forty-seven in-depth interviews with decision makers, experts in public health and planning, and local food system stakeholders provided rich insight into strategic government actions, as well as the local and historical context within which food system policies were developed. Grounded theory methods were used to identify key overarching themes including: "strategic positioning", "partnerships" and "knowledge transfer" and related sub-themes ("aligned agendas", "issue framing", "visioning" and "legitimacy"). A conceptual framework to illustrate the process and features of food system policy making is presented and can be used as a starting point to engage multi-sectoral stakeholders in plans and actions to facilitate access to healthy food.

A four-channel automotive night vision system is designed and developed. It consists of four active near-infrared cameras and a multi-channel image processing and display unit; the cameras are placed at the front, left, right and rear of the automobile. The system uses a near-infrared laser light source: the laser beam is collimated, and the source contains a thermoelectric cooler (TEC). The source can be synchronized with camera focusing and has automatic light-intensity adjustment, which ensures image quality. The composition of the system is described in detail; on this basis, beam collimation, the LD driving and LD temperature control of the near-infrared laser source, and the four-channel image processing and display are discussed. The system can be used for driver assistance, blind-spot information (BLIS), parking assistance and alarm systems, by day and night.

Multi-agent systems are promising as models of organization because they are based on the idea that most work in human organizations is done through intelligence, communication, cooperation, and massive parallel processing. They offer an alternative to systems theories of organization, which are

We provide a brief description of our Python-DTU system, including the overall design, the tools and the algorithms that we used in the Multi-Agent Programming Contest 2012, where the scenario was called Agents on Mars like in 2011. Our solution is an improvement of our Python-DTU system from last...


A multi-sensor measurement system for robotic drilling is presented. The system enables a robot to measure its 6D pose with respect to the workpiece and to establish a reference coordinate system for drilling. The robot approaches the drill point and performs an orthogonal alignment with the workpiece. Although the measurement systems are readily capable of achieving high position accuracy and low deviation from perpendicularity, experiments show that inaccuracies in the robot's 6D-pose and e...

We present an approach to schedulability analysis for the synthesis of multi-cluster distributed embedded systems consisting of time-triggered and event-triggered clusters interconnected via gateways. We have also proposed a buffer size and worst-case queuing delay analysis for the gateways, which are responsible for routing inter-cluster traffic. Optimization heuristics for the priority assignment and synthesis of bus access parameters, aimed at producing a schedulable system with minimal buffer needs, have been proposed. Extensive experiments and a real-life example show the efficiency of our approaches.

An approach to schedulability analysis for the synthesis of multi-cluster distributed embedded systems consisting of time-triggered and event-triggered clusters, interconnected via gateways, is presented. A buffer size and worst-case queuing delay analysis for the gateways, which are responsible for routing inter-cluster traffic, is also proposed. Optimisation heuristics for the priority assignment and synthesis of bus access parameters, aimed at producing a schedulable system with minimal buffer needs, have been proposed. Extensive experiments and a real-life example show the efficiency of the approaches.

Over the past 30 years the Marlborough Family Service in London has pioneered multi-family work with marginalized families presenting simultaneously with abuse and neglect, family violence, substance misuse, educational failure and mental illness. The approach is based on a systemic, multi-contextual model, and this chapter describes the evolving work, including the establishment of the first permanent multiple-family day setting specifically designed for and solely dedicated to the work with seemingly ‘hopeless’ families. The ingredients of ‘therapeutic assessments’ of parents and families are outlined, and the importance of initial network meetings with professionals and family members is emphasized.

This paper deals with the problem of secure communication based on multi-input multi-output (MIMO) chaotic systems. Single-input secure communication based on chaos can easily be extended to multiple inputs by various combination techniques; however, all the combined inputs then carry the same risk of being broken. To reduce this risk, a new secure communication scheme based on chaos with MIMO is discussed in this paper. Moreover, since the amplitude of messages in traditional schemes is limited because it would affect the quality of synchronization, the proposed scheme is also improved into an amplitude-independent one.
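A standard single-input chaotic-masking pair in the Cuomo-Oppenheim style (a building block for, but not identical to, the MIMO scheme above) can be sketched as a drive-response Lorenz system; the parameters, step size and initial states are illustrative.

```python
import numpy as np

def lorenz_masking(n_steps=40000, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Drive-response Lorenz synchronization via the transmitted x signal.

    The receiver substitutes the received signal s for its own x state in
    the y and z equations, which makes the response subsystem contract
    onto the transmitter's trajectory; a masked message would be
    recovered as s - xr.  Returns the final synchronization error |x - xr|.
    """
    x, y, z = 1.0, 1.0, 1.0       # transmitter state
    xr, yr, zr = -5.0, 3.0, 9.0   # receiver starts far away
    for _ in range(n_steps):
        s = x                     # transmitted signal (message set to 0 here)
        # transmitter (Euler step)
        dx = sigma * (y - x); dy = x * (rho - z) - y; dz = x * y - beta * z
        # receiver, driven by s instead of its own xr
        dxr = sigma * (yr - xr); dyr = s * (rho - zr) - yr; dzr = s * yr - beta * zr
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        xr, yr, zr = xr + dt * dxr, yr + dt * dyr, zr + dt * dzr
    return abs(x - xr)
```

After the transient the receiver locks onto the transmitter, so a small message added to the transmitted signal could be recovered at the receiver as `s - xr`.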

We have developed and installed a Medical Image Access System in an intensive care unit. Images are acquired and transmitted automatically to this system, thus expanding on the previous results of Shile et al. Our goal is to determine what effect regular, sustained availability of image data in the clinic has on the Intensive Care Unit and the Department of Radiology. Our system is installed and has been in regular use in the hospital since late August of 1993. Since the time of installation we have been collecting usage information from both the manual and automated systems. From these data we perform the standard measures established by DeSimone et al. Our initial results support the original findings that image availability in the clinic leads to earlier patient-care decisions based on the image data. However, our findings do not seem to indicate that there is a breakdown of communication between the clinician and the radiologist as a result of the use of the clinical display system. In addition to the established measures, we are investigating other criteria to measure time saved by both the clinician and the radiologist. The results are reported in this paper.

A medical imaging system provides simultaneous rendering of visible light and fluorescent images. The system may employ dyes in a small-molecule form that remain in the subject's blood stream for several minutes, allowing real-time imaging of the subject's circulatory system superimposed upon a conventional, visible light image of the subject. The system may provide an excitation light source to excite the fluorescent substance and a visible light source for general illumination within the same optical guide used to capture images. The system may be configured for use in open surgical procedures by providing an operating area that is closed to ambient light. The systems described herein provide two or more diagnostic imaging channels for capture of multiple, concurrent diagnostic images and may be used where a visible light image may be usefully supplemented by two or more images that are independently marked for functional interest.

Persons who have undergone a laryngectomy have a few options to partially restore speech, but no completely satisfactory device. Even though use of an electrolarynx (EL) is the easiest way for a patient to produce speech, it does not produce a natural tone and its appearance is far from normal. Because of that, and the fact that none of them are hands-free, the feasibility of using a motion sensor to replace a conventional EL user interface has been explored. A mobile-device motion sensor with a multi-agent platform has been used to investigate on/off and pitch frequency control capability. A very small battery-operated ARM-based control unit has also been developed to evaluate the motion-sensor-based user interface. This control unit is placed on the wrist and the vibration device against the throat using a support bandage. Two different methods were used for the forearm tilt angle to pitch frequency conversion: a linear mapping method and an F0 template-based method. A perceptual evaluation was performed with two well-trained normal speakers and ten subjects. The results of the evaluation showed that both methods are able to produce better speech quality in terms of naturalness.
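The linear mapping method can be sketched as a straight-line conversion from forearm tilt angle to pitch frequency; the angle and frequency ranges below are illustrative assumptions, not the values used in the study.

```python
def tilt_to_pitch(tilt_deg, tilt_min=-45.0, tilt_max=45.0, f0_min=80.0, f0_max=160.0):
    """Linear mapping from forearm tilt angle (degrees) to EL pitch (Hz).

    Angles outside the working range are clamped so the pitch always
    stays within [f0_min, f0_max].
    """
    t = max(tilt_min, min(tilt_max, tilt_deg))
    frac = (t - tilt_min) / (tilt_max - tilt_min)
    return f0_min + frac * (f0_max - f0_min)
```

The F0 template-based alternative would replace this straight line with a lookup into stored natural pitch contours indexed by the same tilt signal.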

The evolution of laser fusion systems is presented, from a description of the basic principles of the laser in 1959, through a physical demonstration of 1000 watts of peak optical power in 1961, to present systems that deliver 10^14 watts of peak optical power. Physical limits to large systems are reviewed: thermal limits, material stress limits, structural limits and stability, parasitic coupling, measurement precision and diagnostics. The various steps of the fusion laser-system development process are then discussed through an historical presentation. 3 figs., 8 refs

Nowadays, fog computing provides computation, storage, and application services to end users in the Internet of Things. One of the major concerns in fog computing systems is how fine-grained access control can be imposed. As a logical combination of attribute-based encryption and attribute-based signature, attribute-based signcryption (ABSC) can provide confidentiality and anonymous authentication for sensitive data and is more efficient than the traditional "encrypt-then-sign" or "sign-then-encrypt" strategies. Thus, ABSC is suitable for fine-grained access control in a semi-trusted cloud environment and has been gaining more and more attention recently. However, in many existing ABSC systems, the computation cost required of end users in signcryption and designcryption grows linearly with the complexity of the signing and encryption access policies. Moreover, only a single authority responsible for attribute management and key generation exists in previously proposed ABSC schemes, whereas in reality different authorities usually monitor different attributes of the user. In this paper, we propose OMDAC-ABSC, a novel data access control scheme based on ciphertext-policy ABSC, to provide data confidentiality, fine-grained control, and anonymous authentication in a multi-authority fog computing system. The signcryption and designcryption overhead for the user is significantly reduced by outsourcing the undesirable computation operations to fog nodes. The proposed scheme is proven secure in the standard model and can provide attribute revocation and public verifiability. The security analysis, asymptotic complexity comparison, and implementation results indicate that our construction balances the security goals with practical efficiency in computation.
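The kind of monotone access policy that attribute-based schemes enforce cryptographically can be illustrated, without any cryptography, as a policy-tree check; the attribute names are hypothetical.

```python
def satisfies(policy, attributes):
    """Evaluate a monotone access policy tree against a user's attribute set.

    A policy is either an attribute name (string) or a tuple
    ("AND" | "OR", [subpolicies]).
    """
    if isinstance(policy, str):
        return policy in attributes
    op, children = policy
    results = [satisfies(c, attributes) for c in children]
    return all(results) if op == "AND" else any(results)
```

In ABSC the policy is embedded in the ciphertext or signature so that satisfaction is checked implicitly during designcryption, rather than by an explicit function like this one.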

A modern controller was designed using the mathematical model of a multi-zone thermal plate system. An important requirement for this type of controller is that it must be able to keep the temperature set-point of each thermal zone. The mathematical model used in the design was determined through a system identification process. The results showed that when the operating condition changes, the performance of the controller may degrade as a result of system parameter uncertainties. This paper proposes a weighting technique that combines the robust model predictive controllers for each operating condition into a single robust multi-model predictive control. Simulation and experimental results showed that the proposed method performed better than the conventional multi-model predictive control in the rise time of the transient response when used in a system designed to work over a wide range of operating conditions.
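The weighting idea can be illustrated with a toy blend of per-model control actions, weighted by how close the current operating point is to each model's nominal point; the inverse-distance weighting here is an assumption for illustration, not the robust MPC weighting of the paper.

```python
def blend_controls(u_models, operating_point, model_points, eps=1e-9):
    """Blend per-model control actions by closeness of the current
    operating point to each model's nominal operating point.

    u_models:     control action proposed by each local model's controller.
    model_points: nominal operating point of each local model.
    Returns the blended action and the normalized weights.
    """
    w = [1.0 / (abs(operating_point - p) + eps) for p in model_points]
    s = sum(w)
    w = [wi / s for wi in w]
    return sum(wi * ui for wi, ui in zip(w, u_models)), w
```

At a nominal point the matching controller dominates; between points the actions blend smoothly, which is the behavior a single multi-model controller is meant to reproduce.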

Observational surveys with Kepler and other telescopes have shown that multi-planet systems are very numerous. Considering the secular dynamics of multi-planet systems provides substantial insight into the interactions between planets in those systems. Since the underlying secular structure of a multi-planet system (the secular eigenmodes) can be calculated using only the planets' masses and semi-major axes, one can elucidate the eccentricity and inclination behavior of planets in those systems even without knowing the planets' current eccentricities and inclinations. We have calculated both the eccentricity and inclination secular eigenmodes for the population of known multi-planet systems whose planets have well-determined masses and periods. We will discuss the commonality of dynamically grouped planets ('groupies') vs dynamically uncoupled planets ('loners'), and compare to what would be expected from randomly generated systems with the same overall distribution of masses and semi-major axes. We will also discuss the occurrence of planets that strongly influence the behavior of other planets without being influenced by those others ('overlords'). Examples will be given and general trends will be discussed.
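A minimal Laplace-Lagrange sketch of the eccentricity eigenmodes (planar, lowest order, following classical secular theory) shows how the eigenfrequencies follow from masses and semi-major axes alone; units take G times the stellar mass as 1, and the quadrature resolution is arbitrary.

```python
import numpy as np

def laplace_coeff(j, alpha, s=1.5, n=20001):
    """Laplace coefficient b_s^(j)(alpha) by quadrature over the periodic kernel."""
    psi = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    integrand = np.cos(j * psi) / (1.0 - 2.0 * alpha * np.cos(psi) + alpha**2) ** s
    return integrand.sum() * (2.0 * np.pi / n) / np.pi

def eccentricity_eigenfrequencies(masses, a, m_star=1.0):
    """Secular eccentricity eigenfrequencies from the Laplace-Lagrange matrix.

    masses: planet masses in units of the stellar mass; a: semi-major axes.
    Returns the eigenvalues of the secular matrix A (frequencies in the
    G*M_star = 1 unit system); the eigenvectors give the eigenmodes.
    """
    n_pl = len(masses)
    mean_motion = np.asarray(a, float) ** -1.5
    A = np.zeros((n_pl, n_pl))
    for j in range(n_pl):
        for k in range(n_pl):
            if j == k:
                continue
            alpha = min(a[j], a[k]) / max(a[j], a[k])
            abar = alpha if a[k] > a[j] else 1.0  # external vs internal perturber
            pref = 0.25 * mean_motion[j] * masses[k] / (m_star + masses[j])
            A[j, j] += pref * alpha * abar * laplace_coeff(1, alpha)
            A[j, k] = -pref * alpha * abar * laplace_coeff(2, alpha)
    return np.linalg.eigvals(A)
```

The eigenvectors of the same matrix characterize the modes: strongly mixed eigenvectors correspond to dynamically coupled planets ('groupies'), nearly pure ones to 'loners'.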

Cache-enabled base station (BS) densification, denoted as a fog radio access network (F-RAN), is foreseen as a key component of 5G cellular networks. F-RAN enables storing popular files at the network edge (i.e., BS caches), which empowers local

Highlights: • We developed "predictive modeling of coupled multi-physics systems (PMCMPS)". • PMCMPS reduces uncertainties in predicted model responses and parameters. • PMCMPS treats very large coupled systems efficiently. - Abstract: This work presents an innovative mathematical methodology for "predictive modeling of coupled multi-physics systems (PMCMPS)." This methodology fully takes into account the coupling terms between the systems but requires only the computational resources that would be needed to perform predictive modeling on each system separately. The PMCMPS methodology uses the maximum entropy principle to construct an optimal approximation of the unknown a priori distribution based on a priori known mean values and uncertainties characterizing the parameters and responses of both multi-physics models. This "maximum entropy" approximate a priori distribution is combined, using Bayes' theorem, with the "likelihood" provided by the multi-physics simulation models. Subsequently, the posterior distribution thus obtained is evaluated using the saddle-point method to obtain analytical expressions for the optimally predicted values of the multi-physics models' parameters and responses, along with correspondingly reduced uncertainties. Noteworthy, the predictive modeling methodology for the coupled systems is constructed such that the systems can be considered sequentially rather than simultaneously, while preserving exactly the same results as if the systems were treated simultaneously. Consequently, very large coupled systems, which could perhaps exceed available computational resources if treated simultaneously, can be treated with the PMCMPS methodology presented in this work sequentially and without any loss of generality or information, requiring just the resources that would be needed if the systems were treated sequentially.
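The sequential-equals-simultaneous property has a familiar linear-Gaussian analogue: a Bayes update with two measurement blocks gives the same posterior whether the blocks are applied jointly or one after the other. A minimal numpy sketch of that analogue (not the maximum-entropy/saddle-point machinery of the paper):

```python
import numpy as np

def gaussian_update(mean, cov, H, R, y):
    """Linear-Gaussian Bayes update in information form.

    Prior N(mean, cov), measurement y = H x + noise with noise cov R;
    returns the posterior mean and covariance.
    """
    P_inv = np.linalg.inv(cov) + H.T @ np.linalg.inv(R) @ H
    P = np.linalg.inv(P_inv)
    m = P @ (np.linalg.inv(cov) @ mean + H.T @ np.linalg.inv(R) @ y)
    return m, P
```

Updating with one measurement block and then the other reproduces exactly the posterior of a single joint update with the stacked model, which is the sense in which coupled blocks can be processed sequentially without loss of information.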

An opportunity wireless charging system for electric vehicles while they stop and wait at traffic lights is proposed in this paper. To address the serious power fluctuation caused by randomly connecting loads, this study presents a power stabilization strategy based on counting the number of electric vehicles in a designated area, comprising the counting method, a power source voltage adjustment strategy and the choice of counting points. First, the circuit model of a wireless power system with multiple loads is built and the equation for each load is obtained. Second, after the counting method for electric vehicles is stated, the voltage adjustment strategy, based on the number of electric vehicles when the system is in steady state, is set out. Then the counting points are chosen according to the power curves obtained when the voltage adjustment strategy is adopted. Finally, an experimental prototype is implemented to verify the power stabilization strategy. The experimental results show that, with this strategy, the charging power is stabilized with a fluctuation of no more than 5% when loads connect randomly.
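The counting-based adjustment can be illustrated with a deliberately simplified load model in which delivered power scales as the voltage squared and is shared equally by the n charging vehicles, so raising the source voltage as sqrt(n) holds per-vehicle power constant; this toy model stands in for the paper's circuit analysis.

```python
def adjust_source_voltage(v_base, n_vehicles):
    """Toy stabilization rule: scale the source voltage with sqrt(n) so
    that per-vehicle power stays constant as vehicles join or leave."""
    return v_base * n_vehicles ** 0.5

def per_vehicle_power(v, n_vehicles, g_total=0.04):
    """Toy load model: total power g_total * V^2 shared by n vehicles."""
    return g_total * v * v / n_vehicles
```

Under this model the per-vehicle charging power is independent of the vehicle count, mirroring the measured fluctuation bound of a few percent reported above.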

We consider the escape of particles over a fluctuating potential barrier for a system driven only by a multi-state noise. It is shown that the noise can make the particles escape over the fluctuating potential barrier in some circumstances, but not in others. When the noise can make the particle escape over the fluctuating potential barrier, the mean first passage time (MFPT) can display the phenomenon of multi-resonant activation. For this phenomenon, two kinds of resonant activation appear: one is resonant activation of the MFPT as a function of the flipping rates of the fluctuating potential barrier; the other is resonant activation of the MFPT as a function of the transition rates of the multi-state noise. (general)
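A Monte Carlo sketch of such a setup: an overdamped particle on a unit interval, driven only by a three-state telegraph noise, with the barrier slope flipping at random; all rates, amplitudes and boundary conditions are illustrative assumptions, not the model of the paper.

```python
import random

def escape_time(barrier_slope=1.0, noise_amp=2.0, flip_rate=1.0,
                noise_rate=1.0, dt=0.01, max_steps=200000, rng=None):
    """First passage time of an overdamped particle on x in [0, 1].

    The particle is driven only by a three-state (+A, 0, -A) noise, over
    a barrier whose slope flips randomly between +E and -E.  Reflecting
    boundary at 0, absorbing boundary at 1; returns None if no escape.
    """
    rng = rng or random.Random(0)
    x, slope, xi = 0.0, barrier_slope, 0.0
    states = (-noise_amp, 0.0, noise_amp)
    for step in range(max_steps):
        if rng.random() < flip_rate * dt:
            slope = -slope              # barrier fluctuation
        if rng.random() < noise_rate * dt:
            xi = rng.choice(states)     # multi-state noise transition
        x += dt * (xi - slope)          # overdamped dynamics
        if x < 0.0:
            x = 0.0                     # reflecting wall
        if x >= 1.0:
            return step * dt            # escaped over the barrier
    return None
```

Averaging `escape_time` over many trajectories while sweeping `flip_rate` or `noise_rate` is how the two families of resonant-activation minima in the MFPT would be located numerically.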

Entropy production in real processes is directly associated with the dissipation of energy. Both are potential measures of the progress of irreversible processes taking place in metallurgical systems. Many of these processes in multi-phase systems can then be modelled on the basis of the associated energy dissipation. As this quantity can often be estimated from first principles using very simple assumptions, the evolution of an overall measure of system behaviour can be studied by constructing an energy-dissipation-based model of the system. In this work a formulation of this concept, the Energy-Dissipation Model (EDM), for metallurgical multi-phase systems is given. Specific examples are studied to illustrate the concept, and its benefits as well as its range of validity are shown. This concept may be understood as a complement to the usual CFD modelling of complex systems, working at a more abstract level while reproducing essential attributes of complex metallurgical systems. (author)


A multi-channel temperature measurement system for monitoring an automotive battery stack is presented in the paper. The presented system is a complete battery temperature measuring system for hybrid/electric vehicles that incorporates multi-channel temperature measurement with digital temperature sensors communicating over 1-Wire buses, an individual 1-Wire bus for each sensor so that measurements are taken in parallel rather than sequentially, and an FPGA device which collects the data from the sensors and translates it into CAN bus frames. The CAN bus is used for communication with the car's Battery Management System through an additional CAN bus controller which communicates with the FPGA device over an SPI bus. The described system can measure up to 12 temperatures in parallel but can easily be extended in the future in case of additional needs. The structure of the system as well as the particular devices are described in the paper. Selected results of experimental investigations showing proper operation of the system are presented as well.
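The sensor-to-CAN translation step can be sketched as follows; the DS18B20-style 1/16 degC encoding and the four-readings-per-frame layout are assumptions for illustration, not the frame format of the described system.

```python
import struct

def pack_temperatures(temps_c):
    """Pack temperature readings (deg C) into 8-byte CAN data fields.

    Each reading is encoded DS18B20-style as a signed 16-bit integer in
    units of 1/16 degC, four readings per frame, so 12 channels fit in
    three frames.  Assumes the channel count is a multiple of four.
    """
    raw = [int(round(t * 16)) for t in temps_c]
    return [struct.pack(">4h", *raw[i:i + 4]) for i in range(0, len(raw), 4)]

def unpack_temperatures(frames):
    """Inverse of pack_temperatures: recover the readings in deg C."""
    vals = []
    for f in frames:
        vals.extend(v / 16.0 for v in struct.unpack(">4h", f))
    return vals
```

The frame identifiers, endianness and scaling are design choices the FPGA and the Battery Management System would have to agree on.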

A real-time reactor simulator had been developed by reusing equipment of the Musashi reactor, and improving its performance as a research tool became indispensable, namely increasing the sampling rate by introducing arithmetic units based on a multi-Digital Signal Processor (DSP) system (cluster). To realize heterogeneous cluster-type multi-processor computing, a combination of two kinds of Control Processors (CPs), the Cluster Control Processor (CCP) and the System Control Processor (SCP), was proposed, with a Large System Control Processor (LSCP) for hierarchical clustering if needed. The faster computing performance of this system was evaluated by simulation results for simultaneous execution of plural jobs and for pipeline processing between clusters, which showed that the system led to effective use of the existing equipment and enhancement of cost performance. (T. Tanaka)

In the context of resource allocation in cloud radio access networks, recent studies assume either signal-level or scheduling-level coordination. This paper, instead, considers a hybrid level of coordination for the scheduling problem in the downlink of a multi-cloud radio access network, so as to benefit from both scheduling policies. Consider a multi-cloud radio access network, where each cloud is connected to several base-stations (BSs) via high capacity links, and therefore allows joint signal processing between them. Across the multiple clouds, however, only scheduling-level coordination is permitted, as it requires a lower level of backhaul communication. The frame structure of every BS is composed of various time/frequency blocks, called power-zones (PZs), and kept at fixed power level. The paper addresses the problem of maximizing a network-wide utility by associating users to clouds and scheduling them to the PZs, under the practical constraints that each user is scheduled, at most, to a single cloud, but possibly to many BSs within the cloud, and can be served by one or more distinct PZs within the BSs' frame. The paper solves the problem using graph theory techniques by constructing the conflict graph. The scheduling problem is, then, shown to be equivalent to a maximum-weight independent set problem in the constructed graph, in which each vertex symbolizes an association of cloud, user, BS and PZ, with a weight representing the utility of that association. Simulation results suggest that the proposed hybrid scheduling strategy provides appreciable gain as compared to the scheduling-level coordinated networks, with a negligible degradation to signal-level coordination.
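The reduction to an independent-set problem can be illustrated with a greedy heuristic on a small conflict graph; the paper targets the exact maximum-weight independent set, so a greedy pass like this only yields a feasible lower bound.

```python
def greedy_mwis(weights, conflicts):
    """Greedy maximum-weight independent set on a conflict graph.

    weights:   vertex -> utility of that (cloud, user, BS, PZ) association.
    conflicts: set of frozensets {u, v} marking mutually exclusive
               associations (e.g. same user in two clouds).
    Picks vertices by descending weight, skipping any that conflict with
    an already chosen one; returns the chosen set and its total weight.
    """
    chosen = set()
    for v in sorted(weights, key=weights.get, reverse=True):
        if all(frozenset((v, u)) not in conflicts for u in chosen):
            chosen.add(v)
    return chosen, sum(weights[v] for v in chosen)
```

Each chosen vertex corresponds to one scheduled association, and the conflict edges encode the single-cloud-per-user and PZ-exclusivity constraints.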

In this paper, we present a new approach to the analysis and design of distributed control systems for multi-unit plants. The approach is established by treating the effect of recycle dynamics as a gap metric uncertainty, from which a distributed controller can be designed sequentially for each unit to tackle the uncertainty. We then use a single-effect multi-unit evaporation system to illustrate how the proposed method is used to analyze different control strategies and to systematically achieve better closed-loop performance using a distributed model-based controller.

Reduction of the weight of the propulsion system is important in the design of a stratospheric airship. However, it is also important to increase the efficiency of the system because the available energy generated by solar cells on the hull is quite limited. One solution to increase the efficiency of the propulsion system is to use a stern propeller, a propeller mounted on the stern of the hull as shown in Figure 1. Mounted on the stern of the hull, the stern propeller is merged with the boundary layer of ...

Physical activity may reduce the risk of adverse pregnancy outcomes; however, compared to non-pregnant women, a lower proportion of pregnant women meet the physical activity guidelines. Our objectives were to explore overall changes and ethnic differences in objectively recorded moderate-to-vigorous intensity physical activity (MVPA) during pregnancy and postpartum and to investigate the associations with objective and perceived access to recreational areas. We analysed 1,467 person-observations from 709 women in a multi-ethnic population-based cohort, with MVPA data recorded with the SenseWear™ Pro(3) Armband in early pregnancy (mean gestational week (GW) 15), mid-pregnancy (mean GW 28) and postpartum (mean postpartum week 14). MVPA was limited to bouts ≥10 min. Women were nested within 56 neighbourhoods defined by postal code area. We derived neighbourhood-level objective access to recreational areas (good vs limited) by geographic information systems. We collected information about perceived access (high vs low perception) to recreational areas in early pregnancy. We treated ethnicity, objective and perceived access as explanatory variables in separate models based on linear mixed effects regression analyses. Overall, MVPA dropped between early and mid-pregnancy, followed by an increase postpartum. Western women performed more MVPA than women in other ethnic groups across time points, but the differences increased postpartum. Women residing in neighbourhoods with good objective access to recreational areas accumulated on average nine additional MVPA minutes/day (p perceived and objective access to recreational areas remained significantly associated with MVPA. The association between MVPA and access to recreational areas did not differ by time point, ethnic group or socio-economic position. In all ethnic groups, we observed a decline in MVPA between early and mid-pregnancy. However, at both time points during pregnancy, and especially three months

The objective of this evaluation is to provide recommendations to ensure consistency between the technical baseline requirements, the baseline design, and the as-constructed Access Roads. Recommendations for resolving discrepancies between the as-constructed system, the technical baseline requirements, and the baseline design are included in this report. Cost and schedule estimates are provided for all recommended modifications. This report does not address items which do not meet current safety or code requirements; those items are identified to the CMO and immediate action is taken to correct the situation. The report does identify safety and code items for which the A/E is recommending improvements. The recommended improvements will exceed the minimum requirements of applicable codes and safety guidelines. These recommendations are intended to improve and enhance the operation and maintenance of the facility.

Minimum wage policies have been advanced as mechanisms to improve the economic conditions of the working poor. Both positive and negative effects of such policies on health care access have been hypothesized, but associations have yet to be thoroughly tested. To examine whether the presence of minimum wage policies in excess of the federal standard of $5.15 per hour was associated with health care access indicators among low-skilled adults of working age, a cross-sectional analysis of 2004 Behavioral Risk Factor Surveillance System data was conducted. Self-reported health insurance status and experience with cost-related barriers to needed medical care were adjusted in multi-level logistic regression models to control for potential confounding at the state, county, and individual levels. State-level wage policy was not found to be associated with insurance status or unmet medical need in the models, providing early evidence that increased minimum wage rates may neither strengthen nor weaken access to care as previously predicted.

Barriers to expanding access to medicines include weak pharmaceutical sector governance, lack of transparency and accountability, inadequate attention to social services on the political agenda, and financing challenges. Multi-stakeholder initiatives such as the Medicines Transparency Alliance (MeTA) may help overcome these barriers. Between 2008 and 2015, MeTA engaged stakeholders in the pharmaceutical sectors of seven countries (Ghana, Jordan, Kyrgyzstan, Peru, Philippines, Uganda, and Zambia) to promote access goals through greater transparency. We reviewed archival data to document MeTA activities and results related to transparency and accountability in the seven countries where it was implemented. We identified common themes and content areas, noting specific activities used to make information transparent and accessible, how data were used to inform discussions, and the purpose and timing of meetings and advocacy activities to help set priorities and influence governance decisions. The cross-case analysis looked for pathways which might link the MeTA strategies to results such as better policies or program improvements. Countries used evidence gathering, open meetings, and proactive information dissemination to increase transparency. MeTA fostered policy dialogue to bring together the many government, civil society and private company stakeholders concerned with access issues, and provided them with information to understand barriers to access at policy, organizational, and community levels. We found strong evidence that transparency was enhanced. Some evidence suggests that MeTA efforts contributed to new policies and civil society capacity strengthening, although the impact on government accountability is not clear. MeTA appears to have achieved its goal of creating a multi-stakeholder shared policy space in which government, civil society, and private sector players can come together and have a voice in the national pharmaceutical policy making process.

This paper presents a general optimization methodology that merges game theory and multi-state system survivability theory. The defender has multiple alternatives of defense strategy that presumes separation and protection of system elements. The attacker also has multiple alternatives of its attack strategy based on a combination of different possible attack actions against different groups of system elements. The defender minimizes, and the attacker maximizes, the expected damage caused by the attack (taking into account the unreliability of system elements and the multi-state nature of complex series-parallel systems). The problem is defined as a two-period minmax non-cooperative game between the defender who moves first and the attacker who moves second. An exhaustive minmax optimization algorithm is presented based on a double-loop genetic algorithm for determining the solution. A universal generating function technique is applied for evaluating the losses caused by system performance reduction. Illustrative examples with solutions are presented
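With the expected-damage matrix already evaluated (in the paper, via the universal generating function technique), the two-period game itself reduces to a minmax choice for the first mover; this sketch assumes a small enumerable strategy space rather than the double-loop genetic algorithm used for large problems.

```python
def minmax_defense(damage):
    """Pick the defender strategy minimizing worst-case expected damage.

    damage[i][j] is the expected damage when the defender plays strategy
    i (moving first) and the attacker answers with strategy j (moving
    second and maximizing damage).  Returns (best_strategy, damage_bound).
    """
    best_i, best_val = None, float("inf")
    for i, row in enumerate(damage):
        worst = max(row)  # attacker's best response to strategy i
        if worst < best_val:
            best_i, best_val = i, worst
    return best_i, best_val
```

The genetic double loop in the paper plays the same roles as the two loops here: the outer loop searches defender strategies, the inner loop finds the attacker's best response.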

The tropical rainforest of Amazonia is one of the most species-rich ecosystems on earth, with an estimated 16000 tree species. Due to this high diversity, botanical identification of trees in the Amazon is difficult, even to genus, often requiring the assistance of parataxonomists or taxonomic specialists. Advances in informatics tools offer a promising opportunity to develop user-friendly electronic keys to improve Amazonian tree identification. Here, we introduce an original multi-access electronic key for the identification of 389 tree genera occurring in French Guiana terra-firme forests, based on a set of 79 morphological characters related to vegetative, floral and fruit characters. Its purpose is to help Amazonian tree identification and to support the dissemination of botanical knowledge to non-specialists, including forest workers, students and researchers from other scientific disciplines. The electronic key is accessible with the free access software Xper ², and the database is publicly available on figshare: https://figshare.com/s/75d890b7d707e0ffc9bf (doi: 10.6084/m9.figshare.2682550).

Natural and human-made systems abound around us. Our solar system, the human body, the food chain, and ecosystems are some examples of natural systems. Some human-made systems are transportation systems, weapon systems, computer systems, software systems, satellite communications systems, ships, missile defense systems, health care systems, the internet, financial systems, and regional economies. Understanding of natural systems is essential to the survival of the human species, which is intertwined with the survival of other species on earth. Having the knowledge and ability to build human-made systems is critical to the employment of systems that effectively serve the needs of their users. To gain such understanding and to acquire such knowledge and ability, it is necessary that cutting-edge research in systems science, systems engineering, and systems-related fields continue. This open access journal aims to achieve quick and global dissemination of results of such research.

Unlike the Multi-state Weighted k-out-of-n: G System, the Multi-state Weighted k-out-of-n: F System has not been clearly defined and discussed. In this short communication, the basic definition of the Multi-state Weighted k-out-of-n: F System model is proposed, along with its relationship to the Multi-state Weighted k-out-of-n: G System.

In recent years, biocatalysis has started to provide an important green tool in synthetic organic chemistry. Currently, the idea of using multi-enzymatic systems for industrial production of chemical compounds becomes increasingly attractive. Recent examples demonstrate the potential of enzymatic synthesis and fermentation as an alternative to chemical catalysis for the production of pharmaceuticals and fine chemicals. In particular, the use of multiple enzymes is of special interest. However, many challenges remain in the scale-up of a multi-enzymatic system. This review summarizes and discusses the technology options and strategies that are available for the development of multi-enzymatic processes. Some engineering tools, including kinetic models and operating windows, for developing and evaluating such processes are also introduced.

In this work, we present a multi-camera surveillance system based on the use of self-organizing neural networks to represent events on video. The system processes several tasks in parallel using GPUs (graphics processing units). It addresses multiple vision tasks at various levels, such as segmentation, representation or characterization, and analysis and monitoring of movement. These features allow the construction of a robust representation of the environment and interpretation of the behavior of mobile agents in the scene. It is also necessary to integrate the vision module into a global system that operates in a complex environment by receiving images from multiple acquisition devices at video frequency. To offer relevant information to higher-level systems and to support monitoring and decision making in real time, it must satisfy a set of requirements such as time constraints, high availability, robustness, high processing speed and re-configurability. We have built a system able to represent and analyze the motion in video acquired by a multi-camera network and to process multi-source data in parallel on a multi-GPU architecture.

The authors report on the design and performance of a multichannel waveform digitizer system for use with the Multiple Sample Ionization Chamber (MUSIC) Detector at the Bevalac. 128 channels of 20 MHz Flash ADC plus 256-word-deep memory are housed in a single crate. Digital thresholds and hit-pattern logic facilitate zero suppression during readout, which is performed over a standard VME bus.

In this paper three new statistical theorems are demonstrated and applied. These theorems greatly simplify the derivation of the formulae for computing the counting efficiency when the detection system is formed by several photomultipliers associated in coincidence and sum. These theorems are applied to several photomultiplier arrangements in order to show their potential and manner of application.
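As an illustration of the kind of quantity such formulae compute (a simplified sketch assuming independent tubes with equal per-event efficiency, not the theorems themselves): the counting efficiency of a k-fold coincidence arrangement of n photomultipliers is the probability that at least k tubes fire.

```python
from math import comb

def coincidence_efficiency(n: int, k: int, p: float) -> float:
    """P(at least k of n independent photomultipliers detect the event),
    where p is the single-tube detection probability per event."""
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

# Three tubes in double coincidence, each 60% efficient:
eff = coincidence_efficiency(3, 2, 0.6)   # 3*(0.6^2)*0.4 + 0.6^3 = 0.648
```

The theorems in the paper handle the realistic case where the tubes' responses are not identical and the sum signal is also involved.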

We propose a microcomputer system which allows parallel processing for Monte Carlo calculations in lattice gauge theories, simulations of high energy physics experiments and presumably many other fields of current interest. The master-n-slave multiprocessor system is based on the Motorola MC 68000 microprocessor. One attraction of this processor is that it allows up to 16 MByte of random access memory. (orig.)

The ability to connect to the Internet from a wide variety of devices such as smart phones, IoT devices and desktops, at any time and from anywhere, produces a large number of e-commerce transactions, such as purchases of clothes, ticket entrances for performances, or banking operations. The increasing number of these transactions has also created an increase in the number of threats and attacks by third parties seeking access to user data banks. It is important to control the procedure for accessing user data so that the number of threats does not continue to grow. To do so, it is necessary to prevent unauthorized access, theft and fraud in electronic commerce, which is required to ensure the safety of these transactions. Many e-commerce platforms are developed as multi-agent systems because these offer advantages in product control, resource management, task distribution, etc. However, there are a number of threats that can jeopardize the safety of the agents that make up the system, and these issues must be taken into account in the development of multi-agent systems. Existing development methods, however, do not cover the issue of security in depth, so it is necessary to present and classify the potential security flaws of multi-agent systems. Therefore, the present research presents a review of the main vulnerabilities that occur in multi-agent systems responsible for managing e-commerce applications, as well as the proposed solutions to the major security problems on these platforms. The main conclusion of this research is the need to optimize security measures and enhance the different security solutions applied in e-commerce applications in order to prevent identity theft and unauthorized access to private data, and to strengthen access control. It is therefore essential to continue developing the security methods employed in applications such as e-commerce, as different types of attacks and threats continue to evolve.

We present a multi-level formation model for complex software systems. Previous works abstract software systems into software networks for further study, but usually investigate those networks at the class level. In contrast to these works, our treatment of software systems as multi-level networks is more realistic. In particular, the software networks are organized at three levels of granularity, which represents the modularity and hierarchy in the formation process of real-world software systems. More importantly, simulations based on this model have generated more realistic structural properties of software networks, such as power laws, clustering and modularization. On the basis of this model, how the structure of software systems affects software design principles is then explored, which could be helpful for understanding software evolution and software engineering practices.

This paper aims to present a multiagent system for network management. The models developed for the proposed system define how intelligent agents interact to achieve the objectives and requirements of the multiagent organization. These agents have the property of being adaptive, acquiring knowledge and skills to make decisions according to the actual state of the network, which is represented in the management information base (MIB) of SNMP devices. The ideal state of the network is defined by policies entered by the end user, which contain the values that performance variables should have, together with other parameters such as the frequency with which these variables should be monitored. An agent-based architecture increases integration, adaptability, cooperation, autonomy and efficiency of operation in heterogeneous network-supervision environments.

The measuring system can be used for recording gamma spectra and/or experimental beta-dispersion. Several environmental samples can be examined simultaneously, and the instrument can be used in the laboratory or in the field. Low cost multichannel analyzers using NaI(Tl) or plastic scintillators are interfaced to an IBM PC/AT, which controls the measurement, data processing, and data transmission and archiving. (M.D.)

This chapter presents the history of the application of logic in a quite popular paradigm in contemporary computer science and artificial intelligence, viz. the area of intelligent agents and multi-agent systems. In particular we discuss the logics that have been used to specify single agents, the

In this paper we determine a simple inventory control rule for multi-echelon distribution systems under periodic review without lot sizing. The primary focus is the two-echelon model with a stockless central depot, but several extensions (> 2 echelons, central stock allowed) are discussed as well.
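A minimal sketch of a periodic-review order-up-to rule of this kind (the demand distribution, zero lead times and base-stock levels are illustrative assumptions, not the paper's model):

```python
import random

def expected_shortage(base_stock, periods=1000, mean_demand=10, seed=1):
    """Simulate a periodic-review order-up-to rule for several retailers fed
    by a stockless central depot (orders pass straight through each period)."""
    random.seed(seed)
    shortages = 0
    for _ in range(periods):
        for s in base_stock:
            demand = random.randint(0, 2 * mean_demand)
            # End-of-period review, no lot sizing: order exactly what was
            # consumed, so every period starts at the order-up-to level s.
            shortages += max(0, demand - s)
    return shortages / periods

short_lo = expected_shortage((10, 10))   # low order-up-to levels: shortages occur
short_hi = expected_shortage((20, 20))   # levels cover the maximum demand
```

Raising the base-stock levels trades inventory holding against shortage; the paper's contribution is choosing such control parameters across echelons, including allocation at the depot.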

We consider multi-class systems of interacting nonlinear Hawkes processes modeling several large families of neurons and study their mean field limits. As the total number of neurons goes to infinity we prove that the evolution within each class can be described by a nonlinear limit differential...

A multi-function low solid angle system for direct and indirect measurement of the radioactivity or emission rate of most α-, β- and EC-emitting nuclides is described in this paper. Measurement results for 241Am and 90Sr-90Y are given.

Purpose: The aim of this paper is to test the moderating role of emotion regulation in the transformation of both task and process conflict into relationship conflict. Design/methodology/approach: A field study of multi-team systems, in which 94 respondents are engaged in interpersonal and

In many multi-echelon inventory systems, the lead times are random variables. A common and reasonable assumption in most models is that replenishment orders do not cross, which implies that successive lead times are correlated. However, the process that generates such lead times is usually not well

In IEEE 802.11 networks, access point (AP) selection based on the strongest signal strength often results in extremely unfair bandwidth allocation among mobile users (MUs). In this paper, we propose a distributed AP selection algorithm to achieve a fair bandwidth allocation for MUs. The proposed algorithm gradually balances the AP loads based on max-min fairness for the available multiple bit rate choices in a distributed manner. We analyze the stability and overhead of the proposed algorithm, and show the improvement in fairness via computer simulation.
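The idea can be sketched as follows (the bit rates, topology and update rule are illustrative assumptions, not the algorithm in the paper; the achievable rate is used as a proxy for signal strength): each mobile user repeatedly re-associates with the AP that currently offers it the largest equal-share bandwidth.

```python
# rates[u][a] = bit rate user u would get alone on AP a (0 = out of range)
rates = [
    [54, 11, 0],
    [54, 54, 0],
    [0, 11, 54],
    [0, 54, 54],
]

# Baseline: each user picks the AP with the strongest signal (highest rate).
assoc = [max(range(3), key=lambda a: rates[u][a]) for u in range(4)]

def share(u, a, assoc):
    """Equal-airtime bandwidth share user u would receive on AP a."""
    n = sum(1 for v, b in enumerate(assoc) if b == a and v != u) + 1
    return rates[u][a] / n if rates[u][a] else 0.0

# Users take turns moving to their best AP until no one wants to move.
changed = True
while changed:
    changed = False
    for u in range(4):
        best = max(range(3), key=lambda a: share(u, a, assoc))
        if share(u, best, assoc) > share(u, assoc[u], assoc):
            assoc[u] = best
            changed = True
```

Each selfish move raises that user's own share, and the process settles in an equilibrium that balances AP loads; the paper's algorithm additionally handles multiple bit rates and proves stability.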

A radio frequency identification system having a radio frequency transceiver for generating a continuous wave RF interrogation signal that impinges upon an RF identification tag. An oscillation circuit in the RF identification tag modulates the interrogation signal with a subcarrier of a predetermined frequency and reflects the modulated signal back to the transmitting interrogator. The interrogator recovers and analyzes the subcarrier signal and determines its frequency. The interrogator generates an output indicative of the subcarrier frequency, thereby identifying the responding RFID tag as one of a "class" of RFID tags configured to respond with a subcarrier signal of a predetermined frequency.

The NuDat program provides a user with access to nuclear properties and to some nuclear reaction data. The program operates on DEC VMS operating systems and on PCs with Microsoft operating systems. The program has four user interfaces, all having the same content and functionality: Web, Video and Sequential for VMS, plus a PC interface identical to the VMS Video interface. Forms are used to specify the type of data the user desires, the retrieval parameters, the output format, and the sort order of the data. The program and its associated database are used in basic research, particularly for the systematic study of nuclear properties. It is also a useful tool for applied research, to identify radiations from radionuclides contained in environmental samples or from those produced by neutron or charged particle activation. The NuDat database is derived from several databases maintained by the National Nuclear Data Center: the Adopted Levels and Gammas data sets from ENSDF, the Nuclear Wallet Cards, Decay Radiations derived from ENSDF decay data sets processed by RADLIST, and Thermal Neutron Cross Sections.

This patent describes an explosive detection system. It comprises a source of neutrons; a detector array comprising a plurality of gamma ray detectors, each of the gamma ray detectors providing a detection signal in the event a gamma ray is captured by the detector, and at least one neutron detector, the neutron detector providing a neutron detection signal in the event a neutron is captured by the neutron detector; means for irradiating an object being examined with neutrons from the neutron source and for positioning the detector array relative to the object so that gamma rays emitted from the elements within the object as a result of the neutron irradiation are detected by the gamma ray detectors of the detector array; and parallel distributed processing means responsive to the detection signals of the detector array for discriminating between objects carrying explosives and objects not carrying explosives, the parallel distributed processing means including an artificial neural system (ANS), the ANS having a parallel network of processors, each processor of the parallel network of processors including means for receiving at least one input signal, and means for generating an output signal as a function of the at least one input signal.

BMC Systems Biology is the first open access journal spanning the growing field of systems biology from molecules up to ecosystems. The journal has launched as more and more institutes are founded that are similarly dedicated to this new approach. BMC Systems Biology builds on the ongoing success of the BMC series, providing a venue for all sound research in the systems-level analysis of biology.

A one-weight authentication code for multi-user quantum key distribution (QKD) is proposed. The code is developed for an Optical Code Division Multiplexing (OCDMA) based QKD network. A unique address assigned to each individual user, coupled with the degrading probability of predicting the source of a qubit transmitted in the channel, offers an excellent security mechanism against any form of channel attack on an OCDMA-based QKD network. Flexibility in design, as well as ease of modifying the number of users, are equally exceptional qualities of the code in contrast to the Optical Orthogonal Code (OOC) earlier implemented for the same purpose. The code was successfully applied to eight simultaneous users at an effective key rate of 32 bps over a 27 km transmission distance.

A review of existing information pertaining to spacecraft power processing systems and equipment was accomplished with a view towards applicability to the modularization of multi-kilowatt power processors. Power requirements for future spacecraft were determined from the NASA mission model-shuttle systems payload data study which provided the limits for modular power equipment capabilities. Three power processing systems were compared to evaluation criteria to select the system best suited for modularity. The shunt regulated direct energy transfer system was selected by this analysis for a conceptual design effort which produced equipment specifications, schematics, envelope drawings, and power module configurations

A communication device includes a data source that generates data for transmission over a bus, and a data encoder that receives and encodes outgoing data. An encoder system receives outgoing data from a data source and stores the outgoing data in a first queue. An encoder encodes outgoing data with a header type that is based upon a header type indication from a controller and stores the encoded data that may be a packet or a data word with at least one layered header in a second queue for transmission. The device is configured to receive at a payload extractor, a packet protocol change command from the controller and to remove the encoded data and to re-encode the data to create a re-encoded data packet and placing the re-encoded data packet in the second queue for transmission.
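The two-queue scheme described in the patent can be sketched as follows (the header formats and protocol-change behaviour are invented for illustration, not taken from the patent):

```python
from collections import deque

class Encoder:
    def __init__(self):
        self.outgoing = deque()   # first queue: raw data from the source
        self.encoded = deque()    # second queue: ready for transmission
        self.header_type = b"H1"  # header type selected by the controller

    def submit(self, payload: bytes):
        self.outgoing.append(payload)

    def encode_pending(self):
        # Prepend the controller-selected header to each queued payload.
        while self.outgoing:
            self.encoded.append(self.header_type + self.outgoing.popleft())

    def change_protocol(self, new_header: bytes):
        """Payload extractor: strip old headers and re-encode queued packets."""
        old = self.header_type
        self.header_type = new_header
        self.encoded = deque(new_header + pkt[len(old):] for pkt in self.encoded)

enc = Encoder()
enc.submit(b"data")
enc.encode_pending()          # encoded queue now holds b"H1data"
enc.change_protocol(b"H2")    # queued packet re-encoded as b"H2data"
```

The point of the second queue is that already-encoded packets can still be re-headered in place when the controller changes the packet protocol before transmission.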

The recent spectacular progress in modern nano-dimension semiconductor technology has enabled implementation of a complete complex multi-processor system on a single chip (MPSoC) and global networking and mobile wireless communication, and has facilitated fast progress in these areas. New important ... accessible or distant) objects, installations, machines or devices, or even implanted in the human or animal body, can serve as examples. However, many modern embedded applications impose very stringent functional and parametric demands. Moreover, the spectacular advances in microelectronics introduced ...

Multi-context systems (MCSs) are an important framework for heterogeneous combinations of systems within the Semantic Web. In this paper, we propose generic constructions to achieve specific forms of interaction in a principled way, and systematize some useful techniques to work with ontologies within an MCS. All these mechanisms are presented in the form of general-purpose design patterns. Their study also suggests new ways in which this framework can be further extended.

The transition from film camera to video surveillance systems, in particular the implementation of high capacity multi-camera video systems, results in a large increase in the amount of recorded scenes. Consequently, there is a substantial increase in the manpower requirements for review. Moreover, modern microprocessor controlled equipment facilitates the collection of additional data associated with each scene. Both the scene and the annotated information have to be evaluated by the inspector. The design of video surveillance systems for safeguards necessarily has to account for both appropriate recording and reviewing techniques. An aspect of principal importance is that the video information is stored on tape. Under the German Support Programme to the Agency a technical concept has been developed which aims at optimizing the capabilities of a multi-camera optical surveillance (MOS) system including the reviewing technique. This concept is presented in the following paper including a discussion of reviewing and reliability

An FMG (flexible multi-generation system) consists of integrated and flexibly operated facilities that provide multiple links between the various layers of the energy system. FMGs may facilitate integration and balancing of fluctuating renewable energy sources in the energy system in a cost... is based on consideration of the following points: selection, location and dimensioning of processes; systematic heat and mass integration; flexible operation optimization with respect to both short-term market fluctuations and long-term energy system development; global sensitivity and uncertainty analysis; biomass supply chains; variable part-load performance; and multi-objective optimization considering economic and environmental performance. Tested in a case study, the methodology is proved effective in screening the solution space for efficient FMG designs, in assessing the importance ...

This paper presents a study of Quality of Service in modern wireless sensor networks. Such networks are characterized by small amounts of data transmitted at fixed intervals. Very often these data must be transmitted in real time, so data transmission delays should be well known. This article presents a multi-node network simulated in the OPNET Modeler packet simulator. Nowadays, quality of service is very important, especially in multi-node systems such as home automation or measurement systems.

The author introduces the technological process of a multi-purpose radwaste incineration system. It is composed of three parts: pretreatment, incineration, and off-gas cleanup. The wastes that may be treated include combustible solid waste, spent resins and oils. The technological routes of the system are pyrolysis incineration for solid waste, spray incineration for spent oils, and a combination of dry dust removal and wet adsorption for off-gas cleanup.

The dynamic channel allocation (DCA) scheme in multi-cell systems causes a serious inter-cell interference (ICI) problem for some existing calls when channels for new calls are allocated. Such a problem can be addressed by an advanced centralized DCA design that is able to minimize ICI. Thus, in this paper, a centralized DCA is developed for the downlink of multi-cell orthogonal frequency division multiple access (OFDMA) systems with full spectral reuse. In practice, however, the search space of channel assignment for a centralized DCA scheme in multi-cell systems grows exponentially with the number of required calls, channels, and cells; the problem is NP-hard, and finding an optimum channel allocation is currently impractical. In this paper, we propose an ant colony optimization (ACO) based DCA scheme using a low-complexity ACO algorithm, a heuristic, to solve the aforementioned problem. Simulation results demonstrate significant performance improvements compared to the existing schemes in terms of grade of service (GoS) and the forced termination probability of existing calls, without degrading the average system throughput.
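A toy ACO sketch for channel assignment conveys the idea (the cell layout, interference model and ACO parameters below are illustrative assumptions, not the scheme in the paper): ants build channel assignments guided by pheromone, and the best-so-far assignment is reinforced.

```python
import random

random.seed(0)
N_CALLS, N_CHANNELS = 6, 3
# neighbours[i] lists calls in adjacent cells that would interfere with call i
neighbours = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4], 4: [3, 5], 5: [4]}

def interference(assign):
    """Number of neighbouring call pairs sharing a channel (to be minimized)."""
    return sum(1 for i in range(N_CALLS) for j in neighbours[i]
               if j > i and assign[i] == assign[j])

pheromone = [[1.0] * N_CHANNELS for _ in range(N_CALLS)]
best, best_cost = None, float("inf")

for _ in range(50):                      # iterations
    for _ in range(10):                  # ants per iteration
        # Each ant picks a channel per call, weighted by pheromone.
        assign = [random.choices(range(N_CHANNELS), pheromone[i])[0]
                  for i in range(N_CALLS)]
        cost = interference(assign)
        if cost < best_cost:
            best, best_cost = assign, cost
    # Evaporate pheromone, then reinforce the best-so-far assignment.
    for i in range(N_CALLS):
        for c in range(N_CHANNELS):
            pheromone[i][c] *= 0.9
        pheromone[i][best[i]] += 1.0
# best now holds the lowest-interference assignment found
```

With three channels this illustrative conflict graph is 3-colourable, so the search typically drives the interference count to zero; the paper's scheme works on the much larger multi-cell OFDMA assignment space where exhaustive search is infeasible.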

Although the first Multi-pixel Hybrid Photodiode (M-HPD) was developed in the early 1990s by Delft Electronic Products, the main obstacle to its application has been the lack of a compact read-out system. A fast, parallel read-out system was constructed for use with the earlier 25-pixel tube with high-energy physics applications in mind. The excellent properties of the recently developed M-HPDs will be easier to exploit following the development of the new hybrid read-out circuits described in this paper. This system enables all of the required read-out functions to be accommodated on a single board into which the M-HPD is plugged. The design and performance of a versatile system are described, in which a trigger signal, derived from the common side of the silicon anode in the M-HPD, is used to trigger the read-out of the 60 anode pixels. The multi-channel amplifier section is based on a new commercial VLSI chip, whilst the read-out sequencer uses a chip of our own design. The common anode signal is processed by a fast amplifier and discriminator to provide a trigger signal when a single event is detected. In the prototype version, the serial analogue output data stream is processed using a PC-mounted high-speed ADC. Results obtained using the new read-out system in a compact gamma camera and with a small muon tracking chamber demonstrate the low-noise performance of the system. Applications of this read-out system to other position-sensitive or multi-anode photomultiplier tube systems are also described.

Model coupling is increasingly used as a method of combining the best of two models when representing socio-environmental systems, though barriers to successful model adoption by stakeholders are particularly present with the use of coupled models, due to their high complexity and typically low implementation flexibility. Coupled system dynamics - physically-based modelling is a promising method to improve stakeholder participation in environmental modelling while retaining a high level of complexity for physical process representation, as the system dynamics components are readily understandable and can be built by stakeholders themselves. However, this method is not without limitations in practice, including 1) inflexible and complicated coupling methods, 2) difficult model maintenance after the end of the project, and 3) a wide variety of end-user cultures and languages. We have developed the open-source Python-language software tool Tinamit to overcome some of these limitations to the adoption of stakeholder-based coupled system dynamics - physically-based modelling. The software is unique in 1) its inclusion of both a graphical user interface (GUI) and a library of available commands (API) that allow users with little or no coding abilities to rapidly, effectively, and flexibly couple models, 2) its multilingual support for the GUI, allowing users to couple models in their preferred language (and to add new languages as necessary for their community work), and 3) its modular structure allowing for very easy model coupling and modification without the direct use of code, and to which programming-savvy users can easily add support for new types of physically-based models. We discuss how the use of Tinamit for model coupling can greatly increase the accessibility of coupled models to stakeholders, using an example of a stakeholder-built system dynamics model of soil salinity issues in Pakistan coupled with the physically-based soil salinity and water flow model

This paper investigates low-complexity joint interference avoidance and desired link improvement for single channel allocation in multiuser multi-antenna access points (APs) for open-access small cells. It is considered that an active user is equipped with an antenna array that can be used to suppress interference sources but not to provide spatial diversity. On the other hand, the operation of APs can be coordinated to meet design requirements, and each AP can unconditionally utilize its assigned physical channels. Moreover, each AP is equipped with uncorrelated antennas that can be reused simultaneously to serve many active users. The analysis provides new approaches to exploit physical channels, transmit antennas, and APs to mitigate interference, while providing the best possible link gain to an active user through the most suitable interference-free channel. The event of concurrent service requests placed by active users on a specific interference-free channel is discussed for either interference avoidance through identifying unshared channels or desired link improvement via multiuser scheduling. The applicability of the approaches to balancing downlink loads is explained, and practical scenarios due to imperfect identification of interference-free channels and/or the scheduled user are thoroughly investigated. The developed results are applicable for any statistical and geometric models of the channel allocated to an active user, as well as the channel conditions of interfering users, and they can be used to study various performance measures. Numerical and simulation results are presented to explain some outcomes of this work.

The aim is to substantiate the qualification of access control systems (ACS) as information systems for personal data (ISPDn). Applications: physical protection systems for facilities.

Purpose: This paper examines the behaviour of shared and dedicated Kanban allocation policies of the Hybrid Kanban-CONWIP and Basestock-Kanban-CONWIP control strategies in multi-product systems, with consideration of the robustness of optimal solutions to environmental and system variabilities. Design/methodology/approach: Discrete event simulation and an evolutionary multi-objective optimisation approach were utilised to develop Pareto frontiers (sets of non-dominated optimal solutions) and to select appropriate decision sets for the control parameters in the shared Kanban allocation policy (S-KAP) and the dedicated Kanban allocation policy (D-KAP). Simulation experiments were carried out via the ExtendSim simulation application software. The PCS+KAP performances were compared via all-pairwise comparison and Nelson's screening and selection procedure to identify the superior PCS+KAP under negligible environmental and system variability. To determine the superior PCS+KAP under system and environmental variability, the optimal solutions were tested for robustness using the Latin hypercube sampling technique and a stochastic dominance test. Findings: The outcome of this study shows that, under uncontrollable environmental variability, the dedicated Kanban allocation policy outperformed the shared Kanban allocation policy both in a serial manufacturing system with negligible setup times and in a complex assembly line with setup times. Moreover, BK-CONWIP is shown to be superior to HK-CONWIP. Research limitations/implications: Future research should be conducted to verify the flexibility of BK-CONWIP with respect to product-mix and product demand volume variations in a complex multi-product system. Practical implications: The outcomes of this work are applicable to multi-product manufacturing industries with significant setup times and to systems with negligible setup times. The multi-objective optimisation provides decision support for selection of control parameters such that…

This paper is based on a doctoral dissertation oriented toward improving an environmental management system using multi-software. The dissertation builds on key results of a master's thesis on quantifying environmental aspects and impacts in organizations with an artificial neural network. The paper recommends improving the environmental management system in an organization using the Balanced Scorecard (BSC) model and the MCDM method AHP (Analytic Hierarchy Process) based on group decision-making. The BSC would be extended with elements of the environmental management system and applied in the organization's strategic management, while AHP would be used to check the results obtained by quantifying environmental aspects and impacts.

Recent Advances in System Reliability discusses developments in modern reliability theory such as signatures, multi-state systems and statistical inference. It describes the latest achievements in these fields, and covers the application of these achievements to reliability engineering practice. The chapters cover a wide range of new theoretical subjects and have been written by leading experts in reliability theory and its applications. The topics include: concepts and different definitions of signatures (D-spectra), their properties and applications to reliability of coherent systems and network-type structures; Lz-transform of Markov stochastic process and its application to multi-state system reliability analysis; methods for cost-reliability and cost-availability analysis of multi-state systems; optimal replacement and protection strategy; and statistical inference. Recent Advances in System Reliability presents many examples to illustrate the theoretical results. Real world multi-state systems...
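The signature (D-spectra) idea mentioned above admits a compact numerical illustration. For a 2-out-of-3 system with i.i.d. components, the system always fails exactly at the second component failure, so its signature is s = (0, 1, 0), and the signature formula reproduces the reliability obtained by direct state enumeration. A minimal sketch under these assumptions (our illustrative example, not taken from the book):

```python
import itertools
import math

def direct_reliability(p):
    """2-out-of-3 system: works iff at least 2 of 3 i.i.d. components work.
    p = probability that a single component is still working at time t."""
    total = 0.0
    for states in itertools.product([0, 1], repeat=3):
        if sum(states) >= 2:  # system functions in this component state
            prob = 1.0
            for s in states:
                prob *= p if s == 1 else (1 - p)
            total += prob
    return total

def signature_reliability(p, signature=(0.0, 1.0, 0.0)):
    """Signature formula: R(t) = sum_i s_i * P(T_(i) > t), where T_(i) is the
    i-th order statistic of the component lifetimes.  For i.i.d. components,
    P(T_(i) > t) = P(fewer than i failures by time t)."""
    n = len(signature)
    q = 1 - p  # failure probability of one component by time t
    r = 0.0
    for i, s in enumerate(signature, start=1):
        surv = sum(math.comb(n, k) * q**k * p**(n - k) for k in range(i))
        r += s * surv
    return r

p = math.exp(-0.4)  # exponential component lifetimes, lambda * t = 0.4
print(direct_reliability(p), signature_reliability(p))  # both approx 0.7456
```

The two computations agree term by term, which is the point of the signature representation: the structural information (the signature) separates cleanly from the lifetime distribution.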

The use of oblique imagery has become standard for many civil and mapping applications, thanks to the development of airborne digital multi-camera systems offered by many companies (Blomoblique, IGI, Leica, Midas, Pictometry, Vexcel/Microsoft, VisionMap, etc.). The indisputable virtue of oblique photography lies in its simplicity of interpretation and understanding, which allows even inexperienced users to employ oblique images in very different applications, such as building detection and reconstruction, building structural damage classification, road land updating and administration services, etc. The paper gives an overview of current commercial oblique systems and presents a workflow for the automated orientation and dense matching of large image blocks. Perspectives, potentialities, pitfalls and suggestions for achieving satisfactory results are given. Tests performed on two datasets acquired with two multi-camera systems over urban areas are also reported.

Aiming at the high cost, large size, and inflexibility of traditional analog intermediate-frequency receivers in aerospace telemetry, tracking, and command (TTC) systems, we propose a new intermediate-frequency (IF) digital receiver based on a multi-FPGA system. Digital beam forming (DBF) is realized by the coordinate rotation digital computer (CORDIC) algorithm. An experimental prototype has been developed on a compact multi-FPGA system with three FPGAs to receive 16 channels of IF digital signals. Our experimental results show that the proposed scheme greatly simplifies the design of IF digital receivers and offers a valuable reference for real-time, low-power, high-density, and small-size receiver design.
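CORDIC evaluates rotations using only add/subtract and binary-shift operations, which is what makes it attractive on FPGAs. A minimal floating-point, rotation-mode sketch of the algorithm (the paper's fixed-point FPGA implementation will differ in word lengths and pipelining):

```python
import math

def cordic_sin_cos(theta, iterations=32):
    """Rotation-mode CORDIC: rotate the vector (1, 0) by theta through a
    sequence of micro-rotations by atan(2^-i), then undo the accumulated
    gain.  theta must lie in [-pi/2, pi/2]."""
    # Precomputed micro-rotation angles and the constant CORDIC gain.
    angles = [math.atan(2.0**-i) for i in range(iterations)]
    gain = 1.0
    for i in range(iterations):
        gain *= math.sqrt(1 + 2.0**(-2 * i))
    x, y, z = 1.0, 0.0, theta
    for i in range(iterations):
        d = 1.0 if z >= 0 else -1.0  # steer the residual angle toward zero
        x, y = x - d * y * 2.0**-i, y + d * x * 2.0**-i
        z -= d * angles[i]
    return y / gain, x / gain  # (sin(theta), cos(theta))

s, c = cordic_sin_cos(math.pi / 6)
print(s, c)  # approximately 0.5 and 0.866
```

In hardware the multiplications by 2^-i become wire shifts and the angle table is stored in a small ROM, so each micro-rotation costs only adders.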

The system and method for multi-wavelength optical signal detection enables the detection of optical signal levels significantly below those processed at the discrete circuit level by the use of mixed-signal processing methods implemented with integrated circuit technologies. The present invention is configured to detect and process small signals, which enables the reduction of the optical power required to stimulate detection networks, and lowers the required laser power to make specific measurements. The present invention provides an adaptation of active pixel networks combined with mixed-signal processing methods to provide an integer representation of the received signal as an output. The present invention also provides multi-wavelength laser detection circuits for use in various systems, such as a differential absorption light detection and ranging system.

…pose a big challenge to current verification methodologies, due to the explosion of state-space size as soon as large, if not medium-sized, multi-station systems have to be controlled. For these reasons, verification techniques have been investigated that exploit locality principles related to the topological layout… of the controlled system to split the state space in different ways. In particular, compositional approaches divide the controlled track network into regions that can be verified separately, once proper assumptions are made about the way the pieces are glued together. Building on a successful… method to verify rather large networks, we propose a compositional approach that is particularly suitable for multi-station interlocking systems which control a whole line composed of stations linked by mainline tracks. Indeed, it turns out that for such networks, and for the adopted…

After recalling the formal convergence of the semi-classical multi-species Boltzmann equations toward the multi-species Euler system (i.e. a mixture of gases having the same velocity), we generalize to this system the closure relations proposed by B. Despres and by F. Lagoutiere for the multi-component Euler system (i.e. a mixture of non-miscible fluids having the same velocity). Then, we extend the energy relaxation schemes proposed by F. Coquel and by B. Perthame for the numerical resolution of the mono-species Euler system to the multi-species isothermal Euler system and to the multi-component isobaric-isothermal Euler system. This allows us to obtain a class of entropic schemes under a CFL criterion. In the multi-component case, this class of entropic schemes may offer a way to treat interface problems, and hence the numerical mixture area, by using a Lagrange + projection scheme. Nevertheless, a good projection stage remains to be found in the multi-component case. Finally, in the last chapter, we discuss, through the study of a dynamical system, a system proposed by R. Abgrall and by R. Saurel for the numerical resolution of the multi-component Euler system.

This paper describes an energy system that is designed to meet the demands of rural populations that currently have no access to grid-connected electricity. Besides electricity, it is well recognized that rural populations need at least a centralized refrigeration system for storage of medicines and other emergency supplies, as well as safe drinking water. Here we propose a district system that will employ a multi-generation concentrated solar power (CSP) system to generate electricity and supply the heat needed for both absorption refrigeration and membrane distillation (MD) water purification. The electricity will be used to generate hydrogen through highly efficient water electrolysis, and individual households can use the hydrogen for generating electricity, via affordable proton exchange membrane (PEM) fuel cells, and as a fuel for cooking. The multi-generation system is being developed such that its components will be easy to manufacture and maintain. As a result, these components will be less efficient than their typical counterparts, but their low cost-to-efficiency ratio will allow us to meet our installation cost goal of $1/Watt for the entire system. The objective of this paper is to introduce the system concept and discuss the system components that are currently under development. (auth)

When an important process of a molecular system occurs via a combination of two or more rare events that occur almost independently of one another, computational sampling of the process is difficult. Here, to sample such a process effectively, we developed a new method, named the "multi-dimensional virtual-system coupled Monte Carlo (multi-dimensional VcMC)" method, in which the system interacts with a virtual system expressed by two or more virtual coordinates. Each virtual coordinate controls sampling along a reaction coordinate. By assigning the reaction coordinates to the corresponding rare events, sampling of the important process can be enhanced. An advantage of multi-dimensional VcMC is its simplicity: the conformation moves widely in the multi-dimensional reaction-coordinate space without knowledge of the canonical distribution functions of the system. To examine the effectiveness of the algorithm, we introduced a toy model in which two molecules (a receptor and its ligand) bind to and unbind from each other. The receptor has a deep binding pocket, which the ligand enters for binding. Furthermore, a gate is set at the entrance of the pocket, and the gate is usually closed. Thus, molecular binding takes place via two events: ligand approach to the pocket and gate opening. In two-dimensional (2D) VcMC, the two molecules exhibited repeated binding and unbinding, and an equilibrated distribution was obtained as expected. A conventional canonical simulation, 200 times longer than the 2D-VcMC run, failed to sample binding/unbinding effectively. The current method is applicable to various biological systems.

Multiple biological structures have demonstrated fog collection abilities, such as beetle backs with bumps and spider silks with periodic spindle-knots and joints. Many Cactaceae species live in arid environments and are extremely drought-tolerant. Here we report that one of the survival systems of the cactus Opuntia microdasys lies in its efficient fog collection system. This unique system is composed of well-distributed clusters of conical spines and trichomes on the cactus stem; each spine contains three integrated parts that have different roles in the fog collection process according to their surface structural features. The gradient of the Laplace pressure, the gradient of the surface-free energy and multi-function integration endow the cactus with an efficient fog collection system. Investigations of the structure–function relationship in this system may help us to design novel materials and devices to collect water from fog with high efficiencies. PMID:23212376

Switched Multi-megabit Data Service (SMDS) is a proposed high-speed packet-switched service which will support broadband applications such as Local Area Network (LAN) interconnection across a metropolitan area and beyond. The service is designed to take advantage of evolving Metropolitan Area Network (MAN) standards and technology, which will provide customers with 45-Mbps and 1.5-Mbps access to high-speed public data communications networks. This paper briefly discusses SMDS and reviews its architecture, including the Subscriber Network Interface (SNI) and the SMDS Interface Protocol (SIP). It reviews the fundamental features of SMDS, such as address screening, the addressing scheme, and access classes, and then describes the SMDS prototype system developed in-house by NYNEX Science & Technology.

The NUDAT program with its associated database provides access to nuclear properties and some nuclear reaction data. The program has interfaces for WWW, Telnet online access, and PC. The database contains the following information: level and gamma-ray adopted properties from ENSDF; nuclear ground and metastable state properties; radioactive decay radiations from ENSDF; thermal neutron cross sections and resonance integrals as published in 'Neutron Cross Sections', Vol. 1. The online version is accessible through the IAEA's WWW site or through the Telnet online service NDIS, the PC version is available by FTP or on CD-ROM. (author)

Recent studies on cloud-radio access networks (CRANs) assume the availability of a single processor (cloud) capable of managing the entire network performance; inter-cloud interference is treated as background noise. This paper considers the more practical scenario of the downlink of a CRAN formed by multiple clouds, where each cloud is connected to a cluster of multiple-antenna base stations (BSs) via high-capacity wireline backhaul links. The network is composed of several disjoint BS clusters, each serving a pre-known set of single-antenna users. To account for both inter-cloud and intra-cloud interference, the paper considers the problem of minimizing the total network power consumption subject to quality-of-service constraints, by jointly determining the set of active BSs connected to each cloud and the beamforming vectors of every user across the network. The paper solves the problem using Lagrangian duality theory through a dual decomposition approach, which decouples the problem into multiple independent subproblems whose solutions depend on the dual optimization problem. The solution then proceeds by iteratively updating the dual variables and the active set of BSs at each cloud. The proposed approach leads to a distributed implementation across the multiple clouds through a reasonable exchange of information between adjacent clouds. The paper further proposes a centralized solution to the problem. Simulation results suggest that the proposed algorithms significantly outperform the conventional per-cloud update solution, especially at high signal-to-interference-plus-noise ratio (SINR) targets.

Introduction: Many low-income parents/caregivers do not understand the importance of cavity-free primary (baby) teeth and the chronic nature of dental caries (tooth decay). As a consequence, utilization of preventive and treatment dental care is low even when children are screened in schools and referred for care. This study aims to test a referral letter and Dental Information Guide (DIG) designed using the Common-Sense Model of Self-Regulation (CSM) framework to improve caregivers' illness perception of dental caries and increase utilization of care by children with restorative dental needs. Methods: A multi-site randomized controlled trial with caregivers of kindergarten to 4th-grade children in urban Ohio and rural Washington State will compare five arms: (1) CSM referral letter alone; (2) CSM referral letter + DIG; (3) reduced CSM referral letter alone; (4) reduced CSM referral letter + DIG; (5) standard (control) referral. At baseline, children will be screened at school to determine restorative dental needs. If a child is in need of treatment, the caregiver will be randomized to a study arm and an intervention packet will be sent home. The primary outcome will be dental care based on a change in oral health status by clinical examination 7 months post-screening (ICDAS sealant codes 1 and 2; restoration codes 3–8; extraction). Enrollment commenced in summer 2015, with results in summer 2016. Conclusion: This study uses the CSM framework to develop and test behavioral interventions to increase dental utilization among low-income caregivers. If effective, this simple intervention has broad applicability in clinical and community-based settings. PMID:26500170

The Kepler mission and its successor K2 have brought forth a cascade of transiting planets. Many of these planetary systems exhibit multiple transiting members, but a large fraction possesses only a single transiting planet. This high abundance of singles, dubbed the "Kepler Dichotomy," has been hypothesized to arise from significant mutual inclinations between orbits in multi-planet systems. Alternatively, the single-transiting population may truly possess no other planets, but the origin of the overabundance of single systems remains unresolved. In this work, we propose that planetary systems typically form with a coplanar, multi-planet architecture, but that quadrupolar gravitational perturbations from their rapidly rotating host star subsequently disrupt this primordial coplanarity. We demonstrate that, given sufficient stellar obliquity, even systems beginning with two planetary constituents are susceptible to dynamical instability soon after planet formation, as a result of the stellar quadrupole moment. This mechanism stands as a widespread, yet poorly explored, pathway toward planetary system instability. Moreover, by requiring that observed multi-planet systems remain coplanar on Gyr timescales, we are able to place upper limits on the stellar obliquity in systems such as K2-38 (obliquity < 20 degrees), where other methods of measuring spin-orbit misalignment are not currently available.

In this paper we describe the design, installation and first calibration tests of a multi optical transition radiation (OTR) system in the beam diagnostic section of the extraction (EXT) line of ATF2, close to the multi-wire-scanner system. This system will be a valuable tool for measuring the sizes and emittances of beams coming from the ATF damping ring. With an optical resolution of about 2 μm, an original OTR design (OTR1X) located after the septum at the entrance of the EXT line demonstrated the ability to measure a 5.5 μm beam size in one beam pulse and to take many fast measurements. This gives the OTR the ability to measure the beam emittance with high statistics, giving a low error and a good understanding of emittance jitter. Furthermore, the nearby wire scanners will provide a definitive test of the OTR as a beam emittance diagnostic device. The multi-OTR system design proposed here is based on the existing OTR1X.

Proposed is a laser projection display system that uses an electronically controlled variable-focus lens (ECVFL) to achieve sharp, in-focus image projection over multi-distance three-dimensional (3D) conformal screens. The system also functions as an embedded distance sensor that enables 3D mapping of the multi-level screen platform before the desired scanned-beam focused/defocused projected spot sizes are matched to the different localized screen distances on the 3D screen. Compared to conventional laser scanning and spatial light modulator (SLM) based projection systems, the proposed design offers in-focus, non-distorted projection over a multi-distance screen zone with varying depths. An experimental projection system for a screen depth variation of 65 cm is demonstrated using a 633 nm laser beam, 3 kHz galvo-scanning mirrors, and a liquid-based ECVFL. As a basic demonstration, an in-house developed MATLAB-based graphical user interface is deployed to work with the laser projection display, enabling user inputs such as text strings or predefined images. The user can specify the projection screen distance, scanned laser linewidth, projected text font size, projected image dimensions, and laser scanning rate. Projected images are shown highlighting the 3D control capabilities of the display, including the production of a non-distorted image onto two depths versus a distorted image via dominant prior-art projection methods.
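The core focus-matching idea, adjusting the ECVFL power so that the projected spot stays in focus at each local screen depth, can be illustrated with the thin-lens equation. The object distance and screen depths below are assumed for illustration and are not taken from the paper:

```python
def required_focal_length(d_image, d_object=0.2):
    """Thin lens: 1/f = 1/d_object + 1/d_image, so a screen patch at
    distance d_image (m) behind the lens needs focal length f (m).
    d_object is an assumed distance of the source conjugate in front
    of the ECVFL."""
    return 1.0 / (1.0 / d_object + 1.0 / d_image)

# Hypothetical screen depths spanning a 65 cm zone, e.g. nearest patch
# at 1.00 m and farthest at 1.65 m from the lens.
for d in (1.00, 1.25, 1.65):
    print(f"screen at {d:.2f} m -> f = {required_focal_length(d) * 100:.2f} cm")
```

The farther the local screen patch, the longer the focal length the ECVFL must supply, which is exactly the quantity the embedded distance sensor lets the system compute per scan position.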

This paper presents the results of the project IMPERA (Integrated Mission Planning for Distributed Robot Systems). The goal of IMPERA was to realize an extraterrestrial exploration scenario using a heterogeneous multi-robot system. The main challenge was the development of a multi-robot planning and plan-execution architecture. The robot team consists of three heterogeneous robots, which have to explore an unknown environment and collect lunar drill samples. The team activities are described using the language ALICA (A Language for Interactive Agents). Furthermore, we use the mission planning system pRoPhEt MAS (Reactive Planning Engine for Multi-Agent Systems) to provide an intuitive interface for generating team activities. We define the basic skills of our team with ALICA and specify the desired goal states using a logic description. Based on the skills, pRoPhEt MAS creates a valid ALICA plan, which is then executed by the team. The paper describes the basic components for communication, coordinated exploration, perception and object transportation. Finally, we evaluate the planning engine pRoPhEt MAS in the IMPERA scenario and present further evaluation of pRoPhEt MAS in more dynamic environments.

The article explores to what extent financial accessibility of healthcare (FAH) is restricted for low-income groups and identifies social protection policies that can supplement health policies in guaranteeing universal access to healthcare. It aims to advance the literature on comparative European social epidemiology by focussing on income-related barriers to healthcare take-up. The research is carried out on the basis of multi-level cross-sectional analyses using 2012 EU-SILC data for 30 European countries; the social policy data stem from EU-SILC beneficiary information. It is argued that unmet medical needs are a reality for many individuals within Europe, not only due to direct user fees but also due to indirect costs such as waiting time, travel costs, and time not spent working. Moreover, low FAH affects not only the lowest income quintile but also the lower-middle income class. The study observes that social allowances increase the purchasing power of both household types, thereby helping them to overcome financial barriers to healthcare uptake. Alongside healthcare system reform aimed at improving the pro-poor availability of healthcare facilities and financing, policies directed at improving FAH should aim at providing a minimum income base to the lowest income quintile. Moreover, categorical policies should address households exposed to debt, which form the key vulnerable group within the low-income classes.

AMH is widely used for assessing ovarian reserve, and it is particularly convenient because it is thought to have minimal variability throughout the menstrual cycle. However, studies assessing the stability of AMH over the menstrual cycle have been conflicting. The purpose of this study is to determine whether AMH levels vary across the normal menstrual cycle. A multi-center, prospective cohort study was conducted at three US centers. Fifty females aged 18-45 with regular menstrual cycles underwent serial venipuncture every 3-5 days starting in the early follicular phase and lasting up to 10 collections. AMH was tested using the Access 2 immunoassay system. Age-adjusted mixed-effect models utilizing data from 384 samples from 50 subjects demonstrated a within-subject standard deviation of 0.81 (95% CI 0.75-0.88), with a coefficient of variation of 23.8% across the menstrual cycle, and a between-subject standard deviation of 2.56 (95% CI 2.13-3.21), with a coefficient of variation of 75.1%. The intra-class correlation (ICC) of AMH across the menstrual cycle was 0.91. Overall, AMH levels, using the automated Access AMH assay, appear to be relatively stable across the menstrual cycle. Fluctuations, if any, appear to be small, and therefore clinicians may advise patients to have AMH levels drawn at any time in the cycle.
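The reported ICC follows directly from the two quoted variance components, since the intra-class correlation is the between-subject variance divided by the total variance. A quick check with the study's own numbers:

```python
def icc_from_sds(between_sd, within_sd):
    """Intra-class correlation from variance components:
    ICC = sigma_between^2 / (sigma_between^2 + sigma_within^2)."""
    b2, w2 = between_sd**2, within_sd**2
    return b2 / (b2 + w2)

# SDs reported in the study: between-subject 2.56, within-subject 0.81.
print(round(icc_from_sds(2.56, 0.81), 2))  # -> 0.91
```

The high ICC quantifies the abstract's conclusion: differences between women dwarf the fluctuation within any one woman's cycle.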

Inter-track interference is one of the most severe impairments in bit-patterned media recording system. This impairment can be effectively handled by a modulation code and a multi-head array jointly processing multiple tracks; however, such a modulation constraint has never been utilized to improve the soft-information. Therefore, this paper proposes the utilization of modulation codes with an encoded constraint defined by the criteria for soft-information flipping during a three-track data detection process. Moreover, we also investigate the optimal offset position of readheads to provide the most improvement in system performance. The simulation results indicate that the proposed systems with and without position jitter are significantly superior to uncoded systems.

The authors investigated the outcomes of measurements on correlated few-body quantum systems described by a quaternionic quantum mechanics that allows for regions of quaternionic curvature. It was found that a multi-particle interferometry experiment using a correlated system of four nonrelativistic spin-half particles has the potential to detect the presence of quaternionic curvature. Two-body systems, however, are shown to give predictions identical to those of standard quantum mechanics when relative angles are used in the construction of the operators corresponding to measurements of particle spin components. 15 refs

A revolution in the space sector is happening: it is expected that in the next decade more satellites will be launched than in the previous sixty years of space exploration. Major challenges are associated with this growth of space assets, such as the autonomy and management of large groups of satellites, in particular small satellites. This work has two main objectives. First, a flexible and distributed software architecture is presented to expand the possibilities of spacecraft autonomy, in particular autonomous motion in attitude and position. The approach taken is based on the concept of distributed software agents, also referred to as a multi-agent robotic system. Agents are defined as software programs that are social, reactive and proactive in autonomously maximizing the chances of achieving the set goals. Part of the work is to demonstrate that a multi-agent robotic system is a feasible approach to different problems of autonomy, such as satellite attitude determination and control and autonomous rendezvous and docking. The second main objective is to develop a method to optimize multi-satellite configurations in space, also known as satellite constellations. This automated method generates new optimal mega-constellation designs for Earth observation and fast revisit times over large ground areas. The optimal satellite constellation can be used by researchers as the baseline for new missions. The first contribution of this work is the development of a new multi-agent robotic system for distributing the attitude determination and control subsystem of HiakaSat. The multi-agent robotic system is implemented and tested on a satellite hardware-in-the-loop testbed that simulates a representative space environment. The results show that, for this particular case, the newly proposed system achieves control performance equivalent to the monolithic implementation. In terms of computational efficiency, it is found that the multi…

Entanglement, the Einstein-Podolsky-Rosen (EPR) paradox and Bell's demonstration of the failure of local-hidden-variable (LHV) theories are three historically famous forms of "quantum nonlocality". We give experimental criteria for these three forms of nonlocality in multi-particle systems, with the aim of better understanding the transition from microscopic to macroscopic nonlocality. We examine the nonlocality of N separated spin-J systems. First, we obtain multipartite Bell inequalities that address the correlation between spin values measured at each site, and then we review spin-squeezing inequalities that address the degree of reduction in the variance of collective spins. The latter have been particularly useful as a tool for investigating entanglement in Bose-Einstein condensates (BEC). We present solutions for two topical quantum states: multi-qubit Greenberger-Horne-Zeilinger (GHZ) states, and the ground state of a two-well BEC.
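As a small concrete instance of the multipartite Bell correlations discussed above, the three-qubit GHZ state attains the Mermin combination <XXX> - <XYY> - <YXY> - <YYX> = 4, whereas LHV theories bound it by 2. A numerical check of this textbook fact (our toy verification, not the paper's derivation):

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)

def kron3(a, b, c):
    """Tensor product of three single-qubit operators."""
    return np.kron(np.kron(a, b), c)

# Three-qubit GHZ state (|000> + |111>) / sqrt(2)
ghz = np.zeros(8, dtype=complex)
ghz[0] = ghz[7] = 1 / np.sqrt(2)

def expval(op):
    """Expectation value <GHZ| op |GHZ>."""
    return np.real(ghz.conj() @ op @ ghz)

mermin = (expval(kron3(X, X, X)) - expval(kron3(X, Y, Y))
          - expval(kron3(Y, X, Y)) - expval(kron3(Y, Y, X)))
print(mermin)  # 4 up to floating point: the quantum value; LHV bound is 2
```

Each of the four correlators is ±1 for the GHZ state, yet no local assignment of ±1 outcomes can reproduce all four signs simultaneously, which is exactly the multipartite sharpening of Bell's argument.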

This thesis focuses on control and coordination of mobile multi-robot systems (MRS). MRS can often deal with tasks that are difficult for a single robot to accomplish. One of the challenges is the need to control, coordinate and synchronize the operation of several robots to perform some… specified task. This calls for new strategies and methods which allow the desired system behavior to be specified in a formal and succinct way. Two different frameworks for the coordination and control of MRS have been investigated. Framework I - A network of robots is modeled as a network of multi… a requirement specification in Computation Tree Logic (CTL) for a network of robots. The result is a set of motion plans for the robots which satisfy the specification. Framework II - A framework for controller synthesis for a single robot with respect to a requirement specification in Linear-time Temporal…

Access to higher education in Poland is changing due to the demography of smaller cohorts of potential students. Following a demand-driven educational expansion after the collapse of communism in 1989, the higher education system is now contracting. Such expansion/contraction and growth/decline in European higher education has rarely been…

Full Text Available This research studies the accessibility of grocery stores to university students using the public transportation system, drawing on a case study of Fargo, North Dakota. Taking into consideration the combined travel time components of walking, riding, and waiting, this study measures two types of accessibility: accessibility to reach a particular place and accessibility to reach the bus stop to ride the public transit system. These two accessibilities are interdependent and cannot function without each other. A new method to calculate the average accessibility measure for the transit routes is proposed. A step-wise case study analysis indicates that one route provides accessibility to a grocery store in eight minutes. This also suggests that the North Dakota State University area has moderate accessibility to grocery stores.
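The combined travel time idea above can be sketched as follows. This is an illustrative reading of the measure, not the study's exact formulation; all trip values are hypothetical.

```python
# Door-to-door transit time as walk + wait + ride, and a route-average
# accessibility score over several hypothetical stop-to-store trips.

def trip_time(walk_min, wait_min, ride_min):
    """Combined travel time for one transit trip, in minutes."""
    return walk_min + wait_min + ride_min

def route_accessibility(trips):
    """Average combined travel time over all (walk, wait, ride) trips on a route."""
    return sum(trip_time(*t) for t in trips) / len(trips)

trips = [(3, 5, 8), (5, 10, 12), (2, 7, 6)]  # made-up (walk, wait, ride) minutes
print(round(route_accessibility(trips), 1))  # mean combined time in minutes
```

A lower route average indicates better accessibility; the two accessibility types in the abstract correspond to the walk component (reaching the stop) and the wait + ride components (reaching the destination).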

Cavity-based large scale quantum information processing (QIP) may involve multiple cavities and require performing various quantum logic operations on qubits distributed in different cavities. Geometric-phase-based quantum computing has drawn much attention recently, which offers advantages against inaccuracies and local fluctuations. In addition, multiqubit gates are particularly appealing and play important roles in QIP. We here present a simple and efficient scheme for realizing a multi-target-qubit unconventional geometric phase gate in a multi-cavity system. This multiqubit phase gate has a common control qubit but different target qubits distributed in different cavities, which can be achieved using a single-step operation. The gate operation time is independent of the number of qubits and only two levels for each qubit are needed. This multiqubit gate is generic, e.g., by performing single-qubit operations, it can be converted into two types of significant multi-target-qubit phase gates useful in QIP. The proposal is quite general, which can be used to accomplish the same task for a general type of qubits such as atoms, NV centers, quantum dots, and superconducting qubits.

In this paper, we study model-checking of linear-time properties in multi-valued systems. Safety, invariant, liveness, persistence and dual-persistence properties in multi-valued logic systems are introduced. Some algorithms related to the above multi-valued linear-time properties are discussed. The verification of multi-valued regular safety properties and multi-valued $\omega$-regular properties using lattice-valued automata is thoroughly studied. Since the law o...
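A minimal sketch of what "multi-valued" checking means here, under the assumption that truth degrees live in a totally ordered lattice on [0, 1] with min as meet; the transition system, labelling, and property are illustrative, not from the paper:

```python
# Multi-valued invariant checking: instead of a yes/no answer, the result is
# the degree (worst label over all reachable states) to which the invariant holds.

MEET = min   # lattice meet (conjunction) on truth degrees in [0, 1]

def check_invariant(transitions, label, init):
    """Degree to which `label` holds on every state reachable from `init`."""
    value, seen, stack = 1.0, {init}, [init]
    while stack:
        s = stack.pop()
        value = MEET(value, label[s])        # invariant degrades to the worst state
        for t in transitions.get(s, []):
            if t not in seen:
                seen.add(t)
                stack.append(t)
    return value

transitions = {'a': ['b'], 'b': ['c'], 'c': []}
label = {'a': 1.0, 'b': 0.5, 'c': 1.0}       # multi-valued "safe" labelling
print(check_invariant(transitions, label, 'a'))  # 0.5
```

In the two-valued special case (labels in {0, 1}) this collapses to classical invariant checking.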

This paper presents an agent-based approach, called delegate multi-agent systems, for anticipatory vehicle routing to avoid traffic congestion. In this approach, individual vehicles are represented by agents, which themselves issue light-weight agents that explore alternative routes in the environment on behalf of the vehicles. Based on the evaluation of the alternatives, the vehicles then issue light-weight agents for allocating road segments, spreading the vehicles’ intentions and coordi...

A method that determines the minimum bracing stiffness required by a multi-column elastic system to achieve non-sway buckling conditions is proposed. Equations that evaluate the required minimum stiffness of the lateral and torsional bracings and the corresponding “braced” critical buckling load for each column of the story level are derived using the modified stability functions. The following effects are included: 1) the types of end connections (rigid, semirigid, and simple); 2) the bluepr...

Multi-agent systems (MAS) composed of autonomous agents representing individuals or organizations and capable of reaching mutually beneficial agreements through negotiation and argumentation are becoming increasingly important and pervasive. Research on both automated negotiation and argumentation in MAS has a vigorous, exciting tradition. However, efforts to integrate both areas have received only selective attention in academia and the practitioner literature. A symbiotic relationship could significantly strengthen each area's progress and trigger new R&D challenges and prospects toward t...

using the PC-based Windows NT environment. Results: The developed multi-system interface module accesses and shares data from a commercial CT-simulator, a research-based treatment planning system, and a commercial radiation oncology information system in the departmental wide-area network (WAN). The software tool shares the CT-simulator's anatomical contours, images, and plan information with the treatment planning system which eliminates the need for the oncologist to redraw the tumor volumes or custom blocks. The plan and treatment information is updated in the treatment delivery information system. The system runs on any standard PC platform located on the WAN and supports remote data access over phone lines. The interface module directly improves the efficiency of the department by the reduction of redundant data entry. Conclusion: The introduction of a multi-system interface module for sharing common radiation therapy data has decreased the overall treatment planning times without adding complexity. The use of other emerging standards such as DICOM are also being investigated to provide additional support in the future. The concept of the interface module can be used to connect to any data system that supports open connectivity standards

Effective and reliable access control is crucial to a PDM system. This article discusses the commonly used access control models, analyzes their advantages and disadvantages, and proposes a new Role and Object based access control model that suits the particular needs of a PDM system. The new model has been implemented in a commercial PDM system, where it has demonstrated enhanced flexibility and convenience.
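One plausible reading of a combined role-and-object check is that an operation is granted only when both the user's role and the object's class permit it. The article's actual model and rule set are not specified, so every name and rule below is hypothetical:

```python
# Sketch of a role-and-object access decision: permission requires agreement
# between the role's rights and the object class's rights.

ROLE_PERMS = {
    'designer': {'read', 'write'},
    'viewer': {'read'},
}
OBJECT_PERMS = {
    'drawing': {'read', 'write'},
    'released_doc': {'read'},   # released documents are read-only for everyone
}

def allowed(role, obj_class, op):
    """Grant only if both the role and the object class permit the operation."""
    return op in ROLE_PERMS.get(role, set()) and op in OBJECT_PERMS.get(obj_class, set())

print(allowed('designer', 'drawing', 'write'))       # True
print(allowed('designer', 'released_doc', 'write'))  # False
```

The object-side table is what distinguishes this from plain RBAC: even a privileged role cannot write a read-only object class.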

...) and employ them to identify asymmetric maritime threats in ports and waterways. Each surface track is monitored by a compound multi-agent system that comprises several intent models, each containing a nested multi-agent system...

In this paper it is shown how informal and formal specification of behavioural requirements and scenarios for agents and multi-agent systems can be integrated within multi-agent system design. In particular, it is addressed how a compositional...

The aim of this study was to test the safety and performance of the Symplicity™ multi-electrode radiofrequency renal denervation system, which was designed to reduce procedure time during renal denervation. The multi-electrode radiofrequency renal denervation system feasibility study is a prospective, non-randomised, open-label feasibility study that enrolled 50 subjects with hypertension. The study utilises a new renal denervation catheter which contains an array of four electrodes mounted in a helical configuration at 90 degrees from each other to deliver radiofrequency energy simultaneously to all four renal artery quadrants for 60 seconds. The protocol specified one renal denervation treatment towards the distal end of each main renal artery, with radiofrequency energy delivered for 60 seconds per treatment. Total treatment time for both renal arteries was two minutes. The 12-month change in office systolic blood pressure (SBP) and 24-hour SBP was -19.2±25.2 mmHg; no renal artery stenosis or hypertensive emergencies occurred. The Symplicity multi-electrode radiofrequency renal denervation system was associated with a significant reduction in SBP at 12 months and minimal complications, whilst it also reduced procedure time. NCT01699529.

Full Text Available Consumer-grade digital cameras suffer from geometrical instability that may cause problems when used in photogrammetric applications. This paper provides a comprehensive review of this issue of interior orientation parameter variation over time, explains the common ways of coping with the issue, and describes the existing methods for performing stability analysis for a single camera. The paper then points out the lack of coverage of stability analysis for multi-camera systems, suggests a modification of the collinearity model to be used for the calibration of an entire photogrammetric system, and proposes three methods for system stability analysis. The proposed methods explore the impact of the changes in interior orientation and relative orientation/mounting parameters on the reconstruction process. Rather than relying on ground truth in real datasets to check the system calibration stability, the proposed methods are simulation-based. Experiment results are shown, where a multi-camera photogrammetric system was calibrated three times, and stability analysis was performed on the system calibration parameters from the three sessions. The proposed simulation-based methods provided results that were compatible with a real-data based approach for evaluating the impact of changes in the system calibration parameters on the three-dimensional reconstruction.

In the extant literature, deterioration dependence among components can be modelled as inherent dependence or induced dependence. We find that the two types of dependence may co-exist and interact with each other in one multi-component system. We refer to this phenomenon as fault propagation. In practice, a fault induced by the malfunction of a non-critical component may further propagate through the dependence amongst critical components. Such fault propagation scenarios occur in industrial assets or systems (e.g., bridge decks and heat-exchanging systems). In this paper, a multi-layered vector-valued continuous-time Markov chain is developed to capture the characteristics of fault propagation. To obtain mathematical tractability, we derive a partitioning rule to aggregate states with the same characteristics while keeping the overall aging behaviour of the multi-component system. Although the detailed information of components is masked by aggregated states, lumpability is attainable with the partitioning rule. This means that the aggregated process is stochastically equivalent to the original one and retains the Markov property. We apply this model to a heat-exchanging system in an oil refinery. The results show that fault propagation has a more significant impact on the system's lifetime compared with inherent dependence and induced dependence. - Highlights: • We develop a vector-valued continuous-time Markov chain to model the meta-dependent characteristic of fault propagation. • A partitioning rule is derived to reduce the state space and attain lumpability. • The model is applied to analysing the impact of fault propagation in a heat-exchanging system.
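The lumpability property invoked above has a simple sufficient condition for a CTMC: a partition is (strongly) lumpable if every state in a block has the same total transition rate into each other block. The paper's partitioning rule is more specific; the generator and partition below are toy values for illustration only:

```python
# Check strong lumpability of a CTMC generator matrix Q under a state partition.
# Exact float comparison is fine for these toy rates; a tolerance would be
# used with real data.

def is_lumpable(Q, partition):
    for block in partition:
        for other in partition:
            if other is block:
                continue
            # total rate from each state in `block` into block `other`
            rates = {sum(Q[s][t] for t in other) for s in block}
            if len(rates) > 1:      # states in `block` disagree -> not lumpable
                return False
    return True

# States 0 and 1 behave identically toward state 2, so {0,1},{2} is lumpable.
Q = [[-2.0, 1.0, 1.0],
     [1.0, -2.0, 1.0],
     [0.5, 0.5, -1.0]]
print(is_lumpable(Q, [[0, 1], [2]]))  # True
```

When the condition holds, the aggregated process over the blocks is itself a Markov chain, which is what makes the reduced model in the abstract stochastically equivalent to the original.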

A high performance system has been assembled using standard web components to deliver database information to a large number of broadly distributed clients. The CDF Experiment at Fermilab is establishing processing centers around the world, imposing a high demand on their database repository. For delivering read-only data, such as calibrations, trigger information, and run conditions data, we have abstracted the interface that clients use to retrieve data objects. A middle tier is deployed that translates client requests into database-specific queries and returns the data to the client as XML datagrams. The database connection management, request translation, and data encoding are accomplished in servlets running under Tomcat. Squid Proxy caching layers are deployed near the Tomcat servers, as well as close to the clients, to significantly reduce the load on the database and provide a scalable deployment model. Details of the system's construction and use are presented, including its architecture, design, interfaces, administration, performance measurements, and deployment plan.

Full Text Available With the development of computer science and smart health-care technology, there is a trend for patients to enjoy medical care at home. Given the enormous number of users in a Smart Health-care System, access control is an important issue. Traditional access control models (discretionary access control, mandatory access control, and role-based access control) do not properly reflect the characteristics of a Smart Health-care System. This paper proposes an advanced access control model for the medical health-care environment, the task-role-based access control model, which overcomes the disadvantages of traditional access control models. The task-role-based access control (T-RBAC) model introduces a task concept, dividing tasks into four categories. It also supports supervision role hierarchy. T-RBAC is a proper access control model for a Smart Health-care System, and it improves the management of access rights. This paper also proposes an implementation of T-RBAC, a binary two-key-lock pair access control scheme using prime factorization.

We present and compare the performance of two many-core architectures, the Nvidia Kepler and the Intel MIC, both in a single system and in a cluster configuration, for the simulation of spin systems. As a benchmark we consider the time required to update a single spin of the 3D Heisenberg spin glass model using the Over-relaxation algorithm. We also present data for a traditional high-end multi-core architecture, the Intel Sandy Bridge. The results show that although on the two Intel architectures it is possible to use basically the same code, the performance of an Intel MIC changes dramatically depending on (apparently) minor details. Another issue is that to obtain reasonable scalability with the Intel Phi coprocessor (Phi is the coprocessor that implements the MIC architecture) in a cluster configuration it is necessary to use the so-called offload mode, which reduces the performance of the single system. As to the GPU, the Kepler architecture offers a clear advantage with respect to the previous Fermi architecture while maintaining exactly the same source code. Scalability of the multi-GPU implementation remains very good when using the CPU as a communication co-processor of the GPU. All source codes are provided for inspection and for double-checking the results.
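The benchmark's single-spin Over-relaxation update has simple arithmetic: the spin is reflected about its local field, which leaves the energy unchanged. The paper benchmarks tuned GPU/MIC kernels; this pure-Python sketch only shows the update rule itself:

```python
# Over-relaxation update for one classical Heisenberg spin S in local field H:
#   S' = 2 (S.H / H.H) H - S
# The reflection preserves both |S| and the local energy -S.H, so it is a
# rejection-free microcanonical move.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def overrelax(spin, field):
    """Reflect `spin` about `field`."""
    c = 2.0 * dot(spin, field) / dot(field, field)
    return tuple(c * h - s for s, h in zip(spin, field))

S, H = (1.0, 0.0, 0.0), (1.0, 1.0, 0.0)
S2 = overrelax(S, H)
print(S2)                                    # reflected spin
print(abs(dot(S, H) - dot(S2, H)) < 1e-12)   # energy preserved -> True
```

Because every update is accepted, the algorithm's cost is dominated by memory traffic to gather the neighbour spins forming H, which is why it is a good bandwidth benchmark for many-core hardware.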

FCA US LLC (formerly known as Chrysler Group LLC, and hereinafter “Chrysler”) was awarded an American Recovery and Reinvestment Act (ARRA) funded project by the Department of Energy (DOE) titled “A MultiAir®/MultiFuel Approach to Enhancing Engine System Efficiency” (hereinafter “project”). This award was issued after Chrysler submitted a proposal for Funding Opportunity Announcement DE-FOA-0000079, “Systems Level Technology Development, Integration, and Demonstration for Efficient Class 8 Trucks (SuperTruck) and Advanced Technology Powertrains for Light-Duty Vehicles (ATP-LD).” Chrysler started work on this project on June 01, 2010 and completed testing activities on August 30, 2014. The overall objectives of this project were to: demonstrate a 25% improvement in combined Federal Test Procedure (FTP) City and Highway fuel economy over a 2009 Chrysler minivan; accelerate the development of highly efficient engine and powertrain systems for light-duty vehicles, while meeting future emissions standards; and create and retain jobs in accordance with the American Recovery and Reinvestment Act of 2009.

Remote systems, in which a human operator in a safe zone assesses pertinent circumstances and makes decisions on work procedures while a robot does direct work in hazardous environments, have become increasingly important with the growth in nuclear facilities. In such remote systems, to perform tasks which are only ambiguously defined beforehand, it is very important that the systems have the ability to execute desired tasks easily and immediately, without any programming or teaching work on the spot. A control system, named Self Approach System (SAS), for a multi-joint inspection robot has been developed as a key component in a remote inspection system for use in physically difficult or dangerous environments. It has 8 joints and 17 degrees of freedom and was designed taking many of the above points into account. This paper describes the SAS in detail.

Full Text Available Selective attention is the strategy used by biological systems to cope with the inherent limits in their available computational resources, in order to efficiently process sensory information. The same strategy can be used in artificial systems that have to process vast amounts of sensory data with limited resources. In this paper we present a neuromorphic VLSI device, the “Selective Attention Chip” (SAC, which can be used to implement these models in multi-chip address-event systems. We also describe a real-time sensory-motor system, which integrates the SAC with a dynamic vision sensor and a robotic actuator. We present experimental results from each component in the system, and demonstrate how the complete system implements a real-time stimulus-driven selective attention model.

...into areas where there is no access to maritime platforms. Sea-based interceptor platforms have the ability to intercept targets at each stage of the... argues that the most efficient concept for integrating active defense weapon systems is a multi-layered architecture with redundant intercept... faster data transfer and will prevent data loss. The need for almost 100% interception success is increasing as the threat becomes more...

Full Text Available The construction of Smart Grids leads to the main question of what kind of intelligence such grids require and how to build it. Some authors choose an agent-based solution to realize this intelligence. However, there may be some misunderstandings in the way this technology is being applied. This paper presents some considerations on this subject, focusing on the Microgrid level, and shows a practical example through the INGENIAS methodology, a methodology for the development of Agent-Oriented systems that applies Model Driven Development techniques to produce fully functional Multi-Agent Systems.

The tuning of parameters in climate models is essential to provide reliable long-term forecasts of Earth system behaviour. We apply a multi-objective optimization algorithm to the problem of parameter estimation in climate models. This optimization process involves the iterative evaluation of response surface models (RSMs), followed by the execution of multiple Earth system simulations. These computations require an infrastructure that provides high-performance computing for building and searching the RSMs and high-throughput computing for the concurrent evaluation of a large number of models. Grid computing technology is therefore essential to make this algorithm practical for members of the GENIE project.

This monograph introduces novel responses to the different problems that arise when multiple robots need to execute a task in cooperation, each robot in the team having a monocular camera as its primary input sensor. Its central proposition is that a consistent perception of the world is crucial for the good development of any multi-robot application. The text focuses on the high-level problem of cooperative perception by a multi-robot system: the idea that, depending on what each robot sees and its current situation, it will need to communicate these things to its fellows whenever possible, to share what it has found and be kept updated by them in turn. However, in any realistic scenario, distributed solutions to this problem are not trivial and need to be addressed from as many angles as possible. Distributed Consensus with Visual Perception in Multi-Robot Systems covers a variety of related topics such as: · distributed consensus algorithms; · data association and robustne...

We investigate the design of anonymous voting protocols, a CV-based binary-valued ballot and a CV-based multi-valued ballot with continuous variables (CV), in a multi-dimensional quantum cryptosystem to ensure the security of the voting procedure and data privacy. Quantum entangled states are employed in the continuous-variable quantum system to carry the voting information and assist information transmission, which takes advantage of GHZ-like states in terms of improving the utilization of quantum states by decreasing the number of required quantum states. It provides a potential approach to achieving efficient quantum anonymous voting with high transmission security, especially in large-scale votes. (paper)

In order to prevent radiation accidents and their escalation, a more integrated management system is required for the safety management of radiation workers in nuclear facilities. Therefore, JAEA (Japan Atomic Energy Agency) and HAM (Hitachi Aloka Medical, Ltd) have developed an innovative real-time multi-function entry/exit management system which manages workers' exposure dose and position under a jointly developed patent. This system shares workers' data, such as position, health condition and exposure dose, among workers and server managers inside or outside the building. It consists of mobile equipment, receivers, a LAN, and a server system. This report summarizes the system to be installed in the JMTR. (author)

The ACCESS project addressed the development, testing, and demonstration of the proposed advanced technologies and the associated emission and fuel economy improvements on an engine dynamometer and on a full-scale vehicle. Its goals were to: improve fuel economy by 25% with minimum performance penalties; achieve SULEV-level emissions with gasoline; and demonstrate a multi-mode combustion engine management system.

Recently observed increases in the intensity and frequency of climate extremes (e.g., floods, dam failures, and overtopping of river banks) necessitate the development of effective disaster prevention and mitigation strategies. Hydrologic models can be useful tools in predicting such events at different spatial and temporal scales. However, the accuracy and prediction capability of such models are often constrained by the availability of high-quality representative hydro-meteorological data (e.g., precipitation) that are required to calibrate and validate them. Improved technologies and products such as the Multi-Radar Multi-Sensor (MRMS) system, which allows gathering and transmission of vast meteorological data, have been developed to provide such data. While the MRMS data are available at high spatial and temporal resolutions (1 km and 15 min, respectively), their accuracy in estimating precipitation is yet to be fully investigated. Therefore, the main objective of this study is to evaluate the performance of the MRMS system in effectively capturing precipitation over the Lower Colorado River, Texas, using observations from a dense rain gauge network. In addition, the effects of spatial and temporal aggregation scales on the performance of the MRMS system were evaluated. Point-scale comparisons were made at 215 gauging locations using rain gauges and MRMS data from May 2015. Moreover, the effects of temporal aggregation scales (30, 45, 60, 75, 90, 105, and 120 min) and spatial aggregation scales (4 to 50 km) on the performance of the MRMS system were tested. Overall, the MRMS system (at 15 min temporal resolution) captured precipitation reasonably well, with an average R2 value of 0.65 and RMSE of 0.5 mm. In addition, spatial and temporal data aggregation resulted in increases in R2 values. However, reduction in RMSE was achieved only with an increase in spatial aggregation.
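The point-scale comparison metrics quoted above (R2 and RMSE between gauge and radar estimates) can be computed as follows; the sample values are made up, not the study's data:

```python
# R^2 (squared Pearson correlation) and RMSE between paired gauge and MRMS
# precipitation values at one location.

def r2_rmse(gauge, radar):
    n = len(gauge)
    mg, mr = sum(gauge) / n, sum(radar) / n
    cov = sum((g - mg) * (r - mr) for g, r in zip(gauge, radar))
    vg = sum((g - mg) ** 2 for g in gauge)
    vr = sum((r - mr) ** 2 for r in radar)
    r2 = cov * cov / (vg * vr)
    rmse = (sum((g - r) ** 2 for g, r in zip(gauge, radar)) / n) ** 0.5
    return r2, rmse

gauge = [1.0, 2.0, 3.0, 4.0]   # hypothetical gauge totals (mm)
radar = [1.1, 1.9, 3.2, 3.9]   # hypothetical MRMS totals (mm)
r2, rmse = r2_rmse(gauge, radar)
print(round(r2, 3), round(rmse, 3))  # 0.987 0.132
```

Averaging these two statistics over the 215 gauging locations gives network-wide scores like the R2 of 0.65 and RMSE of 0.5 mm reported in the abstract.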

The authors investigated the knowledge, attitudes, and healthcare experiences of Deaf women. Interviews with 45 deaf women who participated in focus groups in American Sign Language were translated, transcribed, and analyzed. Deaf women's understanding of women's health issues, knowledge of health vocabulary in both English and American Sign Language, common health concerns among Deaf women, and issues of access to information, including pathways and barriers, were examined. As a qualitative study, the results of this investigation are limited and should be viewed as exploratory. A lack of health knowledge was evident, including little understanding of the meaning or value of cancer screening, mammography, or Pap smears; purposes of prescribed medications, such as hormone replacement therapy (HRT); or necessity for other medical or surgical interventions. Negative experiences and avoidance or nonuse of health services were reported, largely due to the lack of a common language with healthcare providers. Insensitive behaviors were also described. Positive experiences and increased access to health information were reported with practitioners who used qualified interpreters. Providers who demonstrated minimal signing skills, a willingness to use paper and pen, and sensitivity to improving communication were appreciated. Deaf women have unique cultural and linguistic issues that affect healthcare experiences. Improved access to health information may be achieved with specialized resource materials, improved prevention and targeted intervention strategies, and self-advocacy skills development. Healthcare providers must be trained to become more effective communicators with Deaf patients and to use qualified interpreters to assure access to healthcare for Deaf women.

There needs to be a strategy for securing the privacy of patients when exchanging health records between various entities over the Internet. Despite the fact that health care providers such as Google Health and Microsoft Corp.'s HealthVault comply with the U.S. Health Insurance Portability and Accountability Act (HIPAA), the privacy of patients is still at risk. Several encryption schemes and access control mechanisms have been suggested to protect the disclosure of a patient's health record, especially from unauthorized entities. However, by implementing these approaches, data owners are not capable of controlling and protecting the disclosure of the individual sensitive attributes of their health records. This raises the need to adopt a secure mechanism to protect personal information against unauthorized disclosure. Therefore, we propose a new Fine-grained Access Control (FGAC) mechanism based on subkeys, which allows a data owner to further control access to his data at the column level. We also propose a new mechanism to efficiently reduce the number of keys maintained by a data owner in cases where users have different access privileges to different columns of the data being shared.
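A generic way to realize column-level subkeys (the FGAC paper's exact derivation may differ; this is only a standard construction offered for illustration) is to derive each column's key from one master secret with HMAC, so the owner stores a single key and hands out only the subkeys a user is entitled to:

```python
# Derive per-column encryption subkeys from one master key via HMAC-SHA256.
# Column names and the master secret below are hypothetical.

import hmac
import hashlib

def column_key(master_key: bytes, column: str) -> bytes:
    """Deterministic 32-byte subkey for one column of the shared table."""
    return hmac.new(master_key, column.encode(), hashlib.sha256).digest()

master = b'owner-master-secret'
k_diag = column_key(master, 'diagnosis')   # given to the treating physician
k_bill = column_key(master, 'billing')     # given to the billing clerk only
print(k_diag != k_bill)                    # distinct per-column keys -> True
print(len(k_diag))                         # 32
```

Because derivation is deterministic, the owner never needs to store the subkeys, which addresses the key-management burden the abstract mentions.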

...and children with SLE in Africa are potentially at high risk for poor outcomes based on race... coloured people.[6,9] Through a... areas are least likely to have access to a private car, yet may not have emergency... High traffic accident mortality...

The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).

Full Text Available In order to model complex systems with multiple dimensions and multiple granularities, this paper first proposes a kind of multi-relational Fuzzy Cognitive Map (FCM) to simulate the multi-relational system, together with an automatic construction algorithm integrating Nonlinear Hebbian Learning (NHL) and Real Code Genetic Algorithm (RCGA). The multi-relational FCM is suited to modeling complex systems with multiple dimensions and granularities. The automatic construction algorithm can learn the multi-relational FCM from multi-relational data resources, eliminating human intervention. The Multi-Relational Data Mining (MRDM) algorithm integrates multi-instance-oriented NHL and RCGA of FCM. NHL is extended to mine the causal relationships between a coarse-granularity concept and its fine-granularity concepts, driven by multi-instances in the multi-relational system. RCGA is used to establish a high-quality high-level FCM driven by data. The multi-relational FCM and the integrated algorithm have been applied to the complex system of Mutagenesis. The experiment demonstrates not only that they achieve better classification accuracy, but also that they reveal the causal relationships among the concepts of the system.
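The core FCM machinery referenced above is a weighted concept graph iterated through a squashing function, with Hebbian-style weight updates. The map, weights, learning rate, and the particular NHL variant below are illustrative, not the paper's multi-relational formulation:

```python
# One Fuzzy Cognitive Map inference step plus one Hebbian-style weight update.

import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def fcm_step(state, W):
    """One FCM iteration: each concept aggregates weighted influences W[j][i] (j -> i)."""
    n = len(state)
    return [sigmoid(sum(W[j][i] * state[j] for j in range(n))) for i in range(n)]

def nhl_update(W, state, eta=0.1):
    """Hebbian-style update on existing (non-zero) edges between co-active concepts."""
    n = len(state)
    return [[W[j][i] + (eta * state[j] * (state[i] - W[j][i] * state[j])
             if W[j][i] != 0 else 0.0)
             for i in range(n)] for j in range(n)]

W = [[0.0, 0.6], [0.3, 0.0]]     # hypothetical causal weights
state = [0.8, 0.4]               # hypothetical concept activations
state = fcm_step(state, W)
W = nhl_update(W, state)
print([round(s, 3) for s in state])
```

In the paper's setting, NHL of this general shape is extended so that a coarse-granularity concept's weights are driven by multiple instances from related tables rather than a single activation vector.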

Surveillance with automated positioning and tracking of subjects and vehicles in 3D is desired in many defence and security applications. Camera systems with stereo or multiple cameras are often used for 3D positioning. In such systems, accurate camera calibration is needed to obtain a reliable 3D position estimate. There is also a need for automated camera calibration to facilitate fast deployment of semi-mobile multi-camera 3D positioning systems. In this paper we investigate a method for automatic calibration of the extrinsic camera parameters (relative camera pose and orientation) of a multi-camera positioning system. It is based on estimation of the essential matrix between each camera pair using the 5-point method for intrinsically calibrated cameras. The method is compared to a manual calibration method using real HD video data from a field trial with a multi-camera positioning system. The method is also evaluated on simulated data from a stereo camera model. The results show that the reprojection error of the automated camera calibration method is close to or smaller than the error for the manual calibration method and that the automated calibration can replace the manual calibration.
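The geometry behind the method can be sketched numerically. For a calibrated pair with relative rotation R and translation t, the essential matrix is E = [t]x R, and normalised rays of the same 3D point satisfy the epipolar constraint. The paper estimates E from point matches with the 5-point method; here E is simply built from a known toy pose to illustrate the constraint being estimated:

```python
# Build E = [t]x R for a known relative pose and verify the epipolar
# constraint x2^T E x1 = 0 on a matched pair of normalised rays.

def cross_matrix(t):
    """Skew-symmetric matrix [t]x such that [t]x v = t x v."""
    return [[0, -t[2], t[1]],
            [t[2], 0, -t[0]],
            [-t[1], t[0], 0]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)] for i in range(3)]

R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]   # identity rotation for simplicity
t = [1.0, 0.0, 0.0]                     # pure sideways baseline
E = matmul(cross_matrix(t), R)

# A 3D point at (0, 0, 5) seen by camera 1 at the origin and camera 2 shifted by t:
x1 = [0.0, 0.0, 1.0]                    # normalised ray in camera 1
x2 = [-0.2, 0.0, 1.0]                   # normalised ray in camera 2
Ex1 = [sum(E[i][k] * x1[k] for k in range(3)) for i in range(3)]
print(abs(sum(x2[i] * Ex1[i] for i in range(3))) < 1e-12)  # constraint holds -> True
```

The 5-point solver inverts this relation: given five such ray correspondences, it recovers E, from which the relative pose (R, t up to scale) is decomposed.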

This paper studies the selective maintenance problem for multi-state systems with structural dependence. Each component can be in one of multiple working levels and several maintenance actions are possible to a component in a maintenance break. The components structurally form multiple hierarchical levels and dependence groups. A directed graph is used to represent the precedence relations of components in the system. A selective maintenance optimization model is developed to maximize the system reliability in the next mission under time and cost constraints. A backward search algorithm is used to determine the assembly sequence for a selective maintenance scenario. The maintenance model helps maintenance managers in determining the best combination of maintenance activities to maximize the probability of successfully completing the next mission. Examples showing the use of the proposed method are presented. - Highlights: • A selective maintenance model for multi-state systems is proposed considering both economic and structural dependence. • Structural dependence is modeled as precedence relationship when disassembling components for maintenance. • Resources for disassembly and maintenance are evaluated using a backward search algorithm. • Maintenance strategies with and without structural dependence are analyzed. • Ignoring structural dependence may lead to over-estimation of system reliability.
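The structural-dependence aspect can be sketched as follows: maintaining a component requires first disassembling everything that precedes it in the directed precedence graph, so its effective cost includes its ancestors' disassembly costs. The graph, components, and costs below are hypothetical, and this ignores the reliability-maximisation optimisation the paper actually solves:

```python
# Transitive-closure walk over a precedence graph to price a maintenance action.

def disassembly_set(preds, comp):
    """All components that must come off before `comp` can be maintained."""
    needed, stack = set(), [comp]
    while stack:
        c = stack.pop()
        for p in preds.get(c, []):
            if p not in needed:
                needed.add(p)
                stack.append(p)
    return needed

preds = {'pump': ['cover'], 'valve': ['cover', 'pump']}   # valve is buried deepest
cost = {'cover': 1, 'pump': 3, 'valve': 2}

def total_cost(comp):
    return cost[comp] + sum(cost[c] for c in disassembly_set(preds, comp))

print(total_cost('valve'))  # 2 + cover(1) + pump(3) = 6
```

This is why ignoring structural dependence over-estimates what a maintenance break can achieve: the nominal cost of a deep component understates its true cost within the time and budget constraints.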

A multi-cell electrical energy system is a set of batteries connected in series; the series string provides the voltage required by the device it powers. After discharge, some cells within the system tend to sit at a lower voltage than the others. Other factors, such as the number of times a cell has been charged or discharged and how long it has been in the system, also leave some cells with less capacity than the rest, which lowers the capacity the electrical energy system as a whole can deliver. When cell capacities within the system are unknown, it is also unknown how much charge can be supplied to the system without overcharging or undercharging individual cells; it is therefore necessary to know the capacity of each cell. For a single cell, the capacity could be obtained by a full charge and discharge of the cell. In a series system containing multiple cells, a full charge or discharge cannot be performed, as it might deteriorate the structure of some cells. Hence, to find the capacity of a single cell within an electrical energy system, a method is needed that can estimate the capacity of each cell in place. To develop this method, an experimental electrical energy system is required. It consists of rechargeable batteries of unequal capacity that provide the required energy, a battery management system (BMS) board to monitor the cell voltages, an Arduino board that provides the required communication between the BMS board and the PC, and software that delivers the data obtained from the Arduino board to the PC. The outcome, estimating the capacity of a cell within a multi-cell system, can be used in many battery-related technologies to obtain unknown cell capacities.
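
One common way to estimate a cell's capacity without a full charge/discharge is coulomb counting over a partial discharge: the charge moved between two known state-of-charge (SOC) points, divided by the SOC change, gives the capacity. The sketch below assumes, hypothetically, that per-cell SOC can be estimated from rest voltages at the start and end of the interval; the current samples and SOC values are invented.

```python
# Toy capacity estimate for one cell in a series pack via coulomb
# counting over a partial discharge. Assumes SOC at rest can be read
# from open-circuit voltage (a common but idealized assumption).

currents = [1.8, 2.0, 2.1, 1.9, 2.0]   # A, one sample per interval
dt = 360.0                              # s per sample (0.1 h)
soc_start, soc_end = 0.80, 0.55         # estimated from rest voltages

charge_Ah = sum(i * dt for i in currents) / 3600.0   # charge moved, Ah
capacity_Ah = charge_Ah / (soc_start - soc_end)      # full-cell capacity
```

Because the same current flows through every cell in the string, only the per-cell SOC estimates differ, which is exactly the data a BMS board monitoring individual cell voltages can supply.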

The amount of video content on the Internet has grown considerably in recent years. In spite of the efforts of different organizations and governments to increase the accessibility of websites, most multimedia content on the Internet is not accessible. This paper describes a system that contributes to making multimedia content more accessible on the Web by automatically translating subtitles in oral language to Sign Writing, a way of writing Sign Language. This system extends the functionality of a general we...

Innovation is widely linked to cognitive ability, brain size, and adaptation to novel conditions. However, successful innovation appears to be influenced by both cognitive factors, such as inhibitory control, and non-cognitive behavioral traits. We used a multi-access box (MAB) paradigm to measure repeated innovation, the number of unique innovations learned across trials, by 10 captive spotted hyenas (Crocuta crocuta). Spotted hyenas are highly innovative in captivity and also display striking variation in behavioral traits, making them good model organisms for examining the relationship between innovation and other behavioral traits. We measured persistence, motor diversity, motivation, activity, efficiency, inhibitory control, and neophobia demonstrated by hyenas while interacting with the MAB. We also independently assessed inhibitory control with a detour cylinder task. Most hyenas were able to solve the MAB at least once, but only four hyenas satisfied learning criteria for all four possible solutions. Interestingly, neither measure of inhibitory control predicted repeated innovation. Instead, repeated innovation was predicted by a proactive syndrome of behavioral traits that included high persistence, high motor diversity, high activity and low neophobia. Our results suggest that this proactive behavioral syndrome may be more important than inhibitory control for successful innovation with the MAB by members of this species.

Multiple-input multiple-output (MIMO) techniques are becoming commonplace in recent wireless communication standards. This newly introduced dimension (i.e., space) can be efficiently used to mitigate the interference in the multi-user MIMO context. In this paper, we focus on the uplink of a MIMO multiple access channel (MAC) where perfect channel state information (CSI) is only available at the destination. We provide new sufficient conditions for a wide range of space-time block codes (STBCs) to achieve full diversity under partial interference cancellation group decoding (PICGD) with or without successive interference cancellation (SIC) for completely blind users. Interference cancellation (IC) schemes for two and three users are then provided and shown to satisfy the full-diversity criteria. Besides the complexity reduction due to the fact that PICGD enables separate decoding of distinct users without sacrificing the diversity gain, further reduction of the decoding complexity may be obtained. In fact, thanks to the structure of the proposed schemes, the real and imaginary parts of each user's symbols may be decoupled without any loss of performance. Our new IC scheme is shown to outperform a recently proposed two-user IC scheme, especially at high spectral efficiency, while requiring significantly less decoding complexity.

The Precipitation Data and Information Services Center (PDISC) (http://disc.gsfc.nasa.gov/precipitation or google: NASA PDISC), located at the NASA Goddard Space Flight Center (GSFC) Earth Sciences (GES) Data and Information Services Center (DISC), is home to the Tropical Rainfall Measuring Mission (TRMM) data archive. For over 15 years, the GES DISC has served not only TRMM but also other space-based, airborne, field-campaign and ground-based precipitation data products to the precipitation community and other disciplinary communities as well. The TRMM Multi-Satellite Precipitation Analysis (TMPA) products are the most popular products in the TRMM product family in terms of data download and access through Mirador, the GES-DISC Interactive Online Visualization ANd aNalysis Infrastructure (Giovanni) and other services. The next generation of TMPA, the Integrated Multi-satellitE Retrievals for GPM (IMERG), to be released in 2014 after the launch of GPM, will be significantly improved in terms of spatial and temporal resolution. To better serve the user community, we are preparing data services, examples of which are listed below. To enable scientific exploration of Earth science data products without going through complicated and often time-consuming processes, such as data downloading and data processing, the GES DISC has developed Giovanni in consultation with members of the user community, who requested quick search, subset, analysis and display capabilities for their specific data of interest. For example, the TRMM Online Visualization and Analysis System (TOVAS, http://disc2.nascom.nasa.gov/Giovanni/tovas/) has proven extremely popular, especially as additional datasets have been added upon request. Giovanni will continue to evolve to accommodate GPM data and the multi-sensor data inter-comparisons that will be sure to follow. Additional PDISC tool and service capabilities being adapted for GPM data include: an on-line PDISC Portal (includes user guide, etc

This thesis develops a design method for social systems that do not fit the conventional industrial pattern and that consequently are not apt for regulation through mechanical means. It builds upon Soft Systems Methodology (SSM), one of the most widely used and well-regarded design methodologies. Yet the systems science literature has identified some weaknesses in this methodology, and these have been confirmed in the critical evaluation and the empirical study of this thesis. It was foun...

Purpose: To build an infrastructure that gives on-call radiologists and external users teleradiological access to the HTML-based image distribution system inside the hospital via the internet. In addition, no investment costs should arise on the user side, and the image data should be renamed and sent using cryptographic techniques. Materials and Methods: A pure HTML-based system manages the image distribution inside the hospital, with an open-source project extending this system through a secure gateway outside the firewall of the hospital. The gateway handles the communication between the external users and the HTML server within the network of the hospital. A second firewall is installed between the gateway and the external users and builds up a virtual private network (VPN). A connection between the gateway and an external user is only acknowledged if the computers involved authenticate each other via certificates and the external users authenticate via a multi-stage password system. All data are transferred encrypted. External users only get access to images that have previously been renamed to a pseudonym by automated processing. Results: With an ADSL internet access, external users achieve an image load frequency of 0.4 CT images per second. More than 90% of the delay during image transfer results from security checks within the firewalls. Data passing the gateway induce no measurable delay. (orig.)

Today, when the emphasis on single-species production systems that is cardinal to agricultural and forestry programs the world over has resulted in serious ecosystem imbalances, the virtues of the time-tested practice of growing different species together, as in managed Multi-strata Tree + Crop (MTC) systems, deserve serious attention. The coconut-palm-based multispecies systems in tropical homegardens and shaded perennial systems are just two such systems. A fundamental ecological principle of these systems is niche complementarity, which implies that systems that are structurally and functionally more complex than crop or tree monocultures achieve greater efficiency of resource (nutrients, light, and water) capture and utilization. Other principles include spatial and temporal heterogeneity, perennialism, and structural and functional diversity. Unexplored or under-exploited benefits of MTC systems include their ecosystem services, such as carbon storage, climate regulation, and biodiversity conservation. These multispecies integrated systems indeed represent an agroecological marvel, the principles of which could be utilized in the design of sustainable as well as productive agroecosystems. The environmental and ecological specificity of MTC systems, however, is a unique feature that restricts their comparison with other land-use systems and the extrapolation of management features used in one location to another.

This report summarizes the technological features of advanced telerobotic systems for reactor dismantling applications developed at the Japan Atomic Energy Research Institute. Taking into consideration the special environmental conditions of reactor dismantling, a major effort was made to develop a multifunctional telerobotic system of high reliability which can be used to perform various complex tasks in an unstructured environment and be operated in an easy and flexible manner. The system development was carried out by constructing three systems in succession: a light-duty and a heavy-duty system as prototype systems for engineering tests in a cold environment, and a demonstration system for practical on-site application to dismantling the highly radioactive reactor internals of an experimental boiling water reactor, the JPDR (Japan Power Demonstration Reactor). Each system was equipped with one or two amphibious manipulators which can be operated in push-button manual, bilateral master-slave, teach-and-playback or programmed control modes. A different scheme was adopted in each system when designing the manipulator, transporter and man-machine interface, so as to compare their advantages and disadvantages. In accordance with the JPDR decommissioning program, the demonstration system was successfully operated to dismantle a portion of the radioactive reactor internals of the JPDR using an underwater plasma arc cutting method, proving the usefulness of the multi-functional telerobotic system for reducing occupational hazards and enhancing work efficiency in the course of dismantling highly radioactive reactor components. (author)

In this paper, we propose a collision avoidance algorithm for multi-vehicle systems, a common problem in many areas, including navigation and robotics. In dynamic environments, vehicles may become involved in potential collisions with each other, particularly when the vehicle density is high and the direction of travel is unrestricted. Cooperatively planning vehicle movement can effectively reduce and fairly distribute the detour inconvenience before subsequently returning vehicles to their intended paths. We present a novel method of cooperative path planning for multi-vehicle systems, based on reinforcement learning, that addresses this problem as a decision process. A dynamic system is described as a multi-dimensional space formed by vectors as states to represent all participating vehicles' position and orientation, whilst considering the kinematic constraints of the vehicles. Actions are defined for the system to transit from one state to another. In order to select appropriate actions whilst satisfying the constraints of path smoothness, constant speed and complying with a minimum distance between vehicles, an approximate value function is iteratively developed to indicate the desirability of every state-action pair from the continuous state space and action space. The proposed scheme comprises two phases: the value function converges in the initial learning phase, and it is then used as a path planning guideline in the subsequent action phase. This paper summarizes the concept and methodologies used to implement this online cooperative collision avoidance algorithm and presents results and analysis regarding how this cooperative scheme improves upon two baseline schemes in which vehicles make movement decisions independently.
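
The two-phase structure (learn a value function, then act greedily on it) can be illustrated with tabular value iteration on a toy one-dimensional MDP. The states, rewards and dynamics below are invented; the point is only the mechanism: the value function converges first (learning phase) and then serves as a guideline for action selection (action phase).

```python
# Tabular value iteration on a toy deterministic MDP: an agent on a
# 1-D track of 5 cells must reach the goal cell 4; each non-goal step
# costs -1 (the "detour inconvenience"), reaching the goal pays +10.
states = range(5)
actions = [-1, 0, 1]          # move left, stay, move right

def step(s, a):
    s2 = min(max(s + a, 0), 4)            # clamp to the track
    reward = 10.0 if s2 == 4 else -1.0
    return s2, reward

gamma = 0.9
V = [0.0] * 5
for _ in range(100):                       # learning phase: iterate to convergence
    V = [max(step(s, a)[1] + gamma * V[step(s, a)[0]] for a in actions)
         for s in states]

def greedy(s):                             # action phase: follow the value function
    return max(actions, key=lambda a: step(s, a)[1] + gamma * V[step(s, a)[0]])
```

In the paper's setting the state instead encodes all vehicles' positions and orientations and the value function is approximated over a continuous space, but the iterate-then-act pattern is the same.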

The control system of the Shanghai Synchrotron Radiation Facility (SSRF) is a large-scale distributed real-time control system involving many types and large amounts of real-time data access during operation. Database systems have wide application prospects in large-scale accelerator control systems; replacing the various dedicated data structures with a mature, standardized database system is the future direction of accelerator control system development. Based on database interface technology, real-time data access testing, and system optimization research, this article discusses the feasibility of applying database systems in accelerators and lays the foundation for the wide-scale application of a database system in the SSRF accelerator control system. (authors)

Several Combined Heat and Power (CHP) system options have been considered for evaluation with respect to end-user requirements. These included internal combustion engines (Otto and Diesel), gas turbines, steam turbines and combined cycles, covering a wide range of electrical output. Data have been obtained from the literature, and the CHP systems have been evaluated using different criteria such as overall efficiency, investment cost, fuel cost, electricity cost, heat cost, CO2 production and footprint. A multi-criteria method is used with an agglomeration function based on the statistical evaluation of weight factors. The technical, economic and social aspects of each system have been evaluated in an integrated manner and the results have been compared by means of a Sustainability Index. Based on the above criteria and depending on the user requirements, the best CHP system options have been established.
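
A common form of such multi-criteria aggregation is a weighted sum over min-max-normalized criteria, with cost-type criteria inverted so that higher is always better. The sketch below uses this generic scheme with invented option names, criterion values and weights; it is not the paper's agglomeration function or data.

```python
# Hypothetical multi-criteria scoring of CHP options. All numbers and
# weights are illustrative, not taken from the study.
options = {
    "gas_turbine":    {"efficiency": 0.80, "invest_cost": 900,  "co2": 0.45},
    "diesel_engine":  {"efficiency": 0.85, "invest_cost": 700,  "co2": 0.55},
    "combined_cycle": {"efficiency": 0.90, "invest_cost": 1200, "co2": 0.35},
}
weights = {"efficiency": 0.5, "invest_cost": 0.3, "co2": 0.2}
benefit = {"efficiency": True, "invest_cost": False, "co2": False}

def normalize(crit):
    # Min-max scale a criterion to [0, 1]; invert cost-type criteria.
    vals = [o[crit] for o in options.values()]
    lo, hi = min(vals), max(vals)
    def f(x):
        s = (x - lo) / (hi - lo)
        return s if benefit[crit] else 1.0 - s
    return f

norm = {c: normalize(c) for c in weights}
index = {name: sum(w * norm[c](o[c]) for c, w in weights.items())
         for name, o in options.items()}
best = max(index, key=index.get)   # option with the highest composite index
```

Changing the weight factors shifts the ranking, which is why the study grounds them in a statistical evaluation rather than choosing them ad hoc.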

Increased industrialization and new markets have led to an accumulation of used technical consumer goods, which results in greater exploitation of raw materials, energy and landfill sites. In order to reduce the use of natural resources, conserve energy and limit the increase in waste volume, the application of disassembly techniques is the first step towards the prevention of waste. These techniques form a reliable and clean approach: "noble", or high-grade, recycling. This paper presents a multi-agent system for the disassembly process, implemented in a computer-aided application for supervising the disassembling system: the Interactive Intelligent Interface for Disassembling System. Unified Modeling Language diagrams are used for an internal and external definition of the disassembling system.

This paper illustrates an approach to generate multi-wing attractors in coupled Lorenz systems. In particular, novel four-wing (eight-wing) hyperchaotic attractors are generated by coupling two (three) identical Lorenz systems. The paper shows that the equilibria of the proposed systems have certain symmetries with respect to specific coordinate planes and the eigenvalues of the associated Jacobian matrices exhibit the property of similarity. In analogy with the original Lorenz system, where the two-wings of the butterfly attractor are located around the two equilibria with the unstable pair of complex-conjugate eigenvalues, this paper shows that the four-wings (eight-wings) of these attractors are located around the four (eight) equilibria with two (three) pairs of unstable complex-conjugate eigenvalues.
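
The coupling idea can be sketched numerically: integrate two identical Lorenz systems with a weak linear coupling and confirm the trajectory stays on a bounded attractor. The coupling form and strength below are illustrative, not the paper's exact construction, which should be consulted for the coupling that actually yields four-wing attractors.

```python
import numpy as np

# Two identical Lorenz systems, weakly coupled through the x variables.
# Parameters are the classic Lorenz values; eps is an invented coupling.
sigma, rho, beta, eps = 10.0, 28.0, 8.0 / 3.0, 0.1

def f(u):
    x1, y1, z1, x2, y2, z2 = u
    return np.array([
        sigma * (y1 - x1) + eps * (x2 - x1),
        x1 * (rho - z1) - y1,
        x1 * y1 - beta * z1,
        sigma * (y2 - x2) + eps * (x1 - x2),
        x2 * (rho - z2) - y2,
        x2 * y2 - beta * z2,
    ])

def rk4(u, h):
    # One classical 4th-order Runge-Kutta step.
    k1 = f(u); k2 = f(u + h / 2 * k1)
    k3 = f(u + h / 2 * k2); k4 = f(u + h * k3)
    return u + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

u = np.array([1.0, 1.0, 1.0, -1.0, -1.0, 1.0])
traj = []
for _ in range(5000):
    u = rk4(u, 0.01)
    traj.append(u.copy())
traj = np.array(traj)   # bounded chaotic trajectory of the coupled system
```

Plotting (x1, z1) and (x2, z2) from `traj` shows the familiar butterfly wings; the paper's analysis of equilibria with unstable complex-conjugate eigenvalue pairs explains where additional wings appear.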

Probabilistic methods involving the use of multi-parameter Monte Carlo analysis can be applied to a wide range of engineering systems. The output from the Monte Carlo analysis is a probabilistic estimate of the system consequence, which can vary spatially and temporally. Sensitivity analysis aims to examine how the output consequence is influenced by the input parameter values. Sensitivity analysis provides the necessary information so that the engineering properties of the system can be optimized. This report details a package of sensitivity analysis techniques that together form an integrated methodology for the sensitivity analysis of probabilistic systems. The techniques have known confidence limits and can be applied to a wide range of engineering problems. The sensitivity analysis methodology is illustrated by performing the sensitivity analysis of the MCROC rock microcracking model
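
A minimal version of this workflow can be sketched in a few lines: sample the input parameters, evaluate a consequence model, and rank the inputs by the strength of their correlation with the output. The model y = 5a + b + 0.1c below is invented purely so the expected ranking is known; it is not one of the report's techniques (which come with formal confidence limits).

```python
import random

# Monte Carlo sensitivity sketch: correlate sampled inputs with the
# output consequence. The linear model here is invented for illustration.
random.seed(0)
N = 2000
a = [random.gauss(0, 1) for _ in range(N)]
b = [random.gauss(0, 1) for _ in range(N)]
c = [random.gauss(0, 1) for _ in range(N)]
y = [5 * ai + bi + 0.1 * ci for ai, bi, ci in zip(a, b, c)]

def corr(u, v):
    # Pearson correlation coefficient.
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    su = sum((x - mu) ** 2 for x in u) ** 0.5
    sv = sum((x - mv) ** 2 for x in v) ** 0.5
    return sum((x - mu) * (z - mv) for x, z in zip(u, v)) / (su * sv)

sens = {"a": abs(corr(a, y)), "b": abs(corr(b, y)), "c": abs(corr(c, y))}
ranking = sorted(sens, key=sens.get, reverse=True)   # most influential first
```

Correlation-based indices like this are only reliable for near-linear models; for strongly nonlinear responses, variance-based measures are the usual next step.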

The Insurance Institute for Highway Safety (IIHS) of the United States of America has noted in its reports that a significant share of road mishaps would be preventable if more automated active safety applications, including collision avoidance systems, were adopted in vehicles. Autonomous intervention by active steering and braking systems in hazardous scenarios can aid the driver in mitigating collisions. In this work, a real-time platform for a multi-actuator vehicle collision avoidance system is developed, as part of a continuing research effort to develop a fully autonomous vehicle in Malaysia. The vehicle is a modular platform that can be utilized for different research purposes and is named the Intelligent Drive Project (iDrive). The proposed vehicle collision avoidance design is validated in a controlled environment, where the coupled longitudinal and lateral motion control system is expected to provide the desired braking and steering actuation in the presence of a frontal static obstacle. Results indicate the ability of the platform to yield multi-actuator collision avoidance navigation in hazardous scenarios, thus avoiding the obstacle. The findings of this work are beneficial for the development of more complex and nonlinear real-time collision avoidance work in the future.

In 1998, an access control system for the Large Helical Device (LHD) experimental hall was constructed and put into operation at the National Institute for Fusion Science (NIFS) in Toki, Japan. Since then, the system has been continuously improved; it now controls access into the LHD controlled area through four entrances, using five turnstile gates. The system continuously checks whether the shielding doors are open or closed at eight positions. The details pertaining to the construction of the system were reported at IRPA-10, held in Hiroshima, Japan, in 2000. Based on our experience constructing the NIFS access control system, we will discuss problems related to the software and operational design of the system, as well as some concerns regarding the use of such systems in radiation facilities. The problems we will present concern, among other things, individual registration, time control, turnstile control, interlock signal control, data aggregation and transactions, automatic and manual control, and emergency procedures. For example, in relation to the time control and turnstile control functions, we will discuss the gate-opening time interval for an access event, the timing of access data recording, date changing, turn bar control, double access, and access error handling. (author)

Multi-core computing provides new challenges to software engineering. The paper addresses such issues in the general setting of polytopol computing, that takes multi-core problems in such widely differing areas as ambient intelligence sensor networks and cloud computing into account. It argues that the essence lies in a suitable allocation of free moving tasks. Where hardware is ubiquitous and pervasive, the network is virtualized into a connection of software snippets judiciously injected to such hardware that a system function looks as one again. The concept of polytopol computing provides a further formalization in terms of the partitioning of labor between collector and sensor nodes. Collectors provide functions such as a knowledge integrator, awareness collector, situation displayer/reporter, communicator of clues and an inquiry-interface provider. Sensors provide functions such as anomaly detection (only communicating singularities, not continuous observation), they are generally powered or self-powered, amorphous (not on a grid) with generation-and-attrition, field re-programmable, and sensor plug-and-play-able. Together the collector and the sensor are part of the skeleton injector mechanism, added to every node, and give the network the ability to organize itself into some of many topologies. Finally we will discuss a number of applications and indicate how a multi-core architecture supports the security aspects of the skeleton injector.

Superconducting cables for power transmission usually contain two conductors for DC application, or three conductors for AC, with high voltage insulation. In contrast, for some applications related to accelerators it is convenient to transfer high currents via superconducting links feeding a number of circuits at relatively low voltage, of the order of a kilovolt, over distances of up to a few hundred meters. For power transmission applications based on cooling via sub-cooled liquid nitrogen, suitable HTS conductors are only available in the form of tape, and a multi-layer variant can be envisaged for the multi-circuit links. However, where cooling to temperatures of the order of 20 K is feasible, MgB2 conductor, available in the form of both tape and wire, can also be envisaged and in the latter case used to assemble round cables. There are, therefore, two distinct topologies - based on the use of wires or tapes - that can be envisaged for use in applications to multi-circuit link systems. In this paper the ...

Highlights:
• A conceptual architectural model for a vertical maintenance DEMO is presented.
• Novel concepts for a set of DEMO remote handling equipment are put forward.
• Remote maintenance of a multi module segment blanket is found to be feasible.
• The criticality of space in the vertical port is highlighted.
Abstract: The anticipated high neutron flux, and the consequent damage to plasma-facing components in DEMO, results in the need to regularly replace the tritium breeding and radiation shielding blanket. The current European multi module segment (MMS) blanket concept favours a less invasive small port entry maintenance system over large sector transport concepts, because of the reduced impact on other tokamak systems, particularly the magnetic coils. This paper presents a novel conceptual remote maintenance strategy for a Vertical Maintenance Scheme DEMO, incorporating substantiated designs for an in-vessel mover, to detach and attach the blanket segments, and cask-housed vertical maintenance devices to open and close access ports, cut and join service connections, and extract blanket segments from the vessel. In addition, a conceptual architectural model for DEMO was generated to capture functional and spatial interfaces between the remote maintenance equipment and other systems. Areas of further study are identified in order to comprehensively establish the feasibility of the proposed maintenance system.

With the Space Transportation System (STS), the advent of space station Columbus and the development of expertise at working in space that this will entail, the gateway is open to the final frontier. The exploration of this frontier is possible with state-of-the-art hydrogen/oxygen propulsion but would be greatly enhanced by the higher specific impulse of electric propulsion. This paper presents a concept that uses a multi-megawatt nuclear power plant to drive an electric propulsion system. The concept has been named PEGASUS, PowEr GenerAting System for Use in Space, and is intended as a "workhorse" for general space transportation needs, both long- and short-haul missions. The recent efforts of the SP-100 program indicate that a power system capable of producing upwards of 1 megawatt of electric power should be available in the next decade. Additionally, efforts in other areas indicate that a power system with a constant power capability an order of magnitude greater could be available near the turn of the century. With the advances expected in megawatt-class space power systems, high specific impulse propulsion systems must be reconsidered as potential propulsion systems. The power system is capable of meeting both the propulsion system and spacecraft power requirements.

... Prohibition on Circumvention of Copyright Protection Systems for Access Control Technologies AGENCY: Copyright... nonsubstantial correction to its regulation announcing the prohibition against circumvention of technological... the final rule governing exemption to prohibition on circumvention of copyright protection systems for...

This paper describes a human engineering effort in the design of a major security system upgrade at Lawrence Livermore National Laboratory. The upgrade was to be accomplished by replacing obsolete and difficult-to-man (i.e., requiring multiple operator task actions) security equipment and systems with a new, automated, computer-based access control system. The initial task was to assist the electronic and mechanical engineering staff in designing a computerized security access system that functionally and ergonomically accommodates 100% of the Laboratory user population. The new computerized access system was intended to control entry into sensitive exclusion areas by requiring personnel to use an entry-booth-based system and/or a remote access control panel system. The primary user interface with the system was a control panel containing a magnetic card reader, function buttons, an LCD display, and a push-button keypad.

The laser-one hazard detector system, used on the Rensselaer Mars rover, is reviewed briefly with respect to the hardware subsystems, the operation, and the results obtained. A multidetector scanning system was designed to improve on the original system. Interactive support software was designed and programmed to implement real time control of the rover or platform with the elevation scanning mast. The formats of both the raw data and the post-run data files were selected. In addition, the interface requirements were selected and some initial hardware-software testing was completed.

What makes teamwork tick? Cooperation matters, in daily life and in complex applications. After all, many tasks need more than a single agent to be performed effectively. Therefore, teamwork rules! Teams are social groups of agents dedicated to the fulfilment of particular persistent tasks. In modern multi-agent environments, heterogeneous teams often consist of autonomous software agents, various types of robots and human beings. Teamwork in Multi-agent Systems: A Formal Approach explains teamwork rules in terms of agents' attitudes and their complex interplay. It provides the first comprehe...

In this paper, we present an agent-based solution to a meta-learning problem that focuses on the optimization of data mining processes. We exploit the framework of computational multi-agent systems, in which various meta-learning problems have already been studied, e.g. parameter-space search or simple method recommendation. In this paper, we examine the effect of data preprocessing on machine learning problems. We perform a set of experiments in the search space of data mining processes, which is...

The economy, which has become more information-intensive, more global and more technologically dependent, is undergoing dramatic changes, and the role of logistics is becoming more and more important. In logistics, the objective of service providers is to fulfill all customers' demands while adapting to the dynamic changes of logistics networks, so as to achieve a higher degree of customer satisfaction and therefore a higher return on investment. In order to provide high-quality service, knowledge and information sharing among departments becomes a must in this fast-changing market environment. In particular, artificial intelligence (AI) technologies have attracted significant attention for enhancing the agility of supply chain management, as well as logistics operations. In this research, a multi-artificial-intelligence system, named the Integrated Intelligent Logistics System (IILS), is proposed. The objective of IILS is to provide quality logistics solutions to achieve high levels of service performance in the logistics industry. The new feature of this agile intelligence system is the incorporation of intelligence modules through the capabilities of case-based reasoning, multi-agent systems, fuzzy logic and artificial neural networks, achieving the optimization of the performance of organizations.

A miniaturised sensor system for aviation hydraulic fluids is presented. The system consists of an optochemical sensor and a particle sensor. The optochemical sensor detects the form of the O-H absorption feature around 3500 cm⁻¹ to reveal the water and acid contamination in the fluid. The particle sensor uses a light barrier principle to derive its particle contamination number. (orig.)

This paper proposes a new code to optimize the performance of the spectral amplitude coding-optical code division multiple access (SAC-OCDMA) system. The unique two-matrix structure of the proposed enhanced multi diagonal (EMD) code and its effective correlation properties between intended and interfering subscribers significantly elevate the performance of the SAC-OCDMA system by negating multiple access interference (MAI) and the associated phase induced intensity noise (PIIN). Performance of the SAC-OCDMA system based on the proposed code is thoroughly analyzed for two detection techniques through analytic and simulation analysis, by referring to the bit error rate (BER), signal to noise ratio (SNR) and eye patterns at the receiving end. It is shown that the EMD code with the SDD technique provides high transmission capacity, reduces receiver complexity, and performs better than the complementary subtraction detection (CSD) technique. Furthermore, the analysis shows that, for a minimum acceptable BER of 10⁻⁹, the proposed system supports 64 subscribers at data rates of up to 2 Gbps for both up- and down-link transmission.
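The BER figures quoted above can be reproduced with the Gaussian approximation commonly used in SAC-OCDMA analysis; the sketch below assumes the standard relation BER = ½·erfc(√(SNR/8)) and adds a hypothetical scan for the SNR needed to reach the 10⁻⁹ floor (the exact SNR expression for the EMD code is not reproduced here).

```python
import math

def ber_from_snr(snr):
    # Gaussian approximation used throughout the SAC-OCDMA literature:
    # BER = (1/2) * erfc( sqrt(SNR / 8) )
    return 0.5 * math.erfc(math.sqrt(snr / 8.0))

def min_snr_for_target(target=1e-9):
    # Coarse 1-dB-style scan for the SNR at which the target BER is met.
    snr = 1.0
    while ber_from_snr(snr) > target:
        snr += 1.0
    return snr
```

Under this approximation the 10⁻⁹ threshold corresponds to an SNR of roughly 140-150 (linear), which is why system analyses of this kind report supportable user counts at the SNR where the BER curve crosses 10⁻⁹.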

Full Text Available Recently, use of hospital information system environments employing IT communication technology and medical information has been increasing. In the medical field, the medical information system only supports the transfer of patient information to medical staff through an electronic health record, without information about patient status. Hence, a method of real-time monitoring of the patient is needed. Also, in this environment, a secure method of accessing healthcare through various smart devices is required. Therefore, in this paper, in order to classify the status of patients, we propose a dynamic approach to the medical information system in a hospital information environment using a dynamic access control method. We also applied the symmetric AES (Advanced Encryption Standard) method, a well-suited encryption algorithm for sending and receiving biological information. The usefulness of the proposed system is demonstrated by the dynamic access application service built on it. The proposed system is expected to provide a new solution for a convenient medical information system.
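The dynamic access control idea, where permission depends on the patient's current status as well as the requester's role, can be sketched as a small rule table; the roles, statuses and actions below are illustrative assumptions, not the paper's actual policy.

```python
def can_access(role, patient_status, action):
    # Permission is keyed on (role, dynamic patient status), so the same
    # role gains or loses rights as the patient's condition changes.
    rules = {
        ("doctor", "critical"):  {"read", "write"},
        ("doctor", "stable"):    {"read", "write"},
        ("nurse", "critical"):   {"read", "write"},   # elevated in emergencies
        ("nurse", "stable"):     {"read"},
        ("guardian", "stable"):  {"read"},
    }
    return action in rules.get((role, patient_status), set())
```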

...) through both DoD and non-DoD providers. This network of ET&D providers and users constitutes a multi-organizational, multi-stakeholder system that is at present loosely coordinated and incompletely understood...

A timely guide using iterative learning control (ILC) as a solution for multi-agent systems (MAS) challenges, this book showcases recent advances and industrially relevant applications. Readers are first given a comprehensive overview of the intersection between ILC and MAS, then introduced to a range of topics that include both basic and advanced theoretical discussions, rigorous mathematics, engineering practice, and both linear and nonlinear systems. Through systematic discussion of network theory and intelligent control, the authors explore future research possibilities, develop new tools, and provide numerous applications such as power grids, communication and sensor networks, intelligent transportation systems, and formation control. Readers will gain a roadmap of the latest advances in the fields and can use their newfound knowledge to design their own algorithms.

Regulatory agencies are imposing limits and constraints to protect the operator and/or the environment. While generally necessary, these controls also tend to increase cost and decrease efficiency and productivity. Intelligent computer systems can be made to perform these hazardous tasks with greater efficiency and precision without danger to the operators. The Idaho National Engineering and Environmental Laboratory and the Center for Self-Organizing and Intelligent Systems at Utah State University have developed a series of autonomous all-terrain multi-agent systems capable of performing automated tasks within hazardous environments. This paper discusses the development and application of cooperative small-scale and large-scale robots for use in various activities associated with radiologically contaminated areas, prescription farming, and unexploded ordnance

Full Text Available A strategy is described that utilizes a novel application of a potential-force function, including the tuning of coefficients, to control mobile robots orchestrated as a distributed multi-agent system. Control system parameters are manipulated methodically via simulation and hardware experimentation to gain a better understanding of their impact upon the mission performance of the multi-agent system as applied to a predetermined task of area exploration and mapping. Also included are descriptions of experiment infrastructure components that afford convenient solutions to research challenges. These consist of a surrogate localization (position and orientation) function utilizing a novel MATLAB executable (MEX) function and a user datagram protocol (UDP)-based communications protocol that facilitates communication among network-based control computers.
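A minimal sketch of a tunable potential-force function of the kind described, assuming the common attractive/repulsive form with gain coefficients k_att and k_rep and influence distance d0 (the paper's exact function and coefficient values are not given, so these names and defaults are illustrative):

```python
import math

def potential_force(pos, goal, obstacles, k_att=1.0, k_rep=0.5, d0=2.0):
    # Attractive force pulls the robot linearly toward the goal.
    fx = k_att * (goal[0] - pos[0])
    fy = k_att * (goal[1] - pos[1])
    # Repulsive forces push away from obstacles inside the influence radius d0.
    for ox, oy in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy
        d = math.hypot(dx, dy)
        if 0 < d < d0:
            mag = k_rep * (1.0 / d - 1.0 / d0) / (d * d)
            fx += mag * dx
            fy += mag * dy
    return fx, fy
```

Tuning k_att, k_rep and d0, as the abstract describes, trades off goal-seeking aggressiveness against obstacle avoidance and inter-robot spacing.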

A positron annihilation lifetime spectrometer employing a multi-parameter acquisition system has been prepared for various purposes such as the investigation and characterization of solid-state materials. The fast-fast coincidence technique was used in the present spectrometer with a pair of plastic scintillation detectors. The acquisition system is based on the Kmax software and on CAMAC modules. The data are acquired in event-by-event list mode. The time spectrum for the desired energy windows can be obtained by off-line data sorting and analysis. The spectrometer for event-by-event data acquisition is an important step to construct a positron age-momentum correlation (AMOC) spectrometer. The AMOC technique is especially suited for the observation of positron transitions between different states during their lifetime. The system performance was tested and the results were presented and discussed
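Off-line sorting of event-by-event list-mode data into a time spectrum for a chosen energy window, as described above, can be sketched as follows; the (time, energy) event format, window bounds and bin counts are illustrative assumptions.

```python
def sort_time_spectrum(events, e_lo, e_hi, n_bins=10, t_max=100.0):
    # Build a lifetime histogram from list-mode (time, energy) events,
    # keeping only events whose energy falls inside the selected window.
    spectrum = [0] * n_bins
    width = t_max / n_bins
    for t, e in events:
        if e_lo <= e <= e_hi and 0.0 <= t < t_max:
            spectrum[int(t / width)] += 1
    return spectrum
```

Because the raw events are kept, the same list can be re-sorted later with different energy windows, which is exactly what makes list-mode acquisition a stepping stone toward age-momentum correlation (AMOC) analysis.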

A multi-stage automated target recognition (ATR) system has been designed to perform computer vision tasks with adequate proficiency in mimicking human vision. The system is able to detect, identify, and track targets of interest. Potential regions of interest (ROIs) are first identified by the detection stage using an Optimum Trade-off Maximum Average Correlation Height (OT-MACH) filter combined with a wavelet transform. False positives are then eliminated by the verification stage using feature extraction methods in conjunction with neural networks. Feature extraction transforms the ROIs using filtering and binning algorithms to create feature vectors. A feedforward back-propagation neural network (NN) is then trained to classify each feature vector and to remove false positives. A system parameter optimization process has been developed to adapt to various targets and datasets. The objective was to design an efficient computer vision system that can learn to detect multiple targets in large images with unknown backgrounds. Because the target size is small relative to the image size in this problem, there are many regions of the image that could potentially contain the target. A cursory analysis of every region can be computationally efficient, but may yield too many false positives. On the other hand, a detailed analysis of every region can yield better results, but may be computationally inefficient. The multi-stage ATR system was designed to achieve an optimal balance between accuracy and computational efficiency by incorporating both models. The detection stage first identifies potential ROIs where the target may be present by performing a fast Fourier domain OT-MACH filter-based correlation. Because the threshold for this stage is chosen with the goal of detecting all true positives, a number of false positives are also detected as ROIs. The verification stage then transforms the regions of interest into feature space, and eliminates false positives using an
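The detection stage's correlate-and-threshold logic can be illustrated in one dimension; this is a plain sliding dot product with a deliberately permissive threshold, not the actual Fourier-domain OT-MACH implementation, and the signal/template values are invented.

```python
def detect_rois(signal, template, threshold):
    # Slide the template over the signal; every offset whose correlation
    # score clears the (low, recall-oriented) threshold becomes an ROI
    # to be passed on to the verification stage.
    m = len(template)
    rois = []
    for i in range(len(signal) - m + 1):
        score = sum(signal[i + j] * template[j] for j in range(m))
        if score >= threshold:
            rois.append(i)
    return rois
```

In the full system the threshold is tuned low enough to catch all true positives, and the costlier feature-extraction-plus-NN verification stage then prunes the surviving false positives.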

During the last decades, high-throughput techniques have allowed for the extraction of a huge amount of data from biological systems, unveiling more of their underlying complexity. Biological systems encompass a wide range of space and time scales, functioning according to flexible hierarchies of mechanisms that make an intertwined and dynamic interplay of regulations. This becomes particularly evident in processes such as ontogenesis, where regulative assets change according to process context and timing, making structural phenotype and architectural complexities emerge from a single cell through local interactions. The information collected from biological systems is naturally organized according to the functional levels composing the system itself. In systems biology, biological information often comes from overlapping but different scientific domains, each having its own way of representing the phenomena under study. That is, the different parts of the system to be modelled may be described with different formalisms. For a model to have improved accuracy and serve as a good knowledge base, it should comprise different system levels, suitably handling the relative formalisms. Models which are both multi-level and hybrid satisfy both requirements, making them a very useful tool in computational systems biology. This paper reviews some of the main contributions in this field.

Modern high-tech equipment requires precise temperature control and effective cooling below the ambient temperature. Greater cooling efficiencies will allow equipment to be operated for longer periods without overheating, providing a greater return on investment and increased availability of the equipment. This paper presents an application of the Lz-transform method to importance assessment of an aging multi-state water-cooling system used in one of Israel's hospitals. The water cooling system consists of 3 principal sub-systems: chillers, heat exchanger and pumps. The performance of the system and the sub-systems is measured by their produced cooling capacity. The heat exchanger is an aging component. A straightforward Markov method applied to this problem would require building a system model with a very large number of states and solving a corresponding system of multiple differential equations. The Lz-transform method, used here for calculation of the importance of the system elements, drastically simplifies the solution. A numerical example is presented to illustrate the described approach.
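The Lz-transform, like the universal generating function it extends, represents each element as a set of (capacity, probability) pairs and combines them through the structure function: capacities add for parallel elements and take the minimum in series. A steady-state sketch with made-up chiller and heat-exchanger figures (the aging, time-dependent part of the method is omitted):

```python
from collections import defaultdict

def combine(u1, u2, op):
    # Combine two capacity distributions {capacity: probability} with a
    # structure function op: sum for parallel elements, min for series.
    out = defaultdict(float)
    for g1, p1 in u1.items():
        for g2, p2 in u2.items():
            out[op(g1, g2)] += p1 * p2
    return dict(out)

# Two parallel chillers feeding one heat exchanger in series
# (capacities in arbitrary cooling units; probabilities invented).
chiller = {100: 0.9, 0: 0.1}                       # each chiller: 90% up
pair = combine(chiller, chiller, lambda a, b: a + b)
system = combine(pair, {150: 0.95, 0: 0.05}, min)  # series heat exchanger

# Probability the system delivers at least the demanded capacity of 100.
availability = sum(p for g, p in system.items() if g >= 100)
```

Element importance is then assessed by perturbing one element's distribution and observing the change in this availability figure, which is the computation the Lz-transform keeps tractable for aging components.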

AFECS is a pure Java based software framework for designing and implementing distributed control systems. AFECS creates a control system environment as a collection of software agents behaving as finite state machines. These agents can represent real entities, such as hardware devices, software tasks, or control subsystems. A special control oriented ontology language (COOL), based on RDFS (Resource Definition Framework Schema) is provided for control system description as well as for agent communication. AFECS agents can be distributed over a variety of platforms. Agents communicate with their associated physical components using a range of communication protocols, including tcl-DP, cMsg (publish-subscribe communication system developed at Jefferson Lab), SNMP (simple network management protocol), EPICS channel access protocol and JDBC.
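The agents-as-finite-state-machines idea can be sketched in a few lines; AFECS itself is Java and COOL-driven, so the states, events and Python rendering below are purely illustrative.

```python
class Agent:
    # A minimal finite-state-machine agent: the transition table is data,
    # and a supervisor drives agents by firing named events at them.
    TRANSITIONS = {
        ("booted", "configure"): "configured",
        ("configured", "start"): "active",
        ("active", "stop"):      "configured",
    }

    def __init__(self, name):
        self.name = name
        self.state = "booted"

    def fire(self, event):
        # Unknown (state, event) pairs are ignored, leaving the state unchanged.
        key = (self.state, event)
        if key in self.TRANSITIONS:
            self.state = self.TRANSITIONS[key]
        return self.state
```

In a framework like AFECS the table itself would come from an ontology description rather than being hard-coded, and the agent would forward each transition to its physical component over whichever protocol that component speaks.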

Highlights: • We present H1DS, a new RESTful web service for accessing fusion data. • We examine the scalability and extensibility of H1DS. • We present a fast and user friendly web browser client for the H1DS web service. • A summary relational database is presented as an application of the H1DS API. - Abstract: A new data access system, H1DS, has been developed and deployed for the H-1 Heliac at the Australian Plasma Fusion Research Facility. The data system provides access to fusion data via a RESTful web service. With the URL acting as the API to the data system, H1DS provides a scalable and extensible framework which is intuitive to new users, and allows access from any internet connected device. The H1DS framework, originally designed to work with MDSplus, has a modular design which can be extended to provide access to alternative data storage systems.
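The "URL as API" idea, where each path segment selects one level of the data hierarchy, can be sketched as a tree walk; the shot and diagnostic names below are invented, not actual H1DS endpoints.

```python
def resolve(tree, url_path):
    # Treat the URL path as the API: each path segment walks one level
    # down the data tree, RESTful-style, until the requested node is reached.
    node = tree
    for segment in url_path.strip("/").split("/"):
        if segment:
            node = node[segment]
    return node

# Hypothetical data hierarchy: shot -> diagnostic -> channel -> samples.
data = {"shot_58123": {"mirnov": {"coil_1": [0.1, 0.2, 0.3]}}}
```

Because the mapping is just path-to-node, the same scheme extends naturally to back-ends other than MDSplus: a new storage plugin only has to expose its data as a navigable tree.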

Several satellite uplink and downlink accessing schemes for customer premises service are compared. Four conceptual system designs are presented: satellite-routed frequency division multiple access (FDMA), satellite-switched time division multiple access (TDMA), processor-routed TDMA, and frequency-routed TDMA, operating in the 30/20 GHz band. The designs are compared on the basis of estimated satellite weight, system capacity, power consumption, and cost. The systems are analyzed for fixed multibeam coverage of the continental United States. Analysis shows that the system capacity is limited by the available satellite resources and by the terminal size and cost.

Full Text Available Today's embedded systems have evolved into multipurpose devices moving towards an embedded multi-agent system (MAS) infrastructure. With the involvement of MAS in embedded systems, one remaining issue is establishing communication between agents in low computational power and low memory embedded systems without a present Embedded Operating System (EOS). One solution is an extension of the outdated Trivial File Transfer Protocol (TFTP). The main advantage of using TFTP in embedded systems is its easy implementation. However, the problem at hand is the overall lack of security mechanisms in TFTP. This paper proposes an extension to the existing TFTP in the form of added security mechanisms: STFTP. Authentication is proposed using the Digest Access Authentication process, whereas the data encryption can be performed by various cryptographic algorithms. The proposal is experimentally tested using two embedded systems based on a micro-controller architecture. Communication is analyzed for authentication, data rate and transfer time versus various data encryption ciphers and file sizes. STFTP results in an expected drop in performance, which is in the range of similar encryption algorithms. The system could be improved by using embedded systems of higher computational power or by the use of hardware encryption modules.
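The Digest Access Authentication step can be sketched from RFC 2617: the client proves knowledge of the password by hashing it together with a server-supplied nonce instead of sending it in the clear. MD5 appears because the RFC specifies it, not because it is recommended today; the realm, URI and nonce values in the test are illustrative.

```python
import hashlib

def md5_hex(s):
    return hashlib.md5(s.encode()).hexdigest()

def digest_response(user, realm, password, method, uri, nonce):
    # RFC 2617 (no qop): response = MD5(HA1 : nonce : HA2), where
    # HA1 hashes the credentials and HA2 hashes the request line.
    ha1 = md5_hex(f"{user}:{realm}:{password}")
    ha2 = md5_hex(f"{method}:{uri}")
    return md5_hex(f"{ha1}:{nonce}:{ha2}")
```

The server performs the same computation with its stored credentials and compares results, so the password itself never crosses the (TFTP-like) link.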

A multi-channel data acquisition and processing system for Mössbauer spectroscopy is described, which consists of an intelligent interface and a BC3-80 microcomputer. The system has eight data channels; each channel contains a counting circuit and a memory. A Z80 CPU is used as the main unit for control and access. The microcomputer is used for real-time spectrum display, saving the data to disk, printing and data processing. The system is applicable to a high counting rate multi-wire proportional chamber and can greatly increase the counting rate for measuring Mössbauer spectra. The signals of each wire in the chamber go through a corresponding amplifier and differential discriminator and are recorded by a corresponding data channel; the data of the channels are added by the microcomputer. In addition, two channels can be used to measure an absorption and a scattering spectrum at the same time, so the internal and surface information of the sample are obtained simultaneously

The strategic and contemporary importance of e-governance has been recognized across the world. In India too, various ministries of the Govt. of India and State Governments have taken e-governance initiatives to provide e-services to the citizens and the businesses they serve. To achieve the mission objectives and make such e-governance initiatives successful, it is necessary to improve the trust and confidence of the stakeholders. It is assumed that the delivery of government services will share the same public network information that is being used in the community at large. In particular, the Internet will be the principal means by which public access to government and government services will be achieved. To provide the security measures, the main aim is to identify the access requirements of the stakeholders and then analyze them according to the models of Nath's approach. Based on this analysis, the Govt. can also set security standards based on the e-governance models. Thus there will be fewer human errors and less bias. This analysis leads to the security architecture of the specific G2C application.

Various non-invasive imaging systems produce an increasing amount of diagnostic images day by day in digital format. The direct consequence of this tendency places electronic archives and image transfers in the spotlight. Moreover, digital image archives may support other activities such as simultaneous displaying of multi-modality images, telediagnostics, on-line consultation, and construction of standard databases for dedicated organs at regional and/or country-wide level (e.g. myocardial scintigraphy, mammography, etc.) in order to obtain a much more exact diagnosis as well as to support education and training. Our institute started similar research and development activities a few years ago, resulting in the construction of our PACS systems - MEDISA (LINUX Debian) and eRAD ImageMedical™ (LINUX Red Hat) - together with the telecommunication part. The mass storage unit of the PACS is based on hard drives connected in RAID with 1.2 TByte capacity. The on-line telecommunication system consists of an ISDN Multi-Media System (MMS) and Internet-based independent units. The MMS was dedicated mainly to on-line teleconferencing and consultation using the simultaneously transferred morphological and functional images obtained from the central archives in DICOM or any other allowable image format. The MMS has been created as a part of, and to the requirements of, an EU research project - RETRANSPLANT. The central archives (PACS) can be accessed via the DICOM 3.0 protocol over the Internet through well maintained and secure access rights. Displaying and post-processing of retrieved images on individual workstations are supported by the eRAD ImageMedical™ PracticeBuilder1-2-3 (Windows-based) image manager with its unique supports and services. The 'real engine' of PracticeBuilder is Internet Explorer Ver. 5.0 or newer. A unique feature of PracticeBuilder1-2-3 is the extremely fast patient and image access from the archives, even from a very 'far distance' (across continents), due to the exceptional image communication

Parrots and corvids show outstanding innovative and flexible behaviour. In particular, kea and New Caledonian crows are often singled out as being exceptionally sophisticated in physical cognition, so that comparing them in this respect is particularly interesting. However, comparing cognitive mechanisms among species requires consideration of non-cognitive behavioural propensities and morphological characteristics evolved from different ancestry and adapted to fit different ecological niches. We used a novel experimental approach based on a Multi-Access-Box (MAB). Food could be extracted by four different techniques, two of them involving tools. Initially all four options were available to the subjects. Once they reached criterion for mastering one option, this task was blocked, until the subjects became proficient in another solution. The exploratory behaviour differed considerably. Only one (of six) kea and one (of five) NCC mastered all four options, including a first report of innovative stick tool use in kea. The crows were more efficient in using the stick tool, the kea the ball tool. The kea were haptically more explorative than the NCC, discovered two or three solutions within the first ten trials (against a mean of 0.75 discoveries by the crows) and switched more quickly to new solutions when the previous one was blocked. Differences in exploration technique, neophobia and object manipulation are likely to explain differential performance across the set of tasks. Our study further underlines the need to use a diversity of tasks when comparing cognitive traits between members of different species. Extension of a similar method to other taxa could help developing a comparative cognition research program.

The design of grid-connected photovoltaic wind generator system supplying a farmstead in Nebraska has been undertaken in this dissertation. The design process took into account competing criteria that motivate the use of different sources of energy for electric generation. The criteria considered were 'Financial', 'Environmental', and 'User/System compatibility'. A distance based multi-objective decision making methodology was developed to rank design alternatives. The method is based upon a precedence order imposed upon the design objectives and a distance metric describing the performance of each alternative. This methodology advances previous work by combining ambiguous information about the alternatives with a decision-maker imposed precedence order in the objectives. Design alternatives, defined by the photovoltaic array and wind generator installed capacities, were analyzed using the multi-objective decision making approach. The performance of the design alternatives was determined by simulating the system using hourly data for an electric load for a farmstead and hourly averages of solar irradiation, temperature and wind speed from eight wind-solar energy monitoring sites in Nebraska. The spatial variability of the solar energy resource within the region was assessed by determining semivariogram models to krige hourly and daily solar radiation data. No significant difference was found in the predicted performance of the system when using kriged solar radiation data, with the models generated vs. using actual data. The spatial variability of the combined wind and solar energy resources was included in the design analysis by using fuzzy numbers and arithmetic. The best alternative was dependent upon the precedence order assumed for the main criteria. Alternatives with no PV array or wind generator dominated when the 'Financial' criteria preceded the others. In contrast, alternatives with a nil component of PV array but a high wind generator component
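The precedence-ordered, distance-based ranking described above can be sketched as a lexicographic comparison of distances to an ideal point; this simplification ignores the fuzzy-number arithmetic the dissertation uses, and the alternatives, criteria and scores are invented.

```python
def rank_alternatives(alternatives, ideal, order):
    # Rank alternatives by distance to the ideal point, comparing criteria
    # in the decision-maker's precedence order: the first criterion decides,
    # and later criteria only break ties.
    def distance_key(name):
        scores = alternatives[name]
        return tuple(abs(scores[c] - ideal[c]) for c in order)
    return sorted(alternatives, key=distance_key)

# Hypothetical design alternatives scored on two criteria (lower = better
# distance to the ideal of 0 on each axis).
alts = {
    "no_renewables": {"financial": 1.0, "environmental": 9.0},
    "pv_plus_wind":  {"financial": 6.0, "environmental": 2.0},
    "wind_only":     {"financial": 4.0, "environmental": 4.0},
}
ideal = {"financial": 0.0, "environmental": 0.0}
```

This mirrors the abstract's finding: which alternative "wins" flips with the precedence order, e.g. a financial-first ordering favours the cheapest design while an environmental-first ordering favours the renewable-heavy one.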

Researchers at the Idaho National Engineering and Environmental Laboratory and Montana State University have undertaken development of MINERVA, a patient-centric, multi-modal, radiation treatment planning system. This system can be used for planning and analyzing several radiotherapy modalities, either singly or combined, using common modality independent image and geometry construction and dose reporting and guiding. It employs an integrated, lightweight plugin architecture to accommodate multi-modal treatment planning using standard interface components. The MINERVA design also facilitates the future integration of improved planning technologies. The code is being developed with the Java Virtual Machine for interoperability. A full computation path has been established for molecular targeted radiotherapy treatment planning, with the associated transport plugin developed by researchers at the Lawrence Livermore National Laboratory. Development of the neutron transport plugin module is proceeding rapidly, with completion expected later this year. Future development efforts will include development of deformable registration methods, improved segmentation methods for patient model definition, and three-dimensional visualization of the patient images, geometry, and dose data. Transport and source plugins will be created for additional treatment modalities, including brachytherapy, external beam proton radiotherapy, and the EGSnrc/BEAMnrc codes for external beam photon and electron radiotherapy.

Full Text Available Development of software applications that provide a comfortable working environment for embedded software development has always been a difficult task. To reach this goal it was necessary to integrate all the specific tools designed for that purpose. This paper describes an Integrated Development Environment (IDE) that was developed to meet all the specific needs of software development for the family of multi-core target platforms designed for digital signal processing at Cirrus Logic. The Eclipse platform and RCP (Rich Client Platform) were used as a basis, because they provide an extensible plug-in system for customizing the development environment. CLIDE (Cirrus Logic Integrated Development Environment) represents the epilogue of that effort: a reliable IDE used for the development of embedded applications. Validation of the solution is accomplished through 2,641 JUnit tests that cover most of CLIDE's functionality. The developed IDE (CLIDE) significantly increases the quality of software development for multi-core systems and reduces time-to-market, thereby justifying its development costs.

Advancing the field of physical activity (PA) monitoring requires the development of innovative multi-sensor measurement systems that are feasible in the free-living environment. The use of novel analytical techniques to combine and process these multiple sensor signals is equally important. This paper describes a novel multi-sensor 'integrated PA measurement system' (IMS), the lab-based methodology used to calibrate the IMS, techniques used to predict multiple variables from the sensor signals, and proposes design changes to improve the feasibility of deploying the IMS in the free-living environment. The IMS consists of hip and wrist acceleration sensors, two piezoelectric respiration sensors on the torso, and an ultraviolet radiation sensor to obtain contextual information (indoors versus outdoors) of PA. During lab-based calibration of the IMS, data were collected on participants performing a PA routine consisting of seven different ambulatory and free-living activities while wearing a portable metabolic unit (criterion measure) and the IMS. Data analyses on the first 50 adult participants are presented. These analyses were used to determine if the IMS can be used to predict the variables of interest. Finally, physical modifications for the IMS that could enhance the feasibility of free-living use are proposed and refinement of the prediction techniques is discussed
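Calibrating a sensor signal against the criterion measure reduces, in its simplest form, to least-squares regression; this univariate sketch stands in for whatever multi-signal prediction technique the IMS actually uses, and the variable names are illustrative.

```python
def fit_line(x, y):
    # Ordinary least squares: calibrate one sensor signal x against the
    # criterion measure y (e.g. energy expenditure from the metabolic unit).
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    intercept = my - slope * mx
    return intercept, slope
```

In the real IMS, several such calibrated signals (hip/wrist acceleration, respiration, UV context) would be combined, but each individual calibration follows this lab-based signal-versus-criterion pattern.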

In this paper, we comprehensively analyze the impact of four wave mixing (FWM) on the performance of incoherent multi-wavelength optical code-division multiple-access (MW-OCDMA) systems. We also consider many other interferences and noises, including multiple access interference, optical beating interference, and receiver noise, in the analysis. From the numerical results, we can find the power ranges of different MW-OCDMA systems, in which the impact of FWM is dominant and consequently results in an increase in the bit-error rate of the systems. We also find that the impact of FWM becomes more severe when the frequency spacing is small and/or dispersion-shifted fiber is used. In addition, we quantitatively discuss the impact of FWM on the number of supportable users and power penalty in the MW-OCDMA systems. (c) 2010 Optical Society of America.

Geophysical Oracle Database Management System (GPODMS) residing on a UNIX Tru64 Compaq Alpha server. GPODMS is a stable Oracle database system for long-term storage and systematic management of geophysical data and information of various disciplines...

Full Text Available Nowadays, with cybersecurity incidents on the rise and a very complex IT&C environment, national legal systems must adapt in order to properly address new and modern forms of criminality in cyberspace. Illegal access to a computer system remains one of the most important cyber-related crimes, due to its popularity but also because it is a door opened to computer data and sometimes a vehicle for other tech crimes. At the same time, information society services have slightly changed the IT paradigm and represent the new interface between users and systems. It is true that services rely on computer systems, but accessing services now goes beyond the simple accessing of computer systems as commonly understood by most legislations. The article intends to explain other sides of access related to computer systems and services, with the purpose of advancing possible legal solutions to certain case scenarios.

Full Text Available Cellular manufacturing system design involves problems such as the design framework, manufacturing cell layout and layout evaluation. The research objective is to develop a framework for designing manufacturing cells that considers the organization and management aspects of the shopfloor. In this research, the existing layout is compared with a proposed layout obtained with a multi-criteria approach. The proposed method combines the Analytical Hierarchy Process (AHP), clustering and a heuristic approach. The results show that the grouping produced by Single Linkage Clustering (SLC) is selected for the manufacturing cells. The clustering weights are 0.567, 0.245 and 0.188 for SLC, Complete Linkage Clustering (CLC) and Average Linkage Clustering (ALC), respectively, so the layout is generated from the SLC grouping. The evaluation shows that the manufacturing-cell layout performs better than the process layout used in the existing system.
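
For readers unfamiliar with the clustering step, a minimal sketch of Single Linkage Clustering (SLC) follows; the machine names and distance values are illustrative, not from the study.

```python
# Minimal single-linkage agglomerative clustering: repeatedly merge the two
# clusters whose closest members are nearest, until n_clusters remain.

def single_linkage(points, dist, n_clusters):
    """Agglomerate items using the single-linkage (minimum pairwise) distance."""
    clusters = [{p} for p in points]
    while len(clusters) > n_clusters:
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                d = min(dist[x][y] for x in clusters[a] for y in clusters[b])
                if best is None or d < best[0]:
                    best = (d, a, b)
        _, a, b = best
        clusters[a] |= clusters[b]  # merge cluster b into cluster a
        del clusters[b]
    return clusters

# Toy distance matrix for four machines: A,B are similar; C,D are similar.
dist = {
    "A": {"A": 0, "B": 1, "C": 9, "D": 8},
    "B": {"A": 1, "B": 0, "C": 9, "D": 9},
    "C": {"A": 9, "B": 9, "C": 0, "D": 2},
    "D": {"A": 8, "B": 9, "C": 2, "D": 0},
}
cells = single_linkage(list("ABCD"), dist, 2)
```

In a cell-formation setting the "distance" would typically be derived from a part-machine incidence matrix (e.g. one minus a similarity coefficient).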

Power quality requirements differ between consumers and electric equipment. Conventionally, a common power quality standard is applied to the whole power grid, provoking debate between several sides, including consumers, generation sites and technical commissions. Customized power quality standard settings for different consumers have become a widely accepted solution. The main challenge is the proper regulation of power quality in different areas. This paper considers a multi-bus microgrid system in which each bus has a flexible, individual power quality standard. Distributed generators are utilized to provide power quality regulation functions. An optimization-based strategy is proposed and implemented with the power converter control system. A general mathematical model is established which can be used in the optimization problem for evaluating the objective function...

A multi-channel bolometer system has been designed and installed to observe the radiation profile on the JFT-2M tokamak. The sensor head is made of a Thinistor, a kind of semiconductor, because it has about one order of magnitude higher sensitivity than the conventional metal-foil bolometer and is suitable for profile measurements in which the signal from the plasma is relatively small. The response and cooling characteristics of the bolometer sensor are suitable for the conditions of the JFT-2M tokamak plasma. A low-noise bridge and differentiator circuit has been developed to optimize the signal-to-noise ratio under JFT-2M operating conditions. With the bolometer system, the radiation profile in joule-heated plasma as well as additionally heated plasma, especially H-mode plasma, is successfully observed. (author)

Fifth-generation (5G) wireless access networks promise to support higher access data rates with more than 1,000 times the capacity of current long-term evolution (LTE) systems. New radio access technologies (RATs) based on higher carrier frequencies up to the millimeter-wave (MMW) band, radio-over-fiber (RoF), and carrier aggregation (CA) using multi-band resources are intensively studied to support high-data-rate access and effective use of frequency resources in heterogeneous mobile networks (Het-Nets). In this paper, we investigate several enabling technologies for MMW RoF systems in 5G Het-Nets. Efficient mobile fronthaul (MFH) solutions for 5G centralized radio access networks (C-RAN) and beyond are proposed, analyzed and experimentally demonstrated based on the analog scheme. Digital predistortion based on memory polynomials for analog MFH linearization is presented, with improved EVM performance and receiver sensitivity. We also propose and experimentally demonstrate a novel inter-/intra-RAT CA scheme for 5G Het-Nets. A real-time standard 4G-LTE signal is carrier-aggregated with three broadband 60 GHz MMW signals based on the proposed optical-domain band-mapping method. RATs based on new waveforms have also been studied here to achieve higher spectral efficiency (SE) in asynchronous environments. A full-duplex asynchronous quasi-gapless carrier aggregation scheme for MMW RoF inter-/intra-RAT based on FBMC is also presented with 4G-LTE signals. Compared with OFDM-based signals with large guard-bands, FBMC achieves higher spectral efficiency with better EVM performance at lower received power and smaller guard-bands.
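
The EVM figure used to compare FBMC and OFDM reception quality is a standard RMS error-vector metric, sketched below on synthetic QPSK symbols (the constellation values are illustrative, not measured data from the paper).

```python
# RMS error vector magnitude (EVM) in percent: the RMS of the error vectors
# (received minus reference symbols) normalized by the RMS reference power.
import math

def evm_percent(ref, rx):
    """Return RMS EVM in percent for matched reference/received symbol lists."""
    err = sum(abs(r - t) ** 2 for t, r in zip(ref, rx))
    pwr = sum(abs(t) ** 2 for t in ref)
    return 100.0 * math.sqrt(err / pwr)

# Ideal QPSK constellation points and a slightly distorted received copy.
tx = [1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j]
rx = [1.1 + 1j, -1 + 0.9j, -0.9 - 1j, 1 - 1.1j]
evm = evm_percent(tx, rx)
```

Each received symbol here is off by 0.1 from a reference of magnitude sqrt(2), giving an EVM of about 7.1%.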

quality of the received education of this category of people. The benefits of using the developed ontological model of a smart-system of distance learning for visually impaired people based on multifunctional agents are: a complex approach based on the use of various intellectual, cognitive and statistical methods; the possibility of developing an individual learning trajectory for visually impaired people that takes into account the psychophysiological features of information perception; and distance access to the latest technological equipment for performing laboratory and practical work by visually impaired people in shared laboratories in real time. The ontological model makes it possible to analyze more deeply the numerous connections between agents and to consider them when developing software for the smart-system of distance learning for visually impaired people. The multi-agent approach provides system multi-functionality, robustness to system errors, and optimization of computing resources.

Since April 2004 the EarthScope USArray seismic network has grown to over 400 broadband stations that stream multi-channel data in near real-time to the Array Network Facility in San Diego. Providing secure, yet open, access to real-time and archived data for a broad range of audiences is best served by a series of platform-agnostic low-latency web-based applications. We present a framework of tools that interface between the world wide web and Boulder Real Time Technologies' Antelope Environmental Monitoring System data acquisition and archival software. These tools provide audiences ranging from network operators and geoscience researchers, to funding agencies and the general public, with comprehensive information about the experiment. This ranges from network-wide to station-specific metadata, state-of-health metrics, event detection rates, archival data and dynamic report generation over a station's two-year life span. Leveraging open source website development frameworks for both the server side (Perl, Python and PHP) and client side (Flickr, Google Maps/Earth and jQuery) facilitates the development of a robust extensible architecture that can be tailored on a per-user basis, with rapid prototyping and development that adheres to web standards.

Multi-tier aviary systems are becoming more common in organic egg production. The area on the tiers can be included in the net area available to the hens (also referred to as usable area) when calculating maximum indoor stocking densities in organic systems within the EU. In this article, results on egg production, laying behaviour and use of veranda and outdoor area are reported for organic laying hens housed in a multi-tier system with permanent access to a veranda and kept at stocking densities (D) of 6, 9 and 12 hens/m2 available floor area, with concomitant increases in the number of hens per trough, drinker, perch and nest space. In a fourth treatment, access to the top tier was blocked, reducing vertical, trough and perch access at the lowest stocking density (treatment D6x). In all other aspects than stocking density, the experiment followed the EU regulations on the keeping...

Developments in computer software and hardware technology and information processing, as well as accumulated design experience and feedback from Nuclear Power Plant (NPP) operation, have created a good opportunity to develop an integrated operator support system. The Real-time Multi-task Operator Support System (RMOSS) has been built to support the operator's decision-making process during normal and abnormal operations. RMOSS consists of five subtasks: the Data Collection and Validation Task (DCVT), Operation Monitoring Task (OMT), Fault Diagnostic Task (FDT), Operation Guideline Task (OGT) and Human Machine Interface Task (HMIT). RMOSS uses a rule-based expert system and an Artificial Neural Network (ANN). The rule-based expert system is used to identify predefined events in static conditions and to track the operation guideline through data processing. For dynamic conditions, a Back-Propagation Neural Network trained with a genetic algorithm is adopted for fault diagnosis. The embedded real-time operating system VxWorks and its integrated environment Tornado II are used as the RMOSS software cross-development environment. VxGUI is used to design the HMI. All task programs are written in C. Task tests and function evaluation of RMOSS have been carried out on a real-time full-scope simulator. Evaluation results show that each task of RMOSS is capable of accomplishing its functions. (authors)

The nitrogen compound concentration in water is increased by atmospheric-pressure plasma discharge treatment. A rod-to-water electrode discharge treatment system using plasma discharge has been developed by our group to obtain water with a high concentration of nitrogen compounds, and this plasma-treated water improves the growth of chrysanthemum roots. However, it is difficult to apply the system to agriculture because the amount of treated water it produces is too small. In this study, a multi-spark discharge system (MSDS) equipped with multiple spark plugs is presented to obtain a large amount of plasma-treated water. The MSDS consists of inexpensive parts in order to reduce the cost of introducing the system into agriculture. To suppress the temperature increase of the spark plugs, the 9 spark plugs were divided into 3 groups, which were discharged in order. Plasma-treated water with a NO3- concentration of 50 mg/L was prepared using the MSDS for 90 min, and the treatment efficiency was about 6 times higher than that of our previous system. It was confirmed that the NO2-, O3, and H2O2 concentrations in the water were also increased by treating the water with the MSDS.
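
The group-wise firing order described above (9 plugs in 3 groups, discharged in rotation so heating is spread out) can be sketched as follows; the group sizes come from the text, while the scheduling code itself is hypothetical.

```python
# Split plugs into contiguous groups and fire the groups round-robin:
# in any two consecutive slots, no plug fires twice, limiting plug heating.

def make_groups(n_plugs, n_groups):
    """Partition plug indices 0..n_plugs-1 into n_groups contiguous groups."""
    size = n_plugs // n_groups
    return [list(range(g * size, (g + 1) * size)) for g in range(n_groups)]

def firing_schedule(groups, n_slots):
    """Slot t fires every plug in group t mod len(groups) (round-robin)."""
    return [groups[t % len(groups)] for t in range(n_slots)]

groups = make_groups(9, 3)
sched = firing_schedule(groups, 6)
```

Each group therefore rests for two slots between discharges, which is the stated mechanism for suppressing the temperature rise of the plugs.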

Highlights: • Efficient multi-objective optimization algorithm F-YYPO demonstrated. • Three Stirling engine applications with a total of eight cases. • Improvements in the objective function values of up to 30%. • Superior to the popularly used gamultiobj of MATLAB. • F-YYPO has extremely low time complexity. - Abstract: In this work, we demonstrate the performance of Front-based Yin-Yang-Pair Optimization (F-YYPO) to solve multi-objective problems related to Stirling engine systems. The performance of F-YYPO is compared with that of (i) a recently proposed multi-objective optimization algorithm (Multi-Objective Grey Wolf Optimizer) and (ii) an algorithm popularly employed in literature due to its easy accessibility (MATLAB’s inbuilt multi-objective Genetic Algorithm function: gamultiobj). We consider three Stirling engine based optimization problems: (i) the solar-dish Stirling engine system which considers objectives of output power, thermal efficiency and rate of entropy generation; (ii) Stirling engine thermal model which considers the associated irreversibility of the cycle with objectives of output power, thermal efficiency and pressure drop; and finally (iii) an experimentally validated polytropic finite speed thermodynamics based Stirling engine model also with objectives of output power and pressure drop. We observe F-YYPO to be significantly more effective as compared to its competitors in solving the problems, while requiring only a fraction of the computational time required by the other algorithms.
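
All of the multi-objective algorithms compared above (F-YYPO, MOGWO, gamultiobj) share the same underlying notion of Pareto dominance; a minimal sketch of that notion follows. All objectives are treated as minimized here (a maximized objective such as output power would be negated first); the sample points are purely illustrative.

```python
# Pareto dominance and non-dominated filtering for minimization problems.

def dominates(a, b):
    """True if a is no worse than b in every objective and strictly better
    in at least one (minimization convention)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Keep only the solutions not dominated by any other solution."""
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t is not s)]

# Toy 2-objective points, e.g. (rate of entropy generation, negated power).
pts = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0)]
front = pareto_front(pts)
```

The point (3.0, 4.0) is dominated by (2.0, 3.0) and is dropped; the remaining three points form the Pareto front an optimizer like F-YYPO would report.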

The Occupational Information Access System (OIAS) improves the accessibility of occupational labor market information for career planning. Its operation at Churchill High School is evaluated from several angles: the likes and dislikes of users; the effect of OIAS on users' knowledge of occupational information and on their career plans; why other…

Expanded Internet access to the Ohio Career Information System (OCIS) would provide adults in Ohio who need to or wish to make career changes with the best available information about occupations, education and training programs, and financial aid. In order to determine the feasibility of improving access without cost to users, an advisory group,…

A complete catalog is presented for the random access files used by the ATLAS integrated structural analysis and design system. ATLAS consists of several technical computation modules which output data matrices to corresponding random access files. A description of the matrices written on these files is contained herein.

Full Text Available The amount of video content on the Internet has increased greatly in recent years. In spite of the efforts of different organizations and governments to increase the accessibility of websites, most multimedia content on the Internet is not accessible. This paper describes a system that contributes to making multimedia content more accessible on the Web by automatically translating subtitles in oral language to SignWriting, a way of writing Sign Language. This system extends the functionality of a general web platform that can provide accessible web content for different needs. This platform has a core component that automatically converts any web page to a web page compliant with level AA of the WAI guidelines. Around this core component, different adapters complete the conversion according to the needs of specific users. One adapter is the Deaf People Accessibility Adapter, which provides accessible web content for the Deaf, based on SignWriting. The functionality of this adapter has been extended with the video subtitle translator system. A first prototype of this system has been tested through different methods, including usability and accessibility tests, and results show that this tool can enhance the accessibility of video content available on the Web for Deaf people.

Achieving 'universal access' to antiretroviral HIV treatment (ART) in lower income and transitional settings is a global target. Yet, access to ART is shaped by local social conditions and is by no means universal. Qualitative studies are ideally suited to describing how access to ART is socially situated. We explored systemic barriers to accessing ART among people who inject drugs (PWID) in a Russian city (Ekaterinburg) with a large burden of HIV treatment demand. We undertook 42 in-depth qualitative interviews with people living with HIV with current or recent experience of injecting drug use. Accounts were analysed thematically, and supplemented here with an illustrative case study. Three core themes were identified: 'labyrinthine bureaucracy' governing access to ART; a 'system Catch 22' created by an expectation that access to ART was conditional upon treated drug use in a setting of limited drug treatment opportunity; and 'system verticalization', where a lack of integration across HIV, tuberculosis (TB) and drug treatment compromised access to ART. Taken together, we find that systemic factors play a key role in shaping access to ART, with the potential adverse effects of reproducing treatment initiation delay and disengagement from treatment. We argue that meso-level systemic factors affecting access to ART for PWID interact with wider macro-level structural forces, including those related to drug treatment policy and the social marginalization of PWID. We note the urgent need for systemic and structural changes to improve access to ART for PWID in this setting, including to simplify bureaucratic procedures, foster integrated HIV, TB and drug treatment services, and advocate for drug treatment policy reform.

Full Text Available In conventional ultrasound imaging systems with phased arrays, further improvement of lateral resolution requires enlarging the number of array elements, which in turn increases both the complexity and the cost of the imaging system. Multi-element synthetic aperture focusing (MSAF) systems are a very good alternative to conventional systems with phased arrays. The benefit of the synthetic aperture is a reduction in system complexity, cost and acquisition time. In the MSAF system considered in this paper, a group of elements transmits and receives signals simultaneously, and the transmit beam is defocused to emulate a single-element response. The echo received at each element of a receive sub-aperture is recorded in computer memory. The process of transmission/reception is repeated for all positions of the transmit sub-aperture. All the data recordings associated with each corresponding transmit-receive sub-aperture pair are then focused synthetically, producing a low-resolution image. The final high-resolution image is formed by summing all the low-resolution images associated with the transmit/receive sub-apertures. A problem of parameter optimization of an MSAF system is considered in this paper. The quality of imaging (lateral resolution and contrast) is expressed in terms of the beam characteristics: beam width and side-lobe level. The comparison between the MSAF system described in the paper and an equivalent conventional phased array system shows that the MSAF system acquires images of equivalent quality much faster, using only a small part of the power per image.
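
The final image-formation step described above (summing all low-resolution sub-aperture images into one high-resolution image) reduces to a pixel-wise sum, sketched here on tiny toy grids standing in for beamformed images.

```python
# Pixel-wise summation of equally sized 2-D "low-resolution" images, as in the
# MSAF compounding step: high_res[r][c] = sum over sub-apertures of low[r][c].

def sum_images(low_res_images):
    """Return the pixel-wise sum of a list of same-sized 2-D images."""
    rows, cols = len(low_res_images[0]), len(low_res_images[0][0])
    out = [[0.0] * cols for _ in range(rows)]
    for img in low_res_images:
        for r in range(rows):
            for c in range(cols):
                out[r][c] += img[r][c]
    return out

# Three 2x2 low-resolution images from three sub-aperture positions (toy data).
lows = [[[1, 0], [0, 1]], [[0, 1], [1, 0]], [[1, 1], [1, 1]]]
high = sum_images(lows)
```

In a real MSAF beamformer each low-resolution image would itself be produced by delay-and-sum focusing of the recorded echoes before this compounding step.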

This study examines the effect of accessibility to urban jobs via a public transport system on individual earnings and commuting behaviour. The effect of improved public transport based accessibility on these outcomes is determined by exploiting the exogenous variation in access to a public rail ... with a change in commuting patterns as the improved access to public transport facilitates a shift from employment within the township to better paid jobs in the city centre, as well as in other suburbs of the Copenhagen Metropolitan area...

In prostate cancer, the detection of metastatic lymph nodes indicates progression from localized disease to metastasized cancer. The detection of positive lymph nodes is, however, a complex and time consuming task for experienced radiologists. Assistance of a two-stage Computer-Aided Detection (CAD) system in MR Lymphography (MRL) is not yet feasible due to the large number of false positives in the first stage of the system. By introducing a multi-structure, multi-atlas segmentation, using an affine transformation followed by a B-spline transformation for registration, the organ location is given by a mean density probability map. The atlas segmentation is semi-automatically drawn with ITK-SNAP, using Active Contour Segmentation. Each anatomic structure is identified by a label number. Registration is performed using Elastix, using Mutual Information and an Adaptive Stochastic Gradient optimization. The dataset consists of the MRL scans of ten patients, with lymph nodes manually annotated in consensus by two expert readers. The feature map of the CAD system consists of the Multi-Atlas and various other features (e.g. Normalized Intensity and multi-scale Blobness). The voxel-based Gentleboost classifier is evaluated using ROC analysis with cross validation. We show in a set of 10 studies that adding multi-structure, multi-atlas anatomical structure likelihood features improves the quality of the lymph node voxel likelihood map. Multiple structure anatomy maps may thus make MRL CAD more feasible.
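
The mean density probability map used as a CAD feature above can be illustrated with a short sketch: after registration, the per-voxel average of the atlases' binary structure masks gives the structure likelihood at each voxel. The atlases and labels below are toy stand-ins, not MRL data.

```python
# Build a mean probability map from several registered atlas label maps:
# for each voxel, the fraction of atlases assigning it the given label.

def mean_probability_map(label_maps, label):
    """Per-voxel fraction of atlases in which the voxel carries `label`."""
    n = len(label_maps)
    size = len(label_maps[0])
    return [sum(1 for m in label_maps if m[v] == label) / n for v in range(size)]

# Three registered atlases over a flattened 4-voxel toy volume; label 1 marks
# the structure of interest (e.g. a vessel near the lymph node regions).
atlases = [[1, 1, 0, 0], [1, 0, 0, 0], [1, 1, 1, 0]]
prob = mean_probability_map(atlases, 1)
```

Voxels where all atlases agree get probability 1.0, and this soft anatomical likelihood is what gets appended to the voxel feature vector alongside intensity and blobness features.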

This paper proposes a multi-agent system for energy resource scheduling of an islanded power system with distributed resources, which consists of integrated microgrids and lumped loads. Distributed intelligent multi-agent technology is applied to make the power system more reliable, efficient and capable of exploiting and integrating alternative sources of energy. The algorithm behind the proposed energy resource scheduling has three stages. The first stage is to schedule each microgrid individually to satisfy its internal demand. The next stage involves finding the best possible bids for exporting power to the network and compete in a whole sale energy market. The final stage is to reschedule each microgrid individually to satisfy the total demand, which is the addition of internal demand and the demand from the results of the whole sale energy market simulation. The simulation results of a power system with distributed resources comprising three microgrids and five lumped loads show that the proposed multi-agent system allows efficient management of micro-sources with minimum operational cost. The case studies demonstrate that the system is successfully monitored, controlled and operated by means of the developed multi-agent system. (author)
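
The three-stage algorithm above can be sketched schematically; the microgrid objects and the proportional market-clearing rule below are simplified stand-ins for the paper's agent models, not its actual formulation.

```python
# Three-stage scheduling skeleton: (1) schedule each microgrid for internal
# demand, (2) bid surplus capacity into a wholesale market, (3) reschedule
# each microgrid for internal demand plus its awarded market demand.

def schedule(microgrids, market_demand, clear_market):
    # Stage 1: each microgrid schedules to satisfy its own internal demand.
    plans = [mg["sched"](mg["demand"]) for mg in microgrids]
    # Stage 2: surplus capacity becomes the export bid; the market is cleared.
    bids = [mg["capacity"] - p for mg, p in zip(microgrids, plans)]
    awarded = clear_market(bids, market_demand)
    # Stage 3: reschedule for internal demand plus awarded market demand.
    return [mg["sched"](mg["demand"] + a) for mg, a in zip(microgrids, awarded)]

def proportional_clearing(bids, demand):
    """Toy market: split demand across bidders in proportion to offered surplus."""
    total = sum(bids)
    return [demand * b / total for b in bids]

mgs = [{"demand": 4.0, "capacity": 10.0, "sched": lambda d: d},
       {"demand": 6.0, "capacity": 8.0, "sched": lambda d: d}]
final = schedule(mgs, market_demand=4.0, clear_market=proportional_clearing)
```

In the paper each stage is carried out by negotiating agents rather than a single function, but the data flow between the stages is the same.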

PAAMS, the International Conference on Practical Applications of Agents and Multi-Agent Systems is an evolution of the International Workshop on Practical Applications of Agents and Multi-Agent Systems. PAAMS is an international yearly tribune to present, to discuss, and to disseminate the latest developments and the most important outcomes related to real-world applications. It provides a unique opportunity to bring multi-disciplinary experts, academics and practitioners together to exchange their experience in the development of Agents and Multi-Agent Systems. This volume presents the papers that have been accepted for the 2014 special sessions: Agents Behaviours and Artificial Markets (ABAM), Agents and Mobile Devices (AM), Bio-Inspired and Multi-Agents Systems: Applications to Languages (BioMAS), Multi-Agent Systems and Ambient Intelligence (MASMAI), Self-Explaining Agents (SEA), Web Mining and Recommender systems (WebMiRes) and Intelligent Educational Systems (SSIES).

NASA earth science instruments are increasingly relying on airborne missions. However, traditionally, there has been limited common infrastructure support available to principal investigators in the area of science data systems. As a result, each investigator has been required to develop their own computing infrastructure for the science data system. Typically there is little software reuse, and many projects lack sufficient resources to provide a robust infrastructure to capture, process, distribute and archive the observations acquired from airborne flights. At NASA's Jet Propulsion Laboratory (JPL), we have been developing a multi-mission data system infrastructure for airborne instruments called the Airborne Cloud Computing Environment (ACCE). ACCE encompasses the end-to-end lifecycle covering planning, provisioning of data system capabilities, and support for scientific analysis in order to improve the quality, cost effectiveness, and capabilities to enable new scientific discovery and research in earth observation. This includes improving data system interoperability across each instrument. A principal characteristic is being able to provide an agile infrastructure that is architected to allow for a variety of configurations, from locally installed compute and storage services to provisioning those services via the "cloud" from cloud computing vendors such as Amazon.com. Investigators often have different needs that require a flexible configuration. The data system infrastructure is built on the Apache Object Oriented Data Technology (OODT) suite of components, which has been used for a number of spaceborne missions and provides a rich set of open source software components and services for constructing science processing and data management systems. In 2010, a partnership was formed between the ACCE team and the Carbon in Arctic Reservoirs Vulnerability Experiment (CARVE) mission to support the data processing and data management needs.

We present a bidirectional wireless bridge in the W-band enabling the seamless convergence between the wireless and fiber-optic access networks. In the downlink, a 16 Gbit/s QPSK signal is photonically up-converted at the wireless transmitter and electrically down-converted at the wireless receiver. The down-converted signal is re-modulated onto the lightwave and transmitted further through the fiber-optic system. In the uplink, both up- and down-conversion are performed by electrical means. Furthermore, we investigate both passive and active wireless transmitters in this work for both downlink and uplink transmissions. With an active wireless transmitter, up to 15 meters of wireless transmission is successfully achieved with a BER below the 7% FEC limit in the downlink.

In modern embedded systems, the increasing number of cores requires efficient cache hierarchies to ensure data throughput, but such cache hierarchies are restricted by their growing size and by interfering accesses, which lead to both performance degradation and wasted energy. In this paper, we first propose a behavior-aware cache hierarchy (BACH) which can optimally allocate multi-level cache resources to many cores, greatly improving the efficiency of the cache hierarchy and resulting in low energy consumption. The BACH takes full advantage of explored application behaviors and runtime cache resource demands as the basis for cache allocation, so that the cache hierarchy can be optimally configured to meet runtime demand. The BACH was implemented on the GEM5 simulator. The experimental results show that the energy consumption of a three-level cache hierarchy can be reduced by 5.29% up to 27.94% compared with other key approaches, while the performance of the multi-core system even improves slightly, accounting for hardware overhead.
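
As a purely illustrative sketch of demand-driven cache allocation in the spirit of BACH (the real system derives demands from explored application behaviors and reconfigures a multi-level hierarchy; the proportional policy below is an assumption for illustration), shared cache ways can be split across cores in proportion to measured demand:

```python
# Demand-proportional allocation of shared cache ways: every core gets at
# least one way, and the remaining ways are split proportionally to demand
# using largest-remainder rounding.

def allocate_ways(demands, total_ways):
    """Return a per-core way count summing to total_ways, min 1 way each."""
    n = len(demands)
    rest = total_ways - n          # ways left after the guaranteed 1 per core
    total = sum(demands)
    shares = [rest * d / total for d in demands]
    alloc = [1 + int(s) for s in shares]
    # Hand out the ways lost to rounding, largest fractional part first.
    order = sorted(range(n), key=lambda i: shares[i] - int(shares[i]), reverse=True)
    for i in order[: total_ways - sum(alloc)]:
        alloc[i] += 1
    return alloc

# Four cores sharing a 16-way last-level cache; demands from (hypothetical)
# runtime profiling of miss rates or working-set sizes.
ways = allocate_ways([8, 4, 2, 2], 16)
```

A behavior-aware scheme would re-run such an allocation whenever the observed phase behavior of the applications changes.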

Keywords: multi-machine power system stability, AVR system, power system stabilizer, PID controller ... The proposed controller was a fuzzy-logic-based stabilizer that has the capability to ... Computer methods in power system analysis.

Multi-infeed HVDC (MIDC) system connected with VSC-HVDC links and LCC-HVDC links is a new structure in modern power systems, which can be called hybrid multi-infeed HVDC (HMIDC) system. The paper presents the voltage stability analysis of a HMIDC system modeled from a possible future Danish power...

A new computing system, EDDYMULT, based on the finite element circuit method has been developed to solve actual eddy current problems in a multi-torus system, which consists of many torus conductors and various kinds of axisymmetric poloidal field coils. The EDDYMULT computing system can deal three-dimensionally with the modal decomposition of eddy currents in a multi-torus system, the transient behaviour of eddy current distributions and the resultant magnetic field. Therefore, users can apply the computing system to the solution of eddy current problems in a tokamak fusion device, such as the design of poloidal field coil power supplies, the mechanical stress design against the intensive electromagnetic loading on device components and the control analysis of plasma position. The present report gives a detailed description of the EDDYMULT system as a user's manual: 1) theory, 2) structure of the code system, 3) input description, 4) problem restrictions, 5) description of the subroutines, etc. (author)

Many typical chemical engineering operations are multi-fluid systems. They are carried out in distillation columns (vapor/liquid), liquid-liquid contactors (liquid/liquid) and other similar devices. An important parameter is interfacial area concentration, which determines the rate of interfluid heat, mass and momentum transfer and, ultimately, the overall performance of the equipment. In many cases, the models for determining interfacial area concentration are empirical and can only describe the cases for which there is experimental data. In an effort to better understand multiphase reactors and the mixing process, a multi-fluid model has been developed as part of a research effort to calculate interfacial area transport in several different types of in-line static mixers. For this work, the ensemble-averaged property conservation equations have been derived for each fluid and for the mixture. These equations were then combined to derive a transport equation for the interfacial area concentration. The final, one-dimensional model was compared to interfacial area concentration data from two sizes of Kenics in-line mixer, two sizes of concurrent jet and a Tee mixer. In all cases, the calculated and experimental data compared well, with the greatest scatter occurring in the Tee mixer comparison.

Highlights: • Multi-objective optimization of three recent Stirling engine models. • Use of efficient crossover and mutation operators for real coded Genetic Algorithm. • Demonstrated supremacy of the strategy over the conventionally used algorithm. • Improvements of up to 29% in comparison to literature results. - Abstract: In this article we demonstrate the supremacy of the Non-dominated Sorting Genetic Algorithm-II with Simulated Binary Crossover and Polynomial Mutation operators for the multi-objective optimization of Stirling engine systems by providing three examples, viz., (i) finite time thermodynamic model, (ii) Stirling engine thermal model with associated irreversibility and (iii) polytropic finite speed based thermodynamics. The finite time thermodynamic model involves seven decision variables and consists of three objectives: output power, thermal efficiency and rate of entropy generation. In comparison to literature, it was observed that the used strategy provides a better Pareto front and leads to improvements of up to 29%. The performance is also evaluated on a Stirling engine thermal model which considers the associated irreversibility of the cycle and consists of three objectives involving eleven decision variables. The supremacy of the suggested strategy is also demonstrated on the experimentally validated polytropic finite speed thermodynamics based Stirling engine model for optimization involving two objectives and ten decision variables.
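
The Simulated Binary Crossover (SBX) operator named above has a compact closed form for one real-valued decision variable, sketched below. The spread factor beta is normally driven by a uniform random number u and the distribution index eta; u is passed in explicitly here so the example is deterministic.

```python
# Simulated Binary Crossover (SBX) for one real decision variable:
# beta = (2u)^(1/(eta+1))            if u <= 0.5
# beta = (1/(2(1-u)))^(1/(eta+1))    otherwise
# c1 = 0.5*((1+beta)*x1 + (1-beta)*x2), c2 = 0.5*((1-beta)*x1 + (1+beta)*x2)

def sbx_pair(x1, x2, u, eta=15.0):
    """Return two SBX children; their mean always equals the parents' mean."""
    if u <= 0.5:
        beta = (2.0 * u) ** (1.0 / (eta + 1.0))
    else:
        beta = (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (eta + 1.0))
    c1 = 0.5 * ((1.0 + beta) * x1 + (1.0 - beta) * x2)
    c2 = 0.5 * ((1.0 - beta) * x1 + (1.0 + beta) * x2)
    return c1, c2

c1, c2 = sbx_pair(1.0, 3.0, u=0.5)  # u = 0.5 gives beta = 1: children = parents
d1, d2 = sbx_pair(1.0, 3.0, u=0.9)  # u > 0.5 gives beta > 1: children spread out
```

A larger eta concentrates children near the parents, which is how SBX mimics the spread of single-point binary crossover on real-coded variables.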

The Tank Waste Remediation System (TWRS) Multi-Year Work Plan (MYWP) documents the detailed total Program baseline and was constructed to guide Program execution. The TWRS MYWP is one of two elements that comprise the TWRS Program Management Plan. The TWRS MYWP fulfills the Hanford Site Management System requirement for a Multi-Year Program Plan and a Fiscal-Year Work Plan. The MYWP addresses program vision, mission, objectives, strategy, functions and requirements, risks, decisions, assumptions, constraints, structure, logic, schedule, resource requirements, and waste generation and disposition. Sections 1 through 6, Section 8, and the appendixes provide program-wide information. Section 7 includes a subsection for each of the nine program elements that comprise the TWRS Program. The foundation of any program baseline is base planning data (e.g., defendable product definition, logic, schedules, cost estimates, and bases of estimates). The TWRS Program continues to improve base data. As data improve, so will program element planning, integration between program elements, integration outside of the TWRS Program, and the overall quality of the TWRS MYWP. The MYWP establishes the TWRS baseline objectives to store, treat, and immobilize highly radioactive Hanford waste in an environmentally sound, safe, and cost-effective manner. The TWRS Program will complete the baseline mission in 2040 and will incur costs totalling approximately 40 billion dollars. The summary strategy is to meet the above objectives by using a robust systems engineering effort, placing the highest possible priority on safety and environmental protection; encouraging "outsourcing" of the work to the extent practical; and managing significant but limited resources to move toward final disposition of tank wastes, while openly communicating with all interested stakeholders.