The handling of datasets at scientific user facilities is becoming progressively more challenging as advances in sources and detectors drive increasingly aggressive data acquisition rates. The ability to share and process these data depends on the originators of the data and those who use it agreeing unequivocally on what the data mean. This understanding is driven by the use of standardized data formats.

The Experimental Physics and Industrial Control System (EPICS) is a set of Open Source software tools, libraries, and applications developed collaboratively and used worldwide. EPICS is used to create distributed soft real-time control systems for scientific instruments such as particle accelerators, telescopes, and other large scientific experiments.

EPICS is used extensively at the APS. The last comprehensive EPICS training at the APS was held ten years ago. Since then much has changed, and many individuals have joined the organization. The AES Software Services Group is organizing an updated series of EPICS training classes, beginning this September and continuing through the winter of 2015. Many of the classes will have corresponding hands-on laboratory sessions.

EPICS Version 4 has been a formal international development project since 2011, with active members from 7 organizations spread across 4 US states and 3 European countries. This talk will discuss the main objectives and current state of the project, explain how and why the APS is involved, and suggest some advantages it could bring to the APS Accelerator and Beamline control systems. This is an encore presentation of the talk given on 26 November 2013.

29 January 2014 @ 11:30 AM

401 - B2100

EPICS Version 4 Development, Andrew Johnson (AES-SSG)


X-ray Computed Tomography (XCT) is a powerful technique for imaging 3D structures at the micro- and nano-levels. Recent upgrades to tomography beamlines at the APS have enabled imaging at resolutions as fine as 20 nm, at increased pixel counts and speeds. As detector resolution and speed increase, so does the amount of data that must be transferred and analyzed. This, coupled with growing experiment complexity, drives the need for software to automate data acquisition and processing. We present an experiment control and data processing system for tomography beamlines that helps address this concern. The software, written in C++ using Qt, interfaces with EPICS for beamline control and provides live and offline data viewing, basic image manipulation features, and scan sequencing that coordinates EPICS-enabled apparatus. After acquisition, the software triggers a workflow pipeline, written using ActiveMQ, that transfers data from the detector computer to an analysis computer and launches a reconstruction process. Experiment metadata and provenance information are stored along with the raw and analyzed data in a single HDF5 file.
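The post-acquisition flow described above (a finished scan is queued, transferred, then reconstructed) can be sketched with a simple in-process message queue. This is a minimal stand-in using only Python's standard library; the real system passes ActiveMQ messages between separate machines, and the file name and step strings here are purely illustrative.

```python
import queue
import threading

# Sketch: a message queue decouples acquisition from the
# transfer/reconstruction pipeline, in the spirit of the ActiveMQ
# workflow described above. Step names are illustrative only.
jobs = queue.Queue()
done = []

def worker():
    # Consume scan names until a None sentinel arrives.
    while True:
        scan = jobs.get()
        if scan is None:
            break
        done.append(f"transferred {scan}")     # detector -> analysis machine
        done.append(f"reconstructed {scan}")   # launch reconstruction
        jobs.task_done()

t = threading.Thread(target=worker)
t.start()
jobs.put("scan_001.h5")   # acquisition finished: trigger the pipeline
jobs.join()               # wait for the pipeline to drain
jobs.put(None)            # shut the worker down
t.join()
print(done)  # ['transferred scan_001.h5', 'reconstructed scan_001.h5']
```

In the real system the queue also carries the metadata needed to locate the raw HDF5 file, so provenance can be appended to the same file after reconstruction.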

Low latency between data acquisition and analysis is of critical importance to any experiment. The combination of a faster parallel algorithm and a data pipeline for connecting disparate components (detectors, clusters, file formats) enabled us to greatly enhance the operational efficiency of the x-ray photon correlation spectroscopy experiment facility at the Advanced Photon Source. The improved workflow starts with raw data (120 MB/s) streaming directly from the detector camera, through an on-the-fly discriminator implemented in firmware, to Hadoop’s distributed file system in a structured HDF5 data format. The user then triggers the MapReduce-based parallel analysis. For effective bookkeeping and data management, the provenance information and reduced results are added to the original HDF5 file. Finally, the data pipeline triggers user-specific software for visualizing the data. The whole process is completed shortly after data acquisition – a significant operational improvement over the previous setup. The faster turnaround time helps scientists make near real-time adjustments to their experiments.
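The map/reduce structure of the parallel analysis can be illustrated with a toy example: the map step computes a partial result for each chunk of detector frames, and the reduce step merges the partials. The per-frame computation below (a plain intensity sum) is only a placeholder for the actual XPCS correlation math, and plain Python stands in for Hadoop MapReduce.

```python
from functools import reduce

def map_chunk(frames):
    """Map step: per-chunk partial result as (frame count, intensity sum)."""
    return (len(frames), sum(sum(f) for f in frames))

def reduce_partials(a, b):
    """Reduce step: merge partial results from two chunks."""
    return (a[0] + b[0], a[1] + b[1])

# Toy data: four "frames" of three pixels each, split into two chunks
# that could be processed on different nodes.
chunks = [
    [[1, 2, 3], [4, 5, 6]],
    [[7, 8, 9], [1, 1, 1]],
]
partials = [map_chunk(c) for c in chunks]          # map (parallelizable)
count, total = reduce(reduce_partials, partials)   # reduce (merge)
print(total / count)  # mean frame intensity: 12.0
```

The pattern matters more than the arithmetic: because the reduce step is associative, chunks can be analyzed independently and merged in any order, which is what lets the real analysis scale across a cluster.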

The EPICS Version 4 development effort* is not planning to replace the current Version 3 IOC Database or its use of the Channel Access network protocol in the near future. Interoperability is a key aim of the V4 development, which is building upon the older IOC implementation. EPICS V3 continues to gain new features and functionality on its Version 3.15 development branch, while the Version 3.14 stable branch has been accumulating minor tweaks, bug fixes, and support for new and updated operating systems. This paper describes the main enhancements provided by recent and upcoming releases of EPICS Version 3 for control system applications.

The EPICS Collaboration Meeting this May was hosted jointly by the Diamond Light Source and the ISIS Spallation Neutron Source, and was held at their Harwell campus in Oxfordshire, England. Andrew Johnson will present the highlights from a selection of the talks given at that meeting, including an update on the current state of the EPICS V4 developments.

The great richness of data collected at the APS, CNM, and EMC plays a critical role in scientific exploration. As an example, imaging and microscopy experiments are adding dynamics and spectroscopic information to tomography. However, methods for understanding data have not kept pace; there is no “Moore’s law” scaling that applies to by-hand examination of data. Manual management and analysis of data are too time-consuming and cumbersome for large, complex datasets. State-of-the-art mathematics and computer science tools will help automate the understanding process for large datasets. Only then will scientific understanding be able to fully benefit from the coming deluge of data. This workshop is organized to discuss the state of the art and future potential of advanced optimization, visualization, data management, and workflow techniques for Argonne’s user facilities. It brings together experts in optimization, computer vision, visualization, and data management alongside user scientists to discuss current and future applications. The goal is to elaborate on how users can benefit from these techniques to enable new scientific discovery.

See the official workshop site for links to the talks and the workshop report.