Application of Imaging Sensors for UAS Command and Control for Evolving Towards Autonomous Operations of UAS, Phase I
Status: Completed (Jun 2015 - Dec 2015)

NextGen will undoubtedly include unmanned aircraft systems (UAS), as legislated under the Federal Aviation Administration Modernization and Reform Act of 2012. The FAA is currently developing the regulatory framework for safely integrating small UAS (sUAS) into routine National Airspace System (NAS) operations. The introduction of UAS into the NAS offers advantages over manned aircraft for applications that are hazardous to human pilots, are long in duration, require greater precision, or require rapid response. Startup UAS companies have proposed using UAS for remote sensing, disaster response, delivery of goods, agricultural support, and many other beneficial applications. One significant aspect of an efficient NAS is the development of autonomous capabilities for UAS and the technologies that support the safe implementation of UAS autonomy. Aerospace Innovations (AI) proposes to support NASA's UAS autonomy effort by developing an imaging-sensor-based command and control system for autonomous UAS operations that takes advantage of the 3-axis accelerometers in the smart devices prevalent in the consumer electronics market. The paradigm shift from a human-piloted system to an autonomously operated aircraft will require successful development of both sensor technology and image processing techniques to allow for a systematic translation of the human-to-machine interface. While the majority of current UAS operators are trained UAS pilots, the availability of these specialized pilots may not meet the demand anticipated for future commercial UAS companies. Additionally, more intuitive piloting controls are needed to enable widespread adoption of this technology and the subsequent evolution to more autonomous operation.
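The abstract proposes mapping 3-axis accelerometer readings from a handheld smart device to UAS commands. As a minimal illustrative sketch (not the proposal's actual design), the standard tilt-sensing formulas convert a quasi-static accelerometer reading into roll and pitch attitude commands; the function name and command convention here are assumptions for illustration only:

```python
import math

def tilt_to_attitude_command(ax, ay, az):
    """Map a 3-axis accelerometer reading (in g) from a handheld
    smart device to roll/pitch attitude commands (in degrees).

    Uses the standard tilt-sensing equations, which assume the
    device is quasi-static so the measured acceleration is
    dominated by gravity. Illustrative only; the proposal's
    actual command mapping is not specified in this record.
    """
    # Roll: rotation about the device's x axis, from the y/z balance of gravity.
    roll = math.degrees(math.atan2(ay, az))
    # Pitch: rotation about the y axis, from x versus the y-z gravity magnitude.
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    return roll, pitch

# Device held level: gravity entirely on the z axis, so both commands are ~0.
roll, pitch = tilt_to_attitude_command(0.0, 0.0, 1.0)
```

Tilting the device then produces proportional attitude commands, which is what makes an accelerometer-equipped consumer device a candidate for a more intuitive piloting control than a conventional RC transmitter.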
The work described in this proposal will provide the foundation to enable development of a UAS with intuitive flight controls.

Potential NASA Commercial Applications: The potential NASA commercial application lies in the continued development of UAS autonomy for integration into NextGen. NASA researchers could use the resulting designs and/or prototypes of this research to extend their current work in the Aeronautics Research Mission Directorate to develop and evaluate the efficacy of newer, more human-centric autonomous control systems for UAS. Additionally, the identification and evaluation of operational functions will help NASA develop a hierarchical taxonomy of autonomous UAS functions. The results of the image processing techniques can also be used in manned aviation systems and potentially for interplanetary robotic missions, where autonomy would provide more independent but safer exploration of unknown environments.

Technology Areas:
11 Modeling, Simulation, Information Technology and Processing
11.4 Information Processing
11.4.7 Human-System Integration
4 Robotics and Autonomous Systems
4.4 Human-System Interaction
4.4.7 Safety, Trust, and Interfacing of Robotic and Human Proximity Operations

Program: SBIR/STTR, Space Technology Mission Directorate
Lead Center: Langley Research Center (LaRC), NASA Center, Hampton, VA
Organization: Aerospace Innovations, LLC (Industry), Yorktown, VA
Personnel: Therese Griebel, Carlos Torrez, Dung Nguyen

Attachments:
Briefing Chart (image): Application of Imaging Sensors for UAS Command and Control for Evolving Towards Autonomous Operations of UAS Briefing Chart, https://techport.nasa.gov/file/22871
Briefing Chart Image (image): Application of Imaging Sensors for UAS Command and Control for Evolving Towards Autonomous Operations of UAS, Phase I, https://techport.nasa.gov/file/22132