Linguistic Task Transfer for Humans and Cyber Systems

The introduction of cybernetic systems into physical environments is key to increasing the performance and repeatability of tasks ranging from car manufacturing to surgery. Automated machines can reduce the physical demands on human labor while improving performance. Taking full advantage of these devices, however, requires more than ad-hoc allocation of tasks to tools. Building on formal grammars, computational theory, and experience in perception and control, we investigate formal languages as a tool for task transfer between distinct cyber-physical systems such as humans and robots.

Formal languages are tools for encoding, describing, and transferring structured knowledge. In natural language, this transfer is called communication. Similarly, we develop a formal language through which arbitrary cyber-physical systems communicate tasks via structured actions. This study focuses on the use of context-free grammars (CFGs). In contrast to finite state machines, CFGs provide memory, model building, and hierarchical task decomposition while retaining efficient parsing, analysis, and verification. The value of CFG adaptations in representing and analyzing tasks was independently discovered in Robot Control and Human Activity Recognition. Building on the grammar framework developed by the PI that merges perception and control, Motion Grammars, we unify these fields to enable the transfer of entire tasks between distinct systems.
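To make the contrast with finite state machines concrete, consider a minimal sketch (the grammar and event names are invented for illustration, not taken from the Motion Grammar work): a recursive task rule of the form Assemble -> grasp Assemble release | insert requires matching each grasp with a later release. Recognizing such balanced nesting needs a stack, so no finite state machine accepts this language, while a CFG captures it directly and a recursive-descent parser realizes the hierarchical decomposition.

```python
# Hypothetical task grammar:  Assemble -> 'grasp' Assemble 'release' | 'insert'
# Balanced grasp/release pairs require memory (a stack), which a CFG provides
# and a finite state machine cannot.

def parse_assemble(tokens, pos=0):
    """Return the position just past a parsed Assemble, or None on failure."""
    if pos < len(tokens) and tokens[pos] == "insert":
        return pos + 1                          # base case: Assemble -> insert
    if pos < len(tokens) and tokens[pos] == "grasp":
        end = parse_assemble(tokens, pos + 1)   # recurse on the nested subtask
        if end is not None and end < len(tokens) and tokens[end] == "release":
            return end + 1                      # matching 'release' found
    return None

def accepts(tokens):
    """Accept iff the whole event sequence derives from Assemble."""
    return parse_assemble(tokens) == len(tokens)
```

Here a doubly nested sequence such as grasp, grasp, insert, release, release parses, while an unbalanced one such as grasp, insert is rejected.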

The development of Motion Grammars to transfer knowledge between cyber-physical systems poses a number of research challenges: (1) What algorithms can automatically represent the task of a given CPS as a language by observing the system's behavior? (2) Given a system language that represents operation and a specification language for the task, how do we generate a system-consistent language that is formally guaranteed to execute the task? (3) Given languages that represent distinct components of a CPS, such as its operation and its environment, can we integrate them into a single language for correct operation of the combined system?

We have introduced and evaluated preliminary solutions to the three questions posed above. First, we have developed an automated sequence of algorithms that observes a CPS performing a task and automatically generates a language that represents the task itself. In [1] we present a solution that first uses current optimization techniques to identify and track the behavior of a system, particularly as it interacts physically with objects in its environment. The sequence of identified events is progressively converted to a non-deterministic and then a minimal deterministic finite automaton that represents the language of the task. We demonstrate the effectiveness of this approach by transferring planar assembly tasks from human demonstration to a simulated robot manipulator.
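The automaton step above can be sketched as follows. This is an illustrative subset construction, not the pipeline of [1]; the two-state NFA and the event names ("contact", "free") are invented for the example, and the final minimization pass (e.g. Hopcroft's algorithm) is omitted for brevity.

```python
from itertools import chain

def nfa_to_dfa(nfa, start, alphabet):
    """Subset construction: nfa maps (state, symbol) -> set of successors."""
    start_set = frozenset([start])
    dfa, seen, frontier = {}, {start_set}, [start_set]
    while frontier:
        S = frontier.pop()
        for a in alphabet:
            # The DFA successor is the union of NFA successors over S.
            T = frozenset(chain.from_iterable(nfa.get((q, a), ()) for q in S))
            dfa[(S, a)] = T
            if T not in seen:
                seen.add(T)
                frontier.append(T)
    return dfa, start_set, seen

def dfa_accepts(dfa, start_set, finals, word):
    """Run the DFA; accept if the reached subset contains a final NFA state."""
    S = start_set
    for a in word:
        S = dfa[(S, a)]
    return bool(S & finals)

# Invented two-state NFA: event sequences ending in "contact" are accepted
# (state 1 is final, reachable only on a "contact" event).
nfa = {(0, "contact"): {0, 1}, (0, "free"): {0}}
dfa, q0, states = nfa_to_dfa(nfa, 0, ["contact", "free"])
```

On this example the construction yields a two-state DFA that accepts exactly the event sequences whose last event is "contact".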

Second, in [2], we have designed a set of symbolic transformation rules, the Motion Grammar Calculus, that can be progressively applied to transform the language of a system so that it meets the task specification. Furthermore, in [3] we have utilized this calculus to couple distinct aspects of cyber-physical systems. Given a language that represents the entire system, we apply supervisory control to constrain its physical behavior to the desired task specification. We have demonstrated this approach by combining a language of semantic localization and mapping with a language of control for mobile manipulators, generating provably correct policies for an assistive robot performing pick-and-place tasks in a new human environment.
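The supervisory-control constraint can be sketched, in the regular (finite-state) case, as a synchronous product of a plant automaton and a specification automaton: the supervised system may take a transition only when both components allow the event. The toy plant, specification, and event names below are invented for illustration and are not the systems of [2] or [3].

```python
def supervised_product(plant, spec, p0, s0, alphabet):
    """plant, spec: deterministic maps (state, event) -> next state.
    Returns the reachable transitions of the synchronous product, in which
    events forbidden by the specification are pruned from the plant."""
    product, seen, frontier = {}, {(p0, s0)}, [(p0, s0)]
    while frontier:
        p, s = frontier.pop()
        for a in alphabet:
            if (p, a) in plant and (s, a) in spec:   # both must allow 'a'
                nxt = (plant[(p, a)], spec[(s, a)])
                product[((p, s), a)] = nxt
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
    return product

# Toy plant: a manipulator may pick, then either place or drop the object.
plant = {("idle", "pick"): "holding",
         ("holding", "place"): "idle",
         ("holding", "drop"): "idle"}
# Toy specification: never drop (it simply has no "drop" transition).
spec = {("ok", "pick"): "ok", ("ok", "place"): "ok"}

sup = supervised_product(plant, spec, "idle", "ok", ["pick", "place", "drop"])
```

In the resulting supervised system the pick and place transitions survive, while every drop transition is pruned, mirroring how a specification language constrains physical behavior to the desired task.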