Principal Market Segments

Principal Missions

We especially focus on missions that would not have been possible with traditional manned aircraft. These include:

Manned-Unmanned Teaming

Multi-ship Collaborative Autonomous Operations

Urban Air Mobility

Persistent Airborne Operations

Smart Automation

We define “smart automation” as automating appropriate, context-sensitive elements of flight tasks to reduce workload peaks and even overall workload. Examples include automated checklists, using connectivity to create a “distributed cockpit,” extensive monitoring and data logging of system state, automatic route planning driven by vehicle state or contingencies, and enabling reduced-crew operations.
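Automated checklists are the simplest of these deterministic tasks. The sketch below illustrates the idea in Python; all names, checklist items, and system-state values are hypothetical stand-ins for illustration, not Autonodyne's actual software.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class ChecklistItem:
    description: str
    check: Callable[[], bool]  # returns True when the item's condition is met

def run_checklist(name: str, items: List[ChecklistItem]) -> bool:
    """Run each item in order; report and stop on the first failure."""
    for item in items:
        ok = item.check()
        print(f"[{name}] {item.description}: {'PASS' if ok else 'FAIL'}")
        if not ok:
            return False
    return True

# Mock system state standing in for live vehicle telemetry.
state = {"fuel_pct": 92, "control_surfaces_free": True, "datalink_up": True}

pre_takeoff = [
    ChecklistItem("Fuel above 20%", lambda: state["fuel_pct"] > 20),
    ChecklistItem("Flight controls free", lambda: state["control_surfaces_free"]),
    ChecklistItem("Datalink established", lambda: state["datalink_up"]),
]

all_pass = run_checklist("PRE-TAKEOFF", pre_takeoff)
```

Because each item is a deterministic predicate on observed system state, this kind of task is a natural first step before automating non-deterministic ones.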

We view automation initially as a backstop that mitigates the human as a single point of failure and, in the longer run, as a bridge from predominantly human-run operations to ultra-reliable automation that enables full autonomy.

Because system and mode complexity have increased, and because variation between aircraft and software versions in fielded legacy systems continues to grow, we believe “smart automation” should start with flight-critical but deterministic tasks (nominal checklist usage, system monitoring) and, as experience and confidence increase, move on to non-deterministic tasks (flight/mission planning, contingency planning, decision support, self-preservation, reaction to imminent threats) in the transition from human-run operations to ultra-reliable automation.

Manned-Unmanned Teaming (MUM-T)

We have been building and flight-testing systems that enable Manned-Unmanned Teaming since the company’s inception. We have built the on-board Mission Computer hardware and software and the off-board Control Station software for multiple MUM-T programs and have flight-tested them to TRL-7. Our designs use a task-based approach (e.g., “Follow Him,” “Loiter,” “Fly Over That,” “RTB”) and include a wide range of functionality so that the human supervisor, or the manned part of the manned-unmanned team, does not need to spend much cognitive bandwidth controlling or directing the unmanned team members.
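A task-based interface of this kind can be sketched as a small command vocabulary: the supervisor states intent, and the vehicle's own autonomy fills in the low-level control. The Python below is an illustrative assumption about how such commands might be packaged, not the actual MUM-T interface.

```python
from enum import Enum, auto
from typing import Any, Dict

class Task(Enum):
    # High-level tasks like those named above.
    FOLLOW = auto()
    LOITER = auto()
    FLY_OVER = auto()
    RTB = auto()  # return to base

def build_command(task: Task, **params: Any) -> Dict[str, Any]:
    """Package a high-level task into a message for an unmanned teammate.

    Only intent and its parameters cross the link; the receiving vehicle
    translates the task into guidance and control on board.
    """
    return {"task": task.name, "params": params}

# Example: ask a teammate to loiter at a point with a 500 m radius.
cmd = build_command(Task.LOITER, lat=42.36, lon=-71.06, radius_m=500)
```

Keeping the vocabulary small and task-oriented is what keeps the supervisor's cognitive load low: one short command replaces continuous stick-and-throttle attention.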

Multi-ship Collaborative Autonomous Operations

Our software has been specifically designed to support multi-ship collaborative autonomous operations. While multiple architectures are possible, to date we have implemented an IP-network-based approach in which every Mission Computer and Control Station is an entity on the network. Functionality supported so far includes multiple forms of x-y-z offset station keeping, such as flying formation with another aircraft or maintaining a position or pattern relative to a fixed-position asset, with aerial refueling next in the queue. We are also moving to mesh network architectures that support n>>1, or swarming, behaviors. The collaborative formation works as a distributed collective to find targets, reassign formation roles if required, and share various forms of information.
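The geometry behind x-y-z offset station keeping can be shown in a few lines: a wingman holds a fixed offset expressed in the lead aircraft's body frame. This sketch assumes a flat-earth local east/north/up frame and a simple 2D heading rotation; the function and its parameters are illustrative, not Autonodyne's implementation.

```python
import math

def offset_target(lead_pos, lead_heading_deg, offset_right_m, offset_back_m, offset_down_m):
    """Return the wingman's target position (x east, y north, z up, metres)
    for an offset expressed in the lead aircraft's body frame."""
    h = math.radians(lead_heading_deg)
    # Unit vectors of the lead's body axes in the east/north plane
    # (heading 0 deg = due north).
    fwd = (math.sin(h), math.cos(h))
    right = (math.cos(h), -math.sin(h))
    x = lead_pos[0] - offset_back_m * fwd[0] + offset_right_m * right[0]
    y = lead_pos[1] - offset_back_m * fwd[1] + offset_right_m * right[1]
    z = lead_pos[2] - offset_down_m
    return (x, y, z)

# Lead flying due north at (0, 0, 1000 m); wingman 50 m right, 100 m back.
target = offset_target((0.0, 0.0, 1000.0), 0.0, 50.0, 100.0, 0.0)
```

The same function covers both cases named above: reference a moving lead for formation flight, or a fixed-position asset (zero velocity) for holding a pattern.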

Urban Air Mobility

We believe a new generation of Vertical Takeoff and Landing (VTOL) vehicles will enable a new form of urban transportation using an Urban Air Mobility model, much as companies such as Uber and Lyft provide for urban surface transportation. Toward that end, Autonodyne can provide:

A 4D Flight Management System that auto-generates a route, so the user does not need the expertise to define a proper route;

Mission Computers

We design and build mission computers (hardware and software) for defense and civil applications. These mission computers can manage a large selection of functions, including vehicle navigation, vehicle health and status, control of on-board systems such as payloads, communications to and from other on-board systems such as autopilots, and external comm links. We can host third-party software on our mission computer hardware, and we can host our mission computer software on third-party hardware. We are currently designing the next generation of optimized UAS mission computers.

Control Stations

We design and build control station software for unmanned or non-traditionally piloted aircraft. We apply our expertise in Natural User Interface (NUI) design, which enables a single operator to control multiple dissimilar makes and models of vehicle at the same time. Our control stations serve as a supervisory tool, run on a host of different hardware platforms (e.g., mobile tablets, PCs, laptops), display all known entities in the network, and are link agnostic. They support multi-touch (e.g., pinch zoom), traditional keyboard/mouse, commercial gaming controllers (e.g., Xbox), and voice/gesture inputs from augmented reality devices (e.g., HoloLens).

In the summer of 2017, one of our mobile tablet control stations directly controlled a formation of high-speed (Mach 0.95) UAS vehicles from a crew station on an airborne platform.


Augmented Reality (AR)

We are focused on applying commercially available augmented reality devices (e.g., Microsoft HoloLens, Meta 2, smart eyewear) to flight operations. We have found AR can have a profound operational impact in areas such as 3D holographic representations of control station functionality, remote maintenance of a flight vehicle, 3D overwatch, innovative interactive swarm control, and increased situational awareness in flight.