Formulated experiments to test different action recognition algorithms on APHill aerial videos provided by DARPA, which were recorded with an Electro-Optical/Infra-Red
(EO/IR) sensor from a military aircraft flying at an altitude of over 1000 meters. The videos were aligned and moving objects were tracked using the in-house COCOA system.

Integrated two action recognition algorithms, "Bag of Visual Words" and "Lagrangian Particle Trajectories", into the VIRAT system developed by Lockheed Martin and
Kitware as part of the DARPA VIRAT project. Both algorithms were ported to C++ for integration.