The technique is the work of Nokia senior researcher and MIT grad Manohar Srikanth, along with associate professor Kavita Bala of Cornell's Computer Science Department and MIT professor Frédo Durand. In their upcoming paper, "Computational Rim Illumination with Aerial Robots", they'll detail how their system -- based on a modified Parrot AR.Drone with a LIDAR sensor, a continuous halogen light source, and a top-mounted flash strobe -- functions, but in the meantime you can get a nutshell view in the video below.
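The core idea -- a light-carrying drone that repositions itself to hold a photographer-specified rim width -- can be illustrated with a toy feedback loop. This is a hedged sketch, not the authors' code: the `rim_control_step` function, the pixel measurements, and the assumption that moving the light further behind the subject thins the rim are all illustrative simplifications.

```python
def rim_control_step(angle_deg, measured_px, target_px, gain=0.1):
    """One step of a proportional controller for a hypothetical light drone.

    angle_deg:   drone's azimuth around the subject (0 = next to camera,
                 larger = further behind the subject)
    measured_px: rim width measured in the current camera frame, in pixels
    target_px:   rim width the photographer asked for, in pixels

    Assumes the rim narrows as the light swings behind the subject, so a
    too-wide rim pushes the drone to a larger angle, and vice versa.
    """
    error = measured_px - target_px
    return angle_deg + gain * error
```

In use, each new camera frame would yield a fresh `measured_px`, and the returned angle would be fed to the drone's position controller, nudging the light until the measured and target rim widths agree.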

It is, perhaps, not the most practical of ideas with current quadrotor technology. Although your light source can move by itself as needed, saving you from tweaking the lighting between shots and letting you focus on your subject, it's also noisy and distracting, and it moves a large volume of air that could itself prove problematic. And in its current incarnation, it requires physical tethers connecting the camera, computer and drone, which limit the system's range.

Still, it's a pretty neat proof-of-concept, and with advances in drone technology, or perhaps a ground-based variant of the system, it could one day let you set up your lighting once and have it tune itself to follow your shoot with a minimum of fuss. The next step for the system would be to take it off its tethers, put it in the real world, and let multiple drones cooperate to enable more complex lighting setups.