Abstract: Methods for generating realistic animations and geometric models of trees—that is, describing how trees move and what trees look like—are desirable due to the ubiquitous presence of trees in nature, and hence in film, games, and other media. In animating a tree, it is important to give an artist tools for interactively parameterizing the tree's motion, but the constraint of maintaining interactive frame rates is difficult to satisfy due to the stiff joints and many degrees of freedom of a realistic tree. To this end, we present an algorithm that enables tree joints to be simulated approximately but analytically (and thus efficiently) for arbitrarily high stiffness values. The generation of tree geometry has been explored through both procedural methods and data-driven approaches, but geometric models with complexity and detail on the scale of real-world trees remain elusive. Our goal is to take advantage of the rich detail already present in a real-world tree by using data to reconstruct a tree "digital double." Using video data captured from a quadcopter drone, we apply structure from motion to reconstruct the large-scale pieces of the tree and stereo techniques to recover a 3D skeleton of finer branching structures. Our ultimate goal is to recover both the geometry and the motion of a real tree.

Bio: Ed Quigley is a fourth-year Ph.D. student advised by Professor Ron Fedkiw. Ed is supported by the NDSEG Fellowship, and his research interests include physically based simulation, animation, and 3D reconstruction. Before coming to Stanford, he studied computer science at Grove City College.