A prototype for neurally-aided sketch-to-object tree modeling in 3D virtual reality, built on Tilt Brush. I was responsible for the paired data generation (10,000 samples) in Houdini, pre- and post-processing, and the VR pipeline.

Abstract

Although realistic procedural tree generation is a well-studied problem, particularly with the use of L-systems, it remains difficult for end users to manipulate procedural models in an easy, intuitive way. Interfaces for such systems often consist of a complicated set of rules and parameter values that do not map intuitively to the tree output. Conversely, virtual reality (VR) has opened the door to a wide array of intuitive 3D control and content-authoring systems, but these systems often lack the complexity necessary for serious applications. We present a generative adversarial network (GAN) based method which, given a user sketch created in the VR drawing application Tilt Brush, generates a 3D tree model that conforms to the sketch silhouette while maintaining a realistic tree structure, extending the user's input with detailed, tree-like features.

Pipeline: The willowGAN network maps a latent vector (the encoded sketch) to a point cloud representation (vertices). Post-processing extracts optimal paths through the point cloud to form a tree skeleton, and widths are assigned along the skeleton to produce the final output tree mesh.
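The skeleton-extraction step can be illustrated with a minimal sketch. This is not the project's actual implementation; it assumes a common approach (the `skeletonize` function and its parameters are hypothetical): build a k-nearest-neighbour graph over the point cloud, take shortest paths from a root point so that each point's predecessor becomes its parent (the union of paths is a tree), then assign widths with a pipe-model rule where a parent's cross-sectional area is the sum of its children's.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import shortest_path
from scipy.spatial import cKDTree

def skeletonize(points, root_idx=0, k=4, leaf_width=0.01, exponent=2.0):
    """Extract a tree skeleton from an (n, 3) point cloud.

    1. Build a k-nearest-neighbour graph over the points.
    2. Take shortest paths from the root; each point's predecessor on
       its path becomes its parent, so the union of paths is a tree.
    3. Assign widths with the pipe model: a parent's cross-sectional
       area is the sum of its children's ("da Vinci's rule" at
       exponent == 2).
    Returns (parents, widths); parents[root_idx] < 0 marks the root.
    """
    n = len(points)
    kd = cKDTree(points)
    dists, nbrs = kd.query(points, k=k + 1)  # column 0 is the point itself
    rows = np.repeat(np.arange(n), k)
    graph = csr_matrix(
        (dists[:, 1:].ravel(), (rows, nbrs[:, 1:].ravel())), shape=(n, n))
    dist, parents = shortest_path(
        graph, directed=False, indices=root_idx, return_predecessors=True)
    # Pipe-model widths: sweep from the farthest points toward the root.
    areas = np.full(n, leaf_width ** exponent)
    for i in np.argsort(dist)[::-1]:
        if parents[i] >= 0:
            areas[parents[i]] += areas[i]
    return parents, areas ** (1.0 / exponent)
```

Under this scheme the trunk is naturally the thickest branch, since the root accumulates the cross-sectional area of every leaf; the mesh step would then sweep a circle of the computed width along each parent-child edge.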