Credit: (c) 2017 The University of Hong Kong, University College London, and Adobe Research

CHICAGO, IL, July 6, 2017--A new image-based method captures the complexities of thin structures, providing an innovative technique to digitally reconstruct wiry objects from just a few input images. The method has potentially wide reach: animators could use it to depict the behavior of any wire- or cable-like object, and medical practitioners could use it to examine networks of thin structures, such as blood vessels.

Developed by computer scientists at the University of Hong Kong, Adobe Research and UCL (University College London), "Image-based Reconstruction of Wire Art" will be one of several innovative computer graphics and interactive techniques showcased at the annual SIGGRAPH conference, to be held in Los Angeles from 30 July to 3 August.

Objects created by connecting and bending wires are common in furniture design, metal sculpting and jewelry making. Digitizing such wiry objects remains a challenging problem even though several depth- and image-based methods exist for general reconstruction. Wire-based compositions are fundamentally different from other objects because they consist entirely of one-dimensional elements. A wire's structure makes it difficult to recreate digitally because of characteristics such as a lack of surface features, thin elements and severe self-occlusions arising from crisscrossing connections.

"Thin structures could not be captured in 3D before," notes Niloy Mitra, coauthor of the research and professor of geometry processing at UCL. "The problem with existing off-the-shelf computational methods is that they return an isolated set of points of the wiry object with no meaningful connectivity information."

The team's new formulation recovers the connectivity of each wire or cable, viewed from multiple angles, and captures the composition as a whole. Mitra explains, "We observe that knowing how an object is made helps to reconstruct it. So in computing wiry objects we aim to directly recover wires, rather than isolated points."

The researchers were inspired by observing piles of tangled cable (climbing rope, to be exact). To recreate the physical object digitally, they modeled 3D curve parts in sections, preferring short connections that could be assembled into the full cable (or wire) configuration. The team's method exploits two characteristics of wiry objects, sparsity (a composition contains only a few wires) and smoothness (each wire is bent smoothly), to digitally reconstruct the entire 3D wire composition.
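The paper's actual optimization is more involved, but the core idea of assembling curve sections while preferring short, smooth connections can be illustrated with a toy sketch. Everything here (the function names, the 2D setting, and the simple gap-plus-bending cost) is an illustrative assumption for exposition, not the authors' formulation:

```python
import math

def tangent(seg, at_end):
    """Unit direction of a polyline segment at its start or end."""
    p, q = (seg[-2], seg[-1]) if at_end else (seg[0], seg[1])
    dx, dy = q[0] - p[0], q[1] - p[1]
    n = math.hypot(dx, dy)
    return (dx / n, dy / n)

def join_cost(seg_a, seg_b):
    """Cost of connecting the end of seg_a to the start of seg_b:
    the gap length plus a penalty for bending at the join."""
    ax, ay = seg_a[-1]
    bx, by = seg_b[0]
    gap = math.hypot(bx - ax, by - ay)
    ta = tangent(seg_a, at_end=True)
    tb = tangent(seg_b, at_end=False)
    # Dot product of unit tangents: 1 = perfectly smooth, -1 = full reversal.
    dot = ta[0] * tb[0] + ta[1] * tb[1]
    bend_penalty = 1.0 - dot
    return gap + bend_penalty

def assemble(segments):
    """Greedily chain curve sections, always taking the cheapest
    (shortest, smoothest) join next. A stand-in for the paper's
    global optimization."""
    chain = [segments[0]]
    remaining = list(segments[1:])
    while remaining:
        best = min(remaining, key=lambda s: join_cost(chain[-1], s))
        remaining.remove(best)
        chain.append(best)
    return chain
```

For example, given a horizontal piece, the greedy chain picks the nearby collinear continuation before a distant perpendicular section, mirroring the short-and-smooth preference described above.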

In the study, the researchers demonstrated their new method on objects with varying complexity, each composed of one to three wires; the object examples included wire sculptures of elephants, birds and flowers. Compared to existing techniques, their method shows reconstructions that are much higher in resolution and that more accurately capture both the 3D geometry and the topology of the wires.

Their method has the potential to benefit a number of fields that rely on thin structures, such as topology extraction for biological neural networks from neuroimages. The new method could also be applied in medical imaging, where reconstruction of thin structures from a few views is required.

###

The research was supported by the European Research Council (ERC) and the Research Grant Council of Hong Kong. The team is comprised of Lingjie Liu of University of Hong Kong and UCL; Duygu Ceylan of Adobe Research; Cheng Lin and Wenping Wang of University of Hong Kong; and Niloy J. Mitra of UCL.

The annual SIGGRAPH conference is a five-day interdisciplinary educational experience in the latest computer graphics and interactive techniques, including a three-day commercial exhibition that attracts hundreds of companies from around the world. The conference also hosts the international SIGGRAPH Computer Animation Festival, showcasing works from the world's most innovative and accomplished digital film and video creators. Juried and curated content includes outstanding achievements in time-based art, scientific visualization, visual effects, real-time graphics, and narrative shorts. SIGGRAPH 2017 will take place from 30 July to 3 August 2017 in Los Angeles. Visit the SIGGRAPH 2017 website or follow SIGGRAPH on Facebook, Twitter, YouTube, or Instagram for more detailed information.

About ACM SIGGRAPH

The ACM Special Interest Group on Computer Graphics and Interactive Techniques is an interdisciplinary community interested in research, technology, and applications in computer graphics and interactive techniques. Members include researchers, developers, and users from the technical, academic, business, and art communities. ACM SIGGRAPH enriches the computer graphics and interactive techniques community year-round through its conferences, global network of professional and student chapters, publications, and educational activities.

About ACM

ACM, the Association for Computing Machinery, is the world's largest educational and scientific computing society, uniting educators, researchers, and professionals to inspire dialogue, share resources, and address the field's challenges. ACM strengthens the computing profession's collective voice through strong leadership, promotion of the highest standards, and recognition of technical excellence. ACM supports the professional growth of its members by providing opportunities for lifelong learning, career development, and professional networking.

About UCL (University College London)

UCL was founded in 1826. We were the first English university established after Oxford and Cambridge, the first to open up university education to those previously excluded from it, and the first to provide systematic teaching of law, architecture and medicine. We are among the world's top universities, as reflected by performance in a range of international rankings and tables. UCL currently has over 39,000 students from 150 countries and over 12,500 staff.

Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.