Capture | Compute | Create

March 2015

03/26/2015

Kansas City, Missouri, is in the midst of constructing a new two-mile streetcar line through the city's central business district. The line is envisioned as the trunk of a wider system of streetcar routes, and Reality Computing is playing an important role in the transit planning process for that expansion.

For several years, the city has engaged the public, neighborhood groups, local businesses, and other interested organizations in shaping the proposed expansion. To support this effort, the city and its project team, led by Kansas City architectural firm BNIM, are creating a 3D project model for early planning and visualization. The team is using Reality Computing and Autodesk software to collect and combine data sources describing existing conditions within the expansion area, and to merge that information with preliminary streetcar design data. Existing-conditions sources include point clouds from aerial laser scanning (LiDAR), utility and GIS data, and city planning data covering permitting, land values, economic impact, and so on.

Once the existing conditions and design data are combined into an aggregated model, the team will use that model as a baseline for analysis, enabling the city to track and trend growth in the project area. The team will also use model-based visualizations to show the projected build-out for the corridor. As development proposals become available, 3D models of proposed buildings will be added to the project model to create renderings and animations that help city officials and the public understand the impact of new development.
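The core of this aggregation step is getting every data source into one shared project coordinate frame before combining it. The sketch below illustrates the idea in Python with NumPy; the coordinates and the simple translate-and-stack approach are hypothetical, not the project team's actual workflow, which relies on Autodesk software.

```python
import numpy as np

def to_project_frame(points, origin):
    """Translate a set of 3D points into a shared project coordinate
    frame by subtracting the source data's origin offset."""
    return np.asarray(points, dtype=float) - np.asarray(origin, dtype=float)

def aggregate(*point_sets):
    """Stack already-aligned point sets into one combined cloud."""
    return np.vstack(point_sets)

# Hypothetical data: aerial LiDAR points in a regional coordinate frame,
# and preliminary design points already expressed near the project origin.
lidar = np.array([[1000.0, 2000.0, 50.0],
                  [1001.0, 2001.0, 51.0]])
design = np.array([[0.5, 0.5, 0.0]])

# Shift the LiDAR data so both sets share one origin, then merge.
aligned_lidar = to_project_frame(lidar, origin=[1000.0, 2000.0, 50.0])
model = aggregate(aligned_lidar, design)
print(model.shape)  # (3, 3)
```

In practice each source (LiDAR, GIS, design) arrives in its own coordinate system and units, so this alignment step is what makes a single queryable baseline model possible.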

03/20/2015

On February 25-26, the Reality Solutions group hosted REAL2015, a first-of-its-kind conference at Fort Mason, San Francisco, focused on the growing use of 3D to Capture (scan), Compute (design), and Create (fabricate). It was an inspiring event where speakers shared compelling stories about how they leverage 3D to create incredible products, artwork, designs, structures, exhibitions, and visualizations.

The kickoff session, “REAL Stories,” featured several renowned international speakers, including Eythor Bender of UNYQ talking about the design and fabrication of artistic custom prosthetic leg fairings; Stuart Brown describing how he used 3D to reproduce exotic classic cars; Tim Zaman of Delft University of Technology using 3D scanning to reproduce famous master paintings, including works by Rembrandt; and Sarah Kenderdine of the University of New South Wales in Australia showing examples of interactive visualizations for museum exhibitions.

Over the next two days, 60 world-class speakers continued their tales of how they are using 3D to pave the way to the future. Tim Webber, Chief Creative Officer at Framestore and winner of the 2014 Academy Award for visual effects for the 2013 film Gravity, gave a mesmerizing talk on the making of the film.

REAL2015's exhibition featured many commercial hardware and software OEMs, including Leica, FARO, Topcon, ClearEdge, and LFM. Other firms included Matterport, Floored, Occipital, and augmented reality toolmaker Metaio. One of the most popular exhibits was Artec's large Shapify 3D scanning booth, where long lines formed for the opportunity to get a full-body scan. People also waited at the Oculus Rift booth to experience the wonder of VR by exploring a virtual ancient Chinese tomb. Another big hit was the drone cage, where vendors like DJI flew their latest and greatest drones. Autodesk's own 3D printer, Ember, also made an appearance.

The conference culminated on a high note as REAL2015 hosted the SFVR Meet-Up on the last night. Hundreds of people descended on Fort Mason to experience over a dozen different technology demo booths showcasing VR technology.

REAL2015 was an amazing, first-of-its-kind event. It brought together 60 world-class speakers, 50 Expo demos, and over 1,000 attendees from 600 companies and institutions across the globe to enlighten and excite us about how 3D is changing the way we capture, compute, and create.

All of the conference presentations were video recorded and will be available on the Reality Solutions REAL2015 video playlist soon.

Did past authors or screenwriters predict Reality Computing technology? At least one author did: Cory Doctorow.

His novel Makers (released in 2009, but written in the aftermath of the dot-com crash) is a near-future story about two friends working from a garage who invent a cure for obesity, crowd-sourced theme parks, and…easy 3D printing! Check out the passage from the beginning of Part 1, which perfectly describes the capture, compute, and create aspects of Reality Computing.

In the passage, the inventor-entrepreneurs demonstrate to a technology reporter how they create some of their unique artwork. First they use a 3D scanning machine to scan a Barbie doll head (capture); then they drape a bitmap version of a Campbell's Cream of Mushroom soup label (of all things!) over Barbie's digital head (compute); and finally they use a 3D printer to produce the finished artwork (create).
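The "compute" step in that pipeline (draping a flat bitmap over a scanned 3D head) amounts to assigning each mesh vertex a texture coordinate. A minimal sketch of one common approach, spherical UV projection, is below; the vertices and tiny "label" bitmap are made-up stand-ins, and real tools use far more sophisticated unwrapping.

```python
import numpy as np

def spherical_uv(vertices):
    """Map 3D mesh vertices to (u, v) texture coordinates in [0, 1]
    via a simple spherical projection around the mesh centroid."""
    v = np.asarray(vertices, dtype=float)
    centered = v - v.mean(axis=0)
    x, y, z = centered[:, 0], centered[:, 1], centered[:, 2]
    r = np.linalg.norm(centered, axis=1)
    r = np.where(r == 0, 1.0, r)                       # avoid divide-by-zero
    u = 0.5 + np.arctan2(y, x) / (2 * np.pi)           # longitude -> u
    w = 0.5 + np.arcsin(np.clip(z / r, -1, 1)) / np.pi # latitude  -> v
    return np.column_stack([u, w])

def sample_texture(bitmap, uv):
    """Look up a color for each (u, v) pair in an H x W x 3 bitmap."""
    h, w = bitmap.shape[:2]
    cols = np.clip((uv[:, 0] * (w - 1)).astype(int), 0, w - 1)
    rows = np.clip((uv[:, 1] * (h - 1)).astype(int), 0, h - 1)
    return bitmap[rows, cols]

# Hypothetical scanned vertices and a tiny 2x2 "soup label" bitmap.
verts = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0],
                  [0.0, 0.0, 1.0], [-1.0, 0.0, 0.0]])
label = np.array([[[255, 0, 0], [0, 255, 0]],
                  [[0, 0, 255], [255, 255, 0]]], dtype=np.uint8)

uv = spherical_uv(verts)        # each vertex gets a spot on the label
colors = sample_texture(label, uv)
```

Once every vertex has a color (or a UV pair for the full-resolution image), the textured mesh can go straight to a 3D printer that supports color, completing the capture-compute-create loop.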

As an aside, you may enjoy reading the whole book, which one review describes as “a combination of business strategy, brilliant product ideas and laugh-out-loud moments of insight [that] will keep readers powering through this quick-moving tale.” The novel can be downloaded for free from the author's website (using a Creative Commons license agreement) and also has been published in traditional paper form.

03/04/2015

Most Reality Computing capture technologies are based on the concept of capturing surface data—whether it’s 3D laser scanning the outside of a building to inspect for structural damage, or using digital photogrammetry to produce accurate models of your feet for custom-sized shoes.

But what about technology for capturing volume data, such as scanners that can see inside or through a surface? Think of devices that can look into your body or see through walls. Check out the article descriptions below to learn how Reality Computing based on volume capture technology is poised to break out of the realm of science fiction and superheroes.

3D Medical Imaging

3D medical imaging technology already exists, and many of us have experienced it in the form of an ultrasound. But ultrasound equipment is costly and relatively bulky: most machines include a probe that sends and receives the sound waves, a computer that processes the ultrasound data and performs all the calculations, a keyboard to control the computer, and a screen to display the resulting image.

But entrepreneur Jonathan Rothberg has raised $100 million to create a portable, very inexpensive ultrasound imaging device. The system, being developed by Rothberg's startup Butterfly Network, is apparently a very compact scanner with a screen that can create 3D images in real time and perform preliminary diagnostics using cloud-based pattern-finding software. The patent application says the device lets you “look through what appears to be a window” into a person's body. It sounds like equipment that could be found in the sickbay of the starship Enterprise…

Seeing Through Walls

Researchers at MIT's Lincoln Laboratory are making strides in using radar technology to see through solid walls and display objects on the other side. The 'x-ray vision' technology gives you a real-time video feed of what's going on behind a concrete wall, which would be extremely useful for soldiers, police, or emergency rescue personnel. The system sends out radio waves, most of which are absorbed by the wall; the few that get through bounce off objects on the other side and travel back. The wall again traps most of the returning waves, but enough make it back to be detected. The system processes the returns and displays only moving objects, so it ignores static items such as furniture. Right now, the objects appear as colorful blobs moving on the screen, giving you a bird's-eye-view of what's behind the wall, but image clarity is sure to improve as the technology progresses.
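The "display only moving objects" step is, at its simplest, frame differencing: anything that looks the same from one radar frame to the next is treated as clutter and dropped. The sketch below shows that idea on made-up 2D intensity frames; the real Lincoln Lab system is of course far more sophisticated.

```python
import numpy as np

def moving_objects(frames, threshold=0.5):
    """Given a sequence of 2D intensity frames reconstructed from radar
    returns, keep only the pixels that change between consecutive frames,
    a crude stand-in for moving-target filtering."""
    frames = np.asarray(frames, dtype=float)
    diffs = np.abs(np.diff(frames, axis=0))  # frame-to-frame change
    return diffs > threshold                 # True where something moved

# Hypothetical frames: static 'furniture' at (0, 0) in both frames,
# and a mover that shifts from (1, 1) to (1, 2).
f0 = np.zeros((3, 4)); f0[0, 0] = 5.0; f0[1, 1] = 3.0
f1 = np.zeros((3, 4)); f1[0, 0] = 5.0; f1[1, 2] = 3.0

mask = moving_objects([f0, f1])
# mask[0] flags the old and new mover positions; the furniture stays False.
```

Filtering on change rather than absolute signal strength is what lets the system ignore strong but stationary reflectors like walls and furniture.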

For more information on this new volume capture technology, check out these articles:

Boston Globe article about similar research by scientists at MIT and the University of Washington who are developing a “Wi-Vi” system that enables a Wi-Fi device to see through walls and detect motion.