
Actors Christopher Shyer and Morena Baccarin on the greenscreen set of ABC’s V; the virtual set is overlaid.

Visual effects professionals refer to the chain of processes and technologies used to produce an effects shot as a “pipeline,” a term borrowed both from traditional manufacturing and from computer architecture.

In the past year, Zoic Studios has developed a unique pipeline product called ZEUS. The showiest of ZEUS’ capabilities is to allow filmmakers on a greenscreen set to view the real-time rendered virtual set during shooting; but ZEUS does far more than that.

Zoic Studios pipeline supervisor Mike Romey explains that the pipeline that would become ZEUS was originally developed for the ABC science fiction series V. “We realized working on the pilot that we needed to create a huge number of virtual sets. That led us to try to find different components we could assemble and bind together, that could give us a pipeline that would let us successfully manage the volume of virtual set work we were doing for V. And, while ZEUS is a pipeline that was built to support virtual sets for V, it also fulfills the needs of our studio at large, for every aspect of production.

“One of its components is the Lightcraft virtual set tracking system, which is itself a pipeline of different components. These include InterSense motion tracking, various specialized NVIDIA graphics cards for I/O, and custom inertial sensors providing rotational data for the camera.

“Out of the box, we liked the Lightcraft product the most. We proceeded to build a pipeline around it that could support it.

“Our studio uses a program called Shotgun, a general-purpose database system geared for project shot management, and we were able to tailor it to support the virtual set tracking technology. By coming up with custom tools, we were able to take the on-set data, use Shotgun as a means to manage it, then lean on Shotgun to retrieve the data for custom tools throughout our pipeline. When an artist needed to set up or lay out a scene, we built tools to query Shotgun for the current plate, the current composite that was done on set, the current asset, and the current tracking data; and align them all to the timecode based on editorial selects. Shotgun was where the data was all stored, but we used Autodesk Maya as the conduit for the 3D data – we were then able to make custom tools that transport all the layout scenes from Maya to The Foundry’s Nuke compositing software.”
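The lookup Romey describes — pulling the current plate, on-set comp, and tracking data for a shot so layout tools can load them together — can be sketched as below. Real Shotgun access goes through the `shotgun_api3` client; here plain dicts stand in for Shotgun records, and the record fields (`shot`, `kind`, `version`, `path`) are hypothetical stand-ins for whatever custom fields a studio would configure.

```python
def gather_shot_elements(shot_code, records):
    """Collect the latest plate, on-set comp, and tracking data for one
    shot, keyed by element type, so a layout tool can load them together."""
    elements = {}
    for rec in records:
        if rec["shot"] != shot_code:
            continue
        kind = rec["kind"]
        # Keep only the highest version of each element type.
        if kind not in elements or rec["version"] > elements[kind]["version"]:
            elements[kind] = rec
    return elements

# Illustrative records; a real query would come back from the database.
records = [
    {"shot": "V_101_010", "kind": "plate", "version": 2, "path": "/plates/v2"},
    {"shot": "V_101_010", "kind": "plate", "version": 3, "path": "/plates/v3"},
    {"shot": "V_101_010", "kind": "tracking", "version": 1, "path": "/track/v1"},
]

current = gather_shot_elements("V_101_010", records)
print(current["plate"]["path"])  # → /plates/v3 (latest plate version wins)
```

In the pipeline described, each element would then be aligned to the same timecode range before handoff to Maya or Nuke.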

By offloading a lot of the 3D production onto 2D, we were able to cut the cost-per-shot.

Romey explains the rationale behind creating 3D scenes in Nuke. “When you look at these episodic shows, there’s a large volume of shots that are close-up, and a smaller percentage of establishing shots; so we could use Nuke’s compositing application to actually do our 3D rendering. In Maya we would be rendering a traditional raytrace pipeline; but for Nuke we could render a scanline pipeline, which didn’t have the same overhead. Also, this would give the compositing team immediate access to the tools they need to composite the shot faster, and it let them be responsible for a lot of the close-up shots. Then our 3D team would be responsible for the establishing shots, which we knew had quality requirements a scanline render couldn’t meet.

“By offloading a lot of the 3D production onto 2D, we were able to cut the cost-per-shot, because we didn’t have to provide the 3D support necessary. That’s how the ZEUS pipeline evolved, with that premise – how do we meet our client’s costs and exceed their visual expectations, without breaking the bank? Throughout the ZEUS pipeline, with everything that we did, we tried to find methodologies that would shave off time, increase quality, and return a better product to the client.

“One of the avenues we R&Ded to cut costs was the I/O time. We found that we were doing many shots that required multiple plates. A new component we looked at was a product that had just been released, called Ki Pro from AJA.

“When I heard about this product, I immediately contacted AJA and explained our pipeline. We have a lot of on-set data – we have the tracking data being acquired, the greenscreen plate, a composite, and potentially the key being acquired. The problem is that when we went back to production, the I/O time associated with managing all the different plates became astronomical.

“Instead of running a Panasonic D5 deck to record the footage, we could use the Ki Pro, which is essentially a tapeless deck, on-set to record directly to Apple ProRes codecs. The units were cost effective – they were about $4,000 per unit – so we could set up multiple units on stage, and trigger them to record, sync and build plates that all were the exact same length, which directly corresponded to our tracking data.”

We found methodologies that would shave off time, increase quality, and return a better product to the client.

Previously, the timecode would be lost when Editorial made their selects, and would have to be reestablished. “That became a very problematic process, which would take human intervention to do — there was a lot of possibility for human error. By introducing multiple Ki Pros into the pipeline, we could record each plate, and take that back home, make sure the layout was working, and then wait for the editorial select.” The timecode from the set was preserved.

“The ZEUS pipeline is really about a relationship of image sequence to timecode. Any time that relationship is broken, or becomes more convoluted or complicated to reestablish, it introduces more human error. By removing human error from the process, we’re able to control our costs. We can offer this pipeline to clients who need the Apple ProRes 422 codec, and at the end of the day we can take the line item of I/O time and costs, and dramatically reduce it.”
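The image-sequence-to-timecode relationship at the heart of this can be sketched with a pair of conversion helpers. This is a minimal illustration assuming non-drop-frame SMPTE timecode at 24 fps; the function names are mine, not Zoic's.

```python
def timecode_to_frame(tc, fps=24):
    """Convert a non-drop-frame SMPTE timecode string HH:MM:SS:FF
    to an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def frame_to_timecode(frame, fps=24):
    """Inverse of timecode_to_frame."""
    ff = frame % fps
    total_seconds = frame // fps
    mm, ss = divmod(total_seconds, 60)
    hh, mm = divmod(mm, 60)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

# As long as the set timecode is preserved, an editorial select maps
# straight back to frames of the plate and tracking data:
start = timecode_to_frame("01:00:02:12")
print(frame_to_timecode(start + 48))  # → 01:00:04:12, two seconds later
```

Once that mapping is broken — say, by re-ingesting plates without their original timecode — every shot needs the manual reconciliation Romey describes.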

Another important component is Python, the general-purpose high-level programming language. “Our pipeline is growing faster than we can train people to use it. The reason we were able to build the ZEUS pipeline the way we have, and build it out within a month’s time, is because we opted to use tools like Python. It has given us the ability to quickly and iteratively develop tools that respond proactively to production.

“One case in point – when we first started working with the tracking data for V, we quickly realized it didn’t meet our needs. We were using open source formats such as COLLADA, which are XML scene files that stored the timecode. We needed custom tools to trim, refine and ingest the COLLADA data into our Shotgun database, into the Maya cameras, into the Nuke preferences and Nuke scenes. Python gave us the ability to do that. It’s the glue that binds our studio.
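A trim-and-ingest tool of the kind Romey mentions can be sketched with the standard library's XML parser. Real COLLADA stores animation in `<float_array>` elements inside a much larger schema; the tiny document and tag names below are simplified stand-ins for illustration only.

```python
import xml.etree.ElementTree as ET

# Simplified stand-in for an on-set COLLADA camera-tracking export.
doc = """
<scene>
  <camera_samples>
    <sample frame="100" tx="0.0"/>
    <sample frame="101" tx="0.1"/>
    <sample frame="102" tx="0.2"/>
    <sample frame="103" tx="0.3"/>
  </camera_samples>
</scene>
"""

def trim_samples(xml_text, first, last):
    """Keep only the camera samples inside [first, last], as a tool would
    before ingesting the range into a database or a Maya/Nuke scene."""
    root = ET.fromstring(xml_text)
    kept = []
    for sample in root.iter("sample"):
        frame = int(sample.get("frame"))
        if first <= frame <= last:
            kept.append((frame, float(sample.get("tx"))))
    return kept

print(trim_samples(doc, 101, 102))  # → [(101, 0.1), (102, 0.2)]
```

Because COLLADA is plain XML, this kind of trimming, refining, and re-export is a few lines of Python — which is presumably why it served as the glue format here.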

“While most components in our pipeline are interchangeable, I would argue that Python is the one component that is irreplaceable. The ability to iteratively make changes on the fly during an episode could not have been deployed and developed using other tools. It would not have been as successful, and I think it would have taken a larger development team. We don’t have a year to do production, like Avatar – we have weeks. And we don’t have a team of developers, we have one or two.

While most components in our pipeline are interchangeable, Python is the one component that is irreplaceable.

“We’re kind of new to the pipeline game. We’ve only been doing a large amount of pipeline development for two years. What we’ve done is taken some rigid steps, to carve out our pipeline in such a way that when we build a tool, it can be shared across the studio.”

Romey expects great things from ZEUS in the future. “We’re currently working on an entire episodic season using ZEUS. We’re working out the kinks. From time to time there are little issues and hiccups, but that’s traditional for developing and growing a pipeline. What we’ve found is that our studio is tackling more advanced technical topics – we’re doing things like motion capture and HDR on-set tracking. We’re making sure that we have a consistent and precise road map of how everything applies in our pipeline.

“With ZEUS, we’ve come up with new ways that motion capture pipelines can work. In the future we’d like to be able to provide our clients with a way not only to be on set and see what the virtual set looks like, while the director is working — but what if the director could be on set with the virtual set, with the actor in the motion capture suit, and see the actual CG character, all in context, in real-time, on stage? Multiple characters! What if we had background characters that were all creatures, and foreground characters that were people, interacting? Quite honestly, given the technology of Lightcraft and our ability to do strong depth-of-field, we could do CG characters close-to-final on stage. I think that’s where we’d like the ZEUS pipeline to go in the future.

“Similar pipelines have been done for other productions. But in my experience, a lot of times they are one-off pipelines. ZEUS is not a pipeline just for one show; it’s a pipeline for our studio.

“It’s cost effective, and we think we can get the price point to meet the needs of all our clients, including clients with smaller budgets, like webisodes. The idea of doing an Avatar-like production for a webisode is a stretch; but if we build our pipeline in such a way that we can support it, we can find new clients, and provide them with a better product.

“Our main goal with ZEUS was to find ways to make that kind of pipeline economical, to make it grow and mature. We’ve treated every single component in the pipeline as a dependency that can be interchanged if it doesn’t meet our needs, and we’re willing to do so until we get the results that we need.”