Ghostly Guests

A joint venture between the digital-content companies Gribouille and Dimensional Media, Virtual Characters uses its proprietary technology to project computer-generated actors into free space without the use of virtual-reality headsets or special glasses. The projection technology itself, says Nathan, consists of high-end software and a "complicated array" of optical filters, beam splitters, mirrors, and lenses in various configurations.

Although the technology produces an image that appears holographic, the new display medium differs from holography in that the light reaching the viewer's eye is not reflected off a flat surface, as it is with laser-generated holograms. Instead, the optical/mirror-based system projects images into a viewing space from behind a wall. "With our technology, there are no lasers, thus none of the inevitable distortion that limits their applicability," says Nathan. "We project high-resolution computer-generated characters that users can see floating in space in front of them, without the use of special screens, head-mounted displays, or stereo glasses."

To project these "free-floating" images, the array of optical hardware is configured to collect light rays from a video source and then reassemble and project the aggregate rays to a point in free space.

A character appears to float in space, thanks to a new display technology in which an array of optical hardware collects computer graphics data from a video source and projects it into a 3D view field.
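The geometry behind this idea can be sketched in a few lines. This is an illustrative model only, not the company's actual optics: each ray leaving a flat video source is re-aimed so that all rays converge at a single point in free space, where the viewer perceives the image.

```python
import numpy as np

def redirect_rays(ray_origins, focal_point):
    """Return unit direction vectors aiming each collected ray at focal_point."""
    directions = focal_point - ray_origins           # vector from each origin to the target
    norms = np.linalg.norm(directions, axis=1, keepdims=True)
    return directions / norms                        # normalize to unit length

# Rays leaving a flat source plane at z = 0
origins = np.array([[-1.0, 0.0, 0.0],
                    [ 0.0, 1.0, 0.0],
                    [ 1.0, 0.0, 0.0]])
focal = np.array([0.0, 0.0, 2.0])                    # point in free space, 2 units out

dirs = redirect_rays(origins, focal)

# Marching each ray along its direction reaches the same convergence point
t = np.linalg.norm(focal - origins, axis=1)          # distance from each origin
hits = origins + dirs * t[:, None]
assert np.allclose(hits, focal)                      # all rays converge in free space
```

In the real system this redirection is done passively by the mirror-and-lens array rather than in software, but the convergence of many rays at one point is what makes the image appear to hang in open air.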

The uniqueness of the medium impacts not only the ultimate display, but also the character-creation process, says Nathan. "We are always conscious that the character will be projected into free space, and therefore we're mindful to make use of the 3D aspects by having the virtual character act in ways that utilize the full dimensional scope." For instance, he says, the character can be on "idle loop" looking around the room and, when a person approaches, the virtual character can "look" directly at the newcomer, thanks to a motion-sensing system that makes the visitor's presence known.
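The idle-loop behavior Nathan describes amounts to a small state machine driven by a sensor event. Here is a minimal sketch of that logic; the class, method names, and sensor interface are hypothetical stand-ins, not the company's software.

```python
class VirtualCharacter:
    """Toy state machine: idle until a motion sensor reports a visitor."""

    def __init__(self):
        self.state = "idle"

    def update(self, visitor_position=None):
        # A motion-sensor reading switches the character out of its idle loop
        if visitor_position is not None:
            self.state = "engaged"
            return f"look at {visitor_position}"
        self.state = "idle"
        return "scan room"            # idle loop: glance around the viewing space

character = VirtualCharacter()
print(character.update())                          # no visitor -> "scan room"
print(character.update(visitor_position="left"))   # sensor fires -> "look at left"
```

The character "makes eye contact" simply by switching animation states the moment the sensor reports a position, which is what gives the projected figure its apparent awareness of the room.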

The projection system itself consists of hardware with no moving parts. The dynamic digital content that is projected through the hardware can come from various sources. "If all we want is a customized one-minute presentation on a loop, then the content will come off of a DVD running through the optical array, which still allows us the flexibility to add motion sensors or to update the content by replacing the DVD," says Nathan.

A free-floating virtual character stops what he's "doing" in order to make eye contact with a newcomer, whose presence has been made known through motion sensing. The 3D apparition is visible to the newcomer without the use of special screens or head-mounted displays.

More flexibility is allowed, however, by incorporating computer-controls into the display process, Nathan says. "In essence, this allows us to update a virtual character from a remote location at any time, to network characters across the world, and to add such functionality as voice recognition and interactivity," all of which are on the company's agenda.
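The remote-update capability Nathan mentions can be thought of as a version check against a networked manifest. The sketch below is purely hypothetical; the field names and versioning scheme are invented for illustration.

```python
def apply_remote_update(local, remote_manifest):
    """Return the content entry to play: the remote one if it is newer."""
    if remote_manifest["version"] > local["version"]:
        return remote_manifest        # newer character pushed from a remote site
    return local                      # keep playing the current loop

local = {"version": 1, "clip": "greeter_v1"}
remote = {"version": 2, "clip": "greeter_v2"}
print(apply_remote_update(local, remote)["clip"])    # -> greeter_v2
```

Under computer control, the same comparison could run on displays anywhere in the world, which is what makes networked and remotely refreshed characters feasible where a DVD loop is not.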

The developers' long-term objective with this technology is to realize the world's first "true vision" of virtual reality. "Our goal is to create a world where our advanced virtual characters will realistically and seamlessly interact with human beings," says Nathan. "We want to have unique personalized characters everywhere from the office to the living room. If we take this as our end point, the development challenges we face are everything between what we've already accomplished and that."

Among these challenges are improving the functionality of the hardware and the realism of the software content. "We are currently working on characters with broader emotional range, sophistication, and realism, so they will be more believable and engaging," says Nathan. The characters themselves are created using a range of commercially available modeling and rendering programs enhanced with proprietary add-ons. "Typically, we use a combination of off-the-shelf tools and proprietary software, depending on the needs of the character."

Nathan predicts the technology will be well suited to a broad range of applications, some of which are already being realized. These include advertising, information kiosks, fashion displays, location-based entertainment, retail, and, ultimately, real-time PC-based interactive experiences that totally immerse the consumer.

Characters that have been designed for projection into "free" space could ultimately be used as teachers or actors.

In addition, Nathan predicts, "with advancing bandwidth and computational power, we can eventually have virtual teachers educating children in school, virtual celebrity greeters at bars and restaurants, and virtual assistants in your home and office organizing your day and giving you the latest information."

These possibilities demonstrate the changing face of communications, says Nathan, "to a point where technology will communicate with people in a more human-like way."


CGW is the only publication that has exclusively served the CG industry for over 40 years. Each month we deliver coverage of the cutting-edge technology used in the latest animation, VFX, 3D, game-development, film, CAD, and medical work.
