Thursday, 12 July 2012

Project Geppetto (v2) Update for 3dsMax 2013

What better way to start the day than with a free technology preview?

Today brings an important update to Project Geppetto (now at version 2), which many of you have recently explored. The great news is that this technology continues to evolve rapidly based on your experiences with it. (It currently holds a rating of 4.7 out of 5 from users like you.)

Autodesk recently acquired the Evolver character-generation technology, along with several of the employees behind it. We have paired Evolver with Project Geppetto to efficiently create large numbers of randomly varied visual styles for Project Geppetto actors. Evolver uses a "virtual gene pool": a set of pre-designed body types, facial features and clothing components that are combined to generate the variety needed for a crowd. Facial features (overall face, eyes, ears, mouth, chin, nose and cheeks) and physical attributes (overall body, arms, shoulders, chest, stomach, hips and legs) can be customized along with skin tone, eye colour and hair. Given additional facial, body and clothing data, the Evolver component can be customized and extended to generate crowds of any culture or style.
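To make the "virtual gene pool" idea concrete, here is a minimal, purely illustrative sketch of random component combination. None of these category or variant names come from the actual Evolver asset library or API; they are invented placeholders for how pre-designed components can be mixed to produce crowd variety:

```python
import random

# Hypothetical gene pool: each category holds pre-designed variants.
# These names are invented for illustration only; the real Evolver
# component library is far richer and not reproduced here.
GENE_POOL = {
    "body":      ["slim", "average", "heavy", "athletic"],
    "face":      ["round", "oval", "angular"],
    "hair":      ["short", "long", "bald", "curly"],
    "skin_tone": ["light", "medium", "dark"],
    "clothing":  ["casual", "business", "sporty"],
}

def generate_actor(rng):
    """Combine one variant from each category into one crowd actor."""
    return {category: rng.choice(variants)
            for category, variants in GENE_POOL.items()}

def generate_crowd(n, seed=None):
    """Generate n randomly varied actors (seed for reproducibility)."""
    rng = random.Random(seed)
    return [generate_actor(rng) for _ in range(n)]

crowd = generate_crowd(100, seed=42)
```

Even this toy pool yields 4 × 3 × 4 × 3 × 3 = 432 distinct combinations; a production system that also morphs continuous facial and body parameters multiplies that variety enormously.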

As always, your feedback is incredibly important to us at Autodesk. Here is how you can help us with this technology:

It’s also important to read the background info from the Autodesk team responsible for this preview tech.

“…Autodesk has been researching the underlying technology behind Project Geppetto for over five years. The technique is more sophisticated than simple blending techniques that result in the awkward and implausible motions used in video games. Project Geppetto is based on a fundamentally new approach to how motion data is processed and applied to characters. Motion data from key frames or motion capture clips are synthesized in such a way that variations of the original performances can be interactively applied with a high degree of quality. The process of working with the data is akin to training your characters with performance repertoires. We're calling this collection of motion that gets processed an "Ocean of Motion" to represent how different our approach is.

Importantly, the approach we've taken is not specific to human motions or to crowds. Given the right data, Project Geppetto could control dogs, snakes, dragons, cars, etc., and with a more directorial, itinerary-based interface, Project Geppetto could be used to block in individual "hero" animation. Because the technology is data-driven, its capabilities are limited only by the amount and kind of motion data to which it has access.

Project Geppetto technology can currently solve some of the following problems:

Agile responses: Real-time triggering of physically believable agile motions, such as quick turns. This is required for collision avoidance and navigation.

Object interaction: Seamless and natural real-time interaction with objects in the environment, such as sitting in chairs or stepping up to climb stairs.

Intelligent "human-like" dynamic obstacle avoidance: The perception of potential collisions and subsequent evasive actions must mimic the response times and behaviour of real human beings.

Intuitive crowd orchestration: New methods will be introduced that allow artists to directly control the flow and interaction of traffic patterns.

Ease of use plays a major role in the design of the intended workflow. Characters are orchestrated in an intuitive high-level fashion through the manipulation of flow patterns, goals, and designated behaviours. The tools are geared toward controls that are fun to use, and accessible to "non-animators," but not at the expense of serious artistic control.”
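Geppetto's actual solver is data-driven and proprietary, so the sketch below is only the classic steering formulation in which goal seeking plus obstacle avoidance are usually framed, not the Geppetto approach itself. A real motion-synthesis system would select and blend captured performance clips rather than integrate a point velocity:

```python
import math

def steer(pos, vel, goal, obstacles, max_speed=1.5, avoid_radius=2.0):
    """One step of toy goal-seeking with obstacle repulsion.

    A generic steering sketch, not Geppetto's data-driven solver.
    pos, goal: (x, y) tuples; obstacles: list of (x, y) tuples.
    Returns the new (pos, vel).
    """
    # Attraction: unit vector toward the goal.
    dx, dy = goal[0] - pos[0], goal[1] - pos[1]
    dist = math.hypot(dx, dy) or 1e-9
    ax, ay = dx / dist, dy / dist

    # Repulsion: push away from any obstacle inside avoid_radius,
    # scaled up as the obstacle gets closer.
    for ox, oy in obstacles:
        rx, ry = pos[0] - ox, pos[1] - oy
        d = math.hypot(rx, ry)
        if 1e-9 < d < avoid_radius:
            push = (avoid_radius - d) / avoid_radius
            ax += push * rx / d
            ay += push * ry / d

    # Normalize to max_speed and integrate one small time step.
    a = math.hypot(ax, ay) or 1e-9
    vx, vy = max_speed * ax / a, max_speed * ay / a
    return (pos[0] + vx * 0.1, pos[1] + vy * 0.1), (vx, vy)
```

Calling this in a loop walks an agent toward its goal while bending its path around nearby obstacles; the "intelligent" versions described above additionally mimic human reaction times and motion quality, which a point-mass model cannot capture.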

So how can I download this?

Visit the Autodesk Labs site by clicking here. The download link is at the top of the main page. If you’ve never used Project Geppetto before you should visit this “getting started” section.
