Ptex 3D texturing becomes a reality at SIGGRAPH

A new age has dawned for 3D texturing thanks to the appearance of Ptex.

In my 3D modeling and texturing article, I mentioned that a lot of the time involved in 3D texturing is spent dealing with UVs, the coordinate system that all 3D applications use for applying textures to models. It's not a good system because you have to create UVs manually, like dressing a model with a flat cloth and some scissors, so UV-mapping complex shapes is very tedious. Then there's the problem of seams, especially when bump and displacement maps are involved. And you often have to redo UVs at the end of sculpting because they've been stretched and compressed by the movement of polygons. You're then forced to bake your textures from a bad-UV model to a good-UV model, leaving you with a mountain of cruft: old meshes, new meshes, old textures, new textures. It's a headache all around.
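To make the idea concrete, here's a minimal sketch (names and the tiny texture are hypothetical, not from any real 3D package) of what a UV lookup does: a coordinate pair in [0, 1] selects a texel from a flat image, which is exactly the mapping artists have to author by hand when they unwrap a model.

```python
# Minimal sketch of UV texture lookup (all names hypothetical).
# A texture is a H x W grid of texels; u and v in [0, 1] map to a column and row.

def sample_nearest(texture, u, v):
    """Nearest-neighbor lookup of a texel for UV coordinates (u, v)."""
    h = len(texture)     # rows
    w = len(texture[0])  # columns
    # Clamp to [0, 1] so stretched or out-of-range UVs don't read out of bounds.
    u = min(max(u, 0.0), 1.0)
    v = min(max(v, 0.0), 1.0)
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    return texture[y][x]

# A tiny 2x2 texture: top row red-ish, bottom row blue-ish.
tex = [
    [(255, 0, 0), (200, 0, 0)],
    [(0, 0, 255), (0, 0, 200)],
]
print(sample_nearest(tex, 0.1, 0.1))  # top-left texel: (255, 0, 0)
print(sample_nearest(tex, 0.9, 0.9))  # bottom-right texel: (0, 0, 200)
```

The lookup itself is trivial; the painful part is deciding which (u, v) pair every vertex of the model gets, which is the manual unwrapping work described above.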

This is where Ptex comes in. Developed by Brent Burley at Walt Disney Animation Studios, Ptex generated a ton of buzz a couple of years ago with its simple promise: no more UVs and no more headaches. It was like someone saying “self-cleaning apartment”—everyone wanted in. With Ptex, textures are parametrically stored per polygonal face and there are no visible seams.

The famous drool-worthy image released by Disney showed Ptex's automatic and seamless texture coordinate system in action:

Rawr.

If that's hard to appreciate, look closely at the top base model: the displaced bumps are actually texture tiles, joined at the edge of each face—with no seam. It's kind of like a person laying pieces of arbitrarily-sized tiling, but you can't tell it's not one piece of bumpy floor.

If desire had a file extension, it would be .ptx

And there was no user interaction needed to create the map—you simply told the app to make a Ptex texture and painted away. Other perks of the Ptex format include support for subdivision surfaces; 8-bit, 16-bit, and floating-point image data; metadata; and the ability to store an arbitrary number of textures (diffuse color, bump, normal, specularity, etc.) within one .ptx file. It's cruft-proof on so many levels.
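The per-face idea can be sketched in a few lines. This is a hypothetical toy model of the concept, not the real Ptex API: each face owns its own small texel grid, keyed by face index, so there is no shared UV atlas to unwrap, and each face can carry its own resolution independently of its neighbors.

```python
# Hypothetical sketch of per-face texture storage (not the real Ptex API):
# each face owns its own texel grid, so no UV unwrap is needed and every
# face can be stored at its own resolution.

class PerFaceTexture:
    def __init__(self):
        self.faces = {}  # face_id -> 2D list of texels

    def set_face(self, face_id, res, value):
        """Allocate a res x res grid for one face, filled with `value`."""
        self.faces[face_id] = [[value] * res for _ in range(res)]

    def sample(self, face_id, u, v):
        """Look up a texel using coordinates local to a single face."""
        grid = self.faces[face_id]
        res = len(grid)
        x = min(int(u * res), res - 1)
        y = min(int(v * res), res - 1)
        return grid[y][x]

tex = PerFaceTexture()
tex.set_face(0, 4, "red")    # a coarse 4x4 face
tex.set_face(1, 64, "blue")  # a neighboring face at much higher resolution
print(tex.sample(0, 0.5, 0.5))  # "red"
```

The real format also stores adjacency information between faces so the renderer can filter across edges, which is where the seamlessness comes from; this sketch only shows the storage idea.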

After Ptex was made open source last year, it was only a matter of time before it made it into shipping packages. Here at SIGGRAPH 2010, it's clear that time has finally come. Support came first in Pixar's PRMan 15, then 3D-Coat added Ptex a couple of months ago, and Houdini 11 added Ptex support with yesterday's release. On Wednesday, Pixar and Autodesk reps were also showing demos of a Mudbox alpha with Ptex. The Foundry's Mari (the texture painting app that was used for Avatar) also has some Ptex integration in Mari 1.0.

The Foundry at NVIDIA's booth showing off Mari, the mother of all 3D painters.

Considering that Mudbox is an Autodesk product, it's fair to say that the next release of its animation/rendering applications will support Ptex too. It can't come too soon and I can't wait for the day we can finally say "in the olden days, we used these things called YOOOOUUUU VEEEES."

Still not sure what a UV actually IS, but it all sounds cool and I look forward to the awesome animations that will come from this.

UV refers to texture coordinates: how a texture is spread over the surface of a 3D model. Texture coordinates are what, for example, make the face portion of a space marine's skin appear on the part of the 3D model that corresponds to the front of his head, rather than the back of it or his chest or crotch or foot or whatever. It's relatively easy to place the texture on something simple like a box or a sphere, but it quickly gets very complicated when dealing with objects with creases, ridges, protrusions, fine details, etc. Imagine having to giftwrap the dinosaur in the article to get some idea of the challenges involved.

I've been really happy with the Ptex support that came to 3D-Coat in January. It's really nice to be painting an area, realize you need higher resolution just in that spot, and then increase the resolution on just the needed polygons and continue painting. With current UV maps you're pretty much screwed if you need more resolution in one spot while painting. It's also really nice that it exports to any 3D app, not just the ones that have Ptex support.

Mari doesn't have PTex in 1.0 (which was released last week), but it will appear in a later version. The Foundry just licensed Disney's Paint 3D system, where Disney initially developed PTex painting, and will be incorporating that into Mari. See....

Houdini has always been awesome; it's just that, due to its pretty hefty price and its nonsensical UI and workflow in the early days, it doesn't get as much attention as the "big 3". But if you want power, great support, and a very responsive company, look no further than SideFX and Houdini. It's truly in a league of its own.

Ptex won't work for games. I don't understand all of the details, but it has to do with each poly getting its own UV map, which means quadruple the number of verts for each object. Even suggesting it on the Unity forum had people talking to me like I was crazy.

Is it possible to generate a normal UV map off of a Ptex? If so, the artist could do all their work in Ptex and then just export to UV for game engine purposes once complete, unless I'm missing something.

3D-Coat will export Ptex as UV maps or it can be baked to a new UV map. I don't know how the others plan to do it.
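Conceptually, that kind of bake can be sketched as packing each face's tile into one conventional atlas image and recording where each face landed. This is a hypothetical illustration (equal-sized tiles laid out in a row, all names invented), not how 3D-Coat actually does it:

```python
# Hypothetical sketch of "baking" per-face tiles into one conventional UV
# atlas: lay equal-sized face tiles out in a row and record the horizontal
# UV span where each face landed, so a UV-only engine can find its texels.

def bake_to_atlas(face_tiles):
    """face_tiles: list of res x res grids (all the same res).
    Returns (atlas, uv_rects) where uv_rects[i] = (u0, u1) span of face i."""
    res = len(face_tiles[0])
    n = len(face_tiles)
    atlas = [[None] * (res * n) for _ in range(res)]
    uv_rects = []
    for i, tile in enumerate(face_tiles):
        for y in range(res):
            for x in range(res):
                atlas[y][i * res + x] = tile[y][x]
        uv_rects.append((i / n, (i + 1) / n))  # horizontal UV span of face i
    return atlas, uv_rects

tiles = [[["A"] * 2 for _ in range(2)], [["B"] * 2 for _ in range(2)]]
atlas, rects = bake_to_atlas(tiles)
print(atlas[0])  # ['A', 'A', 'B', 'B']
print(rects)     # [(0.0, 0.5), (0.5, 1.0)]
```

A real baker would have to handle per-face resolutions, pack tiles in two dimensions, and add edge padding to avoid seams at tile borders; the sketch only shows why the conversion back to UVs is possible in principle.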

If that's the case then I suppose Ptex support in the game engine itself is more or less unnecessary no? Or are there things PTex can represent that can't accurately translate back to a normal UV?

The way I understand it, it would work, but it would not be practical because having so many vertices would slow down the game's performance. I'm not a real expert on game making, so I'm just going by what others have said.

Anything Ptex represents is the same thing you'd see in a normal UV map.

It isn't as impressive as it's being made out to be, and I'm pretty sure it'll be more frustrating than helpful. Imagine if you change a model's geometry but you liked an earlier version's texture. There's no way to draw from an older Ptex for the updated model because texels are linked statically, and there's no mention of a way to figure out model differences of subdivision surfaces to filter out defunct texels.

With UVs, you could merge the two based on map differences, either manually or using the numerous tools available.

So, thanks but I'll stick to Blender's vertex painter for now to color models and create displacement maps.

UVs are still going to be the way to go for games for the foreseeable future. This is really just an end-user thing, but it has a lot of appeal: a program like Photoshop could use Ptex to keep the hassle of dealing with UVs away from customers who just want to paint on a bottle for a product shot, without having to learn why their texture is stretching all weird or why they'd need to reparameterize their UVs. It's going to have huge implications for user-friendly 3D.

I'm only familiar enough with all this stuff to understand what everyone is talking about, I don't model myself, but your comment sounds like people complaining that GUIs are useless because the command line is more powerful.

Sorry, I'm not sure (there's no booth here at SIGGRAPH for Blender). I know they are hard at work on Blender 2.5, and this wasn't slated for that release as far as I know. It will likely come in a point update after 2.5 is released in a few months.

UVs and (manual) skinning - the bane of my (admittedly limited) 3D experience. Anything that makes either or both easier, gets a vote with my wallet or attention.

I can't imagine having to skin characters before animating them, that would drive me nuts.

And that is exactly what I have to do every single time when dealing with importing/exporting video game models with **** in game tools. Freaking nuts. Seriously, most "tools" are random bits of software code sloppily slapped on by game developers and released as a "product", or (most of the time) there are absolutely no tools, and third-party people have to reverse-engineer the file format from scratch.

Still, one of my favorite things about AT is reading others geek out in very specific fields. It's like seeing lights pop up in colors I wasn't even aware existed (if that makes any sense).

Picture it as being the difference between wearing a carefully-tailored suit and going naked but painting your body with fingerpaints (note: not trying to make one sound better or worse, there. Naked-but-painted can be a better option in many situations.) To fit a suit to someone so the fabric doesn't bunch, stretch, or have its pattern run the wrong way takes a fair amount of effort ... it can look really good, but you have to know how to sew and what the fabric's doing. Slinging paint at your skin, on the other hand, doesn't take much effort - if you want your nipple blue, you poke at it with blue paint on your finger.

For this situation, there's rather a lot of heavy lifting going on behind the scenes to make it possible for you to say "That bit right there? Make it blue." without having to do the polygonal equivalent of custom-tailoring a suit.

Interesting. In regard to games, peak performance isn't everything; artist productivity and image quality (IQ) are important as well. Models are sometimes tweaked for the purpose of UVs, and with a move toward displacement mapping the overhead may be worth it to prevent all sorts of artifacts. Great IQ on a 15K-poly model with displacement mapping will look better than a 60K-poly model with all sorts of issues. You see diminishing returns anyhow, so the issue is getting detail into the model that looks clean. Then again, right now so many games run at such low resolutions and have significant shader aliasing (especially specular) that cleaning up the current assets would go a long way toward improving IQ.