Comments

I don't see why not; it's 'just' a shader after all, so it'll depend on what settings you use on each prop/model, etc.

Not an animation, of course, but I just had to throw some primitives in with the default Genesis and apply them using the default DT-Projection (I should've used one of the soil options underfoot instead of grass, but it's my first test)...

The shader is linked to the camera in regard to its texture location, so moving objects or moving the camera in an animation will make it look like the texture is moving (zooming the camera in shouldn't change anything). It can be a limitation, but it can be used for some cool stuff too: were you to make a heavily displaced rock golem and move him around, it would animate the rubble and make it dynamic.

This is a test I did during production, before I removed the Z axis from texture projection (which is why zooming doesn't change anything). Here I'm getting the effect by rotating around the figure, which doesn't happen anymore, but it would look pretty close to the same if I were moving the camera to the left with the character parented.
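To make that behavior concrete, here's a minimal toy sketch (my own code, not the actual shader) of what a camera-linked texture lookup amounts to: the lookup uses a point's position relative to the camera on X and Y only, with Z dropped, so depth changes and zooming leave the texture alone while lateral motion slides it.

```python
# Toy sketch of a camera-linked projection lookup (not the product's
# shader code). UV comes from camera-relative X/Y only; Z is dropped,
# which is why dollying/zooming doesn't move the texture but lateral
# camera or object motion does. Camera rotation is omitted for brevity.

def camera_space(point, cam_pos):
    """A world-space point's position relative to the camera origin."""
    return tuple(p - c for p, c in zip(point, cam_pos))

def projected_uv(point, cam_pos, tile=1.0):
    """Texture coordinate from camera-space X/Y; camera-space Z ignored."""
    x, y, _z = camera_space(point, cam_pos)
    return (x * tile, y * tile)

# Moving a point straight toward/away from the camera leaves its UV fixed:
uv_near = projected_uv((2.0, 1.0, 5.0), (0.0, 0.0, 0.0))
uv_far = projected_uv((2.0, 1.0, 50.0), (0.0, 0.0, 0.0))

# Moving the camera sideways shifts the UV, so the texture appears to slide:
uv_slid = projected_uv((2.0, 1.0, 5.0), (1.0, 0.0, 0.0))
```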

But if the Genesis figure lowers its arms, do the diagonal lines in the texture follow the arms, or will they keep running in the same direction?

The simple way to think of it is to use a slide projector and a man in a white suit as an analogy. If you have the man in white stand in front of a white screen and project an image onto both of them, the textures in the image will appear to wrap around the man's shape and then continue on the screen behind him. However, if the man moves his arms, the projected textures will remain fixed in place relative to the background, so they will appear to slide around and up and down the arms as he moves. (And needless to say, that same sliding motion will occur even if you replace the white background with a black one, which is the closest quick equivalent I could think of for projecting the image on just the figure and not the rest of a scene.) If you look at the product illustrations where more than one item has the same texture applied, you can see that all lines and textures in the projected image on the figures follow in the same basic pattern as the background figure. Great for blending two separate figures into a single one, as DT suggests, but not for animation under most circumstances.

Now, I say MOST because what ISN'T clear is how tightly the projection surface is locked to a specific plane and perspective. If one can lock them selectively, this technique could come in VERY handy for projecting images onto a set of primitives that roughly approximate the contours of the original image to create quick virtual sets, so that a 3D figure could "stand on" or "touch" items in the original image plate, or a slight pan or dolly effect, with perspective shifts, could be introduced on what was originally a flat image. (In visual effects terms, this is called a 2.5D matte painting, and being able to do one inside DAZ Studio would be pretty snazzy.)

That said, can you comment on my thought about using it for 2.5D? It looks like building a virtual set out of primitives would work as long as there was no Left-right/up-down camera movement, but is there a way to "lock" the direction that the projection is theoretically coming from in order to do a slight repositioning of the image?

If you look at the product illustrations where more than one item has the same texture applied, you can see that all lines and textures in the projected image on the figures follow in the same basic pattern as the background figure.

Materials that blend together in the promos I've done need to have the same settings for that to work (not just the same texture). Were you to use different tiling, diffuse color, or translation settings, etc., for objects with the same texture, they won't blend together.

I'd thought about the idea of clamping and I'm not sure it's possible (unless it's just over my head, which it could be). It's something I would have liked to have a switch for, best-case scenario. I'll certainly keep it in mind though, because it's an effect I'd like as well. There may be other ways to do what you want too; setting the shader up to work with Shader Baker would mean you could export the textures to UV, but that wouldn't be so dynamic.

That said, can you comment on my thought about using it for 2.5D? It looks like building a virtual set out of primitives would work as long as there was no Left-right/up-down camera movement, but is there a way to “lock” the direction that the projection is theoretically coming from in order to do a slight repositioning of the image?

I'm not quite able to picture the setup you have in mind with the 2.5D: what would be getting animated in that situation? DAZ Studio seems to be able to animate certain shader values (though confusingly). If you were able to freely keyframe the values used for the texture's translation, you should be able to compensate for any camera movement. Moving the camera up/right/left/down would just need opposite movement in the shader; rotation around an object should even be possible with X and Y making the image plane parallel, but I'm not sure what that would look like detail-wise.
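The compensation arithmetic can be sketched like this (hypothetical parameter names, my own toy code, not the shader's actual controls): a lateral camera move shifts the projected lookup, and keyframing the texture translation to follow the camera cancels it. Depending on the shader's sign convention, the real control may need the opposite sign.

```python
# Toy arithmetic for camera-move compensation (hypothetical names, not
# the shader's actual controls). With this sign convention the texture
# translation must track the camera displacement exactly to hold the
# texture still on the geometry.

def projected_uv(px, py, cam_x, cam_y, off_x=0.0, off_y=0.0):
    """Camera-relative X/Y lookup plus a keyframable texture translation."""
    return (px - cam_x + off_x, py - cam_y + off_y)

base = projected_uv(2.0, 1.0, 0.0, 0.0)

# The camera trucks 0.5 units right; keyframe the translation to match,
# and the texture stays put on the geometry:
compensated = projected_uv(2.0, 1.0, 0.5, 0.0, off_x=0.5)
```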

Well, it can all be done pretty easily in After Effects, and to a certain extent a lot of it can be done with billboards. However, there are certain advantages to doing it in Studio, the biggest of which would be that you could set up primitives in the shape of an object in the image plate, animate your figure interacting with it, then go back and re-run the pass with the shader replaced with a base white or neutral shade. Presto: a perfectly mapped shadow pass to add in AE. Or one could take a completely two-dimensional image like a painting or still photo, and add a 3D door that a character could open and walk through, or introduce a subtle perspective shift to it as the camera dollies in, all in one pass.

Ouch. Sounds like something that would be easier to do in a program that has motion tracking... What I was thinking of was doing something similar to this, http://www.youtube.com/watch?v=2xSvdbcEWr8 but with far less extreme camera moves. Maybe a shift of ten to fifteen degrees laterally or vertically.

Yeah, that example is basically what I would want clamping for myself, and I believe that baking would handle that pretty well (his textures are obviously baked, since they're showing in 3D view as he rotates). I'd imagine this being hard to pull off because of other limitations though; there'd be no way to see how the image lines up on your mesh outside of rendering, so placement of your primitives would involve a whole lot of test rendering without a guide in 3D view. It's pretty far out of scope for what I wanted this particular product to be (it's a set of shaders at its core, which was originally intended just for UberSurface). I very much would like to get something going like you show, but it will take a lot of time in a different direction. My interest comes from wanting to mask out parts of my HDRI panoramas so that trees, etc., cast accurate shadows and objects can pass behind them.
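For what baking would conceptually do here, a hedged toy sketch (none of these names come from the product, and Shader Baker's real pipeline is not shown): every texel of the target UV map is traced back to its surface point, pushed through the projection, and filled from the source image, freezing the projection into an ordinary UV texture.

```python
# Toy sketch of baking a projection down to a UV texture (illustrative
# only). For each texel, find the surface point it covers, project that
# point, and sample the source image. A real baker would rasterize the
# mesh in UV space; here the mesh is a flat plane so the mapping is
# trivial.

def bake(uv_to_point, project, sample, size):
    """Return a size x size grid of baked texel values."""
    grid = []
    for v in range(size):
        row = []
        for u in range(size):
            point = uv_to_point(u / size, v / size)  # texel -> surface point
            row.append(sample(*project(point)))      # project -> source pixel
        grid.append(row)
    return grid

plane = lambda u, v: (u, v, 5.0)                      # depth is irrelevant
project = lambda p: (p[0], p[1])                      # drop Z, per the projection idea
checker = lambda x, y: (int(x * 2) + int(y * 2)) % 2  # 2x2 checker "image"

baked = bake(plane, project, checker, 4)  # a frozen copy of the projection
```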

Let me put together a few clips showing how the translation and animation stuff actually behaves so it's obvious what's going on.

I get the general idea of projection mapping and how it ignores the UV map, but the promo examples show the texture (including displacement) wrapping around the curves of the surface, rather than smearing across them. How is it doing that? Will it work with shadows and ambient occlusion?

(It is extremely cool-- I'm a bit overspent at the moment, though....)

Thank you!

The textures are coming from the camera, but things like displacement and bump still rely on the surface geometry for their direction. So it's not displacing towards the camera; it's getting displaced along the surface normals, like normally mapped textures would be. It works fine with shadows and AO as well; this, along with specular etc., is part of what keeps it taking the shape of the geometry rather than just showing up as an image overlay.
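In other words (a toy sketch of the distinction, not the shader's code): the height value comes from the projected texture, but the push direction is the surface normal, exactly as with UV-mapped displacement.

```python
# Toy illustration: displacement direction is the surface normal, not
# the camera axis. The same projected height moves differently oriented
# surface points in different directions.

def displace(point, normal, height):
    """Push a surface point outward along its own (unit) normal."""
    return tuple(p + n * height for p, n in zip(point, normal))

top = displace((0.0, 1.0, 0.0), (0.0, 1.0, 0.0), 0.2)   # pushed upward
side = displace((1.0, 0.0, 0.0), (1.0, 0.0, 0.0), 0.2)  # pushed sideways
```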

That's a point of interest.

On inspection, it looks very useful for animating some atmospheric fakes as well, like simulated cloud movement.
(As I can't afford After Effects CS6, I generally use my antediluvian VUE 7.4 to do that, at the price of nerve-wracking, HUGE render times.)

So my question is:

--> Is there a way to add custom textures to the 107 ones you provide with the shader?

(I'm far from being comfortable yet with your SSS tool, so I beg your mercy: please more than a single week's delay between new shaders! LoL)

Good question, Richard. Here is a render showing how it behaves: the image is projected from the front and also projects onto the back of the sphere. The sides of the sphere are banded, however, as the projection is hitting those surfaces at an angle. Hopefully this is understandable; having it project from each direction to cover the bands would be nice, but I'm not sure how I'd go about it.
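The banding on the sides has a simple geometric reading (my own sketch of the math, not anything from the product): a texel projected along the camera axis covers a surface patch that stretches by roughly 1/|cos(theta)|, where theta is the angle between the surface normal and the projection direction, so the stretch blows up as the surface turns edge-on.

```python
import math

# Toy math for why grazing surfaces band: the footprint of a projected
# texel stretches by about 1/|cos(theta)|, where theta is the angle
# between the surface normal and the projection direction.

def stretch_factor(normal, proj_dir):
    """How much a projected texel stretches for a given (unit) normal."""
    cos_theta = abs(sum(n * d for n, d in zip(normal, proj_dir)))
    return float('inf') if cos_theta == 0.0 else 1.0 / cos_theta

z_axis = (0.0, 0.0, 1.0)
facing = stretch_factor((0.0, 0.0, 1.0), z_axis)   # head-on: no stretch
grazing = stretch_factor((1.0, 0.0, 0.0), z_axis)  # edge-on: unbounded
angled = stretch_factor((0.0, math.sin(math.pi / 3), math.cos(math.pi / 3)),
                        z_axis)                     # 60 degrees: about 2x
```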

--> Is there a way to add custom textures to the 107 ones you provide with the shader?

Yes, of course. There is a blank base shader included along with the texture presets that's basically the same as loading UberSurface. It has no preloaded textures, and its values are at defaults to give you a common starting point for adding your own stuff.

Well, I thought I was done asking questions for now, but after reading Richard's another just popped into my head:

Just to be sure, you CAN use more than one variation of this shader in the same scene, right? That is, use a rock texture on prop A and a leafy green vegetable texture on a separate prop. I couldn't help but notice that there aren't any obvious images where this is done in the samples.

I couldn't help but notice that there aren't any obvious images where this is done in the samples.

Sorry if the promos didn't show it well, but there are a few using multiple materials and instances. In the big/small scale renders, both figures use projection mapping with different textures. The stone pillar and tree in the chameleon render are different projection materials, as are the rocks and wood in the Treant render. If the materials in the promos aren't being projected, they're using the projection shader in UV mode.

Ah... this image explains a lot. The texture is smearing around the edge of the bend, but the image minimizes this because it's parallel to the camera (by definition).

The only way I can think of to get the same kind of UV-free functionality and avoid the banding problem (and actually, this would work for animations too, I think) would be to use Ptex instead. (Not that I'm sure I understand how Ptex works!)

So can you apply the UV-free version to part of an object, e.g. a mat zone, or does it have to be the whole contiguous mesh? What happens if there are unwelded parts? Do you have to make sure to select all of them? Or does it go by node?

This really is cool. Hm. I do get paid Friday.... Peanut butter for the rest of the month! :D

Imho "correctly" is the right word. Yes you can, for sure, but is that reasonable? If you have to render delicate and well-crafted reflections, be sure camera projection is not the most appropriate technique. This technique, like every other, is not a panacea. It is good for some projects, not for others.

Camera projection was used and overused by professionals to obtain acceptable results at the lowest cost for movies and games, and though it's been replaced today by sharper techniques (parallax object modeling, vertex texture tessellation, etc.), it returns great results in several cases while consuming few resources in terms of composition and rendering.

For DAZ Studio users, camera projection should be perfect for grounds, rocks, hills, trees, walls, street views, and some specific objects. Absolutely not for Victoria.

Imagine you want to build a glorious Spartacus scene on an old, decayed Via Romana-looking paved road. How many millions of polys would you have to model? How many hours to render them? Just forget it.
Instead, you make a quick spline object in Carrara and then "project" onto it a touristy photo of the Via Appia, pre-worked (so to speak) in Photoshop, and within half an hour you get an "acceptable" 2.5D result, saving a lot of time to spend on your most important Freak5 gladiator's pose and realism.

Just remember the celebrity of DAZ's early-days "Millennium Environment".

And, to come back to reflections, let's not forget there is somehow a brotherhood with camera projection. In some rare situations, I remember having used reflections on primitives to resolve some tiling and mapping problems with behind-the-camera landscapes and objects.

That being said, I can't wait to download DT's shader next Friday to test it this weekend, in the hope he will not invent yet another shader next week, lol.

The only way I can think of to get the same kind of UV-free functionality and avoid the banding problem (and actually, this would work for animations, too, I think) would be to use Ptex instead. (Not that I'm sure I understand how Ptext works!)

So can you apply the UV-free version to part of an object, e.g. a mat zone, or does it have to be the whole contiguous mesh? What happens if there are unwelded parts? Do you have to make sure to select all of them? Or does it go by node?

I don't know that Ptex itself would help in this situation. As far as I know, Ptex isn't restricted by UVs because it breaks UV space down to individual polys or smoothing groups. It'd really help retain detail while baking, because it's essentially spitting out parts of textures it stitches like a panorama at render time, but I'd have the same trouble actually getting the texture onto the mesh. So it's how the texture is being applied that's causing the banding, rather than how it's stored. The same sort of thing happens when you paint on perpendicular surfaces in programs like ZBrush.

The effect I'd be going for to get rid of the bands is called Cube Mapping in 3DCoat, which is what I use; I'm sure ZBrush has it, but I don't know what it would be called there. Essentially it would be this same effect in three dimensions, projecting on the X/Y/Z axes instead of just Z (coming from above/below and left/right of the camera as well as forward/backward, which it does now). Adding transparency to the banded areas and then layering that all together should look proper; you could multiply alpha through diffuse to get automatic variance so it's not fading perfectly.
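The three-axis idea described here is commonly called triplanar mapping. A minimal sketch of the blend (my own code, not 3DCoat's or this shader's) samples the texture along each axis and weights the three results by how squarely that axis faces the surface normal, so grazing projections fade out instead of banding:

```python
# Minimal triplanar ("cube mapping") blend sketch: three axis-aligned
# projections weighted by the surface normal, so each projection fades
# out where it would hit the surface at a grazing angle.

def triplanar(point, normal, sample):
    """Blend sample(u, v) from X-, Y-, and Z-axis projections."""
    x, y, z = point
    wx, wy, wz = (abs(n) for n in normal)
    total = wx + wy + wz                  # normalize so weights sum to 1
    wx, wy, wz = wx / total, wy / total, wz / total
    return (sample(y, z) * wx    # projection along X
            + sample(x, z) * wy  # projection along Y
            + sample(x, y) * wz) # projection along Z (the camera-style one)

# A surface facing straight down Z uses only the Z projection:
flat = triplanar((0.25, 0.5, 0.0), (0.0, 0.0, 1.0), lambda u, v: u + v)

# A 45-degree surface blends two projections evenly:
tilted = triplanar((0.25, 0.5, 0.0), (0.0, 0.7071, 0.7071), lambda u, v: u + v)
```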

If you're familiar with the EYEris shader, it has a specular flare effect that offsets highlights from their original position; under the hood, changing the value on X would make it move left/right regardless of where the light is. It might be possible to pull the texture three times and put two of them through a 90-degree offset on X and Y to get them on the sides before adding them together. Sounds like a pain in the ass, but feasible.

As for material zones, there shouldn't be any issue applying these to some and not others. Unwelded parts shouldn't cause much issue, the same or less than with UV mapping. I'm assuming that since the texture is universal, the displacement would move the separate parts together rather than splitting them. When applying the presets, you select which material zones you want them on first, much the same as using UberSurface.

EDIT:

Actually, now that I think about it, the 3D fractals I used were originally completely unwelded on each poly, and I had to figure out how to get them together. That was only because of the faceting effect you get from lighting on un-smoothable surfaces, though; nothing to do with the shader, I don't believe.

Here is an animation showing some behaviors of the shader. I switch from AO only to mostly directional lighting a couple seconds in to show how shadows help divide like surfaces. Hopefully this is a good display of how it works. Remember that any of the 107 materials can be used in UV mapping mode as well, either with this shader or with UberSurface (so you're not stuck with the effect).

Thanks, I thought that was likely to be the result of reflection - but as noted, it should often be possible to work around it or disguise it. Another option might be to use the sight vector as the projection axis, but then a reflection, although not smeared, would be different from the direct view which might be obvious in some cases and not in others.

I would like to ask about the grass shader in this product. In the first promo shot, is that the grass shader on the ground? Would love to see it if not. Thinking I will get this, but the grass is what could really sell me on it.

Well, you're in luck, because I just finished uploading a video test using some grass, lol. And yes, the main image uses one of the grass presets for the ground.

I did this short clip to show a couple more concepts. Here I'm essentially using meshes as masks for the shader. There's a half sphere under the ground plane, parented to the one above, which I've applied the grass shader to. Since the shader isn't stuck to UVs, moving these objects doesn't move the texture, so the grass texture is stationary and only its shown area changes.

I'm also showing how texture rotation works on the ground plane; that checker texture is actually projected. Rotating the texture backward makes the shader apply textures at an angle, letting me compensate for perspective by decreasing the pattern size in the distance. I have the same rotation on the grass shader to make sure its detail adjusts as it moves closer to the camera.
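The perspective compensation being described boils down to simple math (a toy illustration under a pinhole-camera assumption, not the shader's internals): a ground tile of fixed world size appears roughly size/distance on screen, so shrinking the projected pattern with distance makes a flat projection read as a receding plane.

```python
# Toy illustration of the perspective compensation: a tile of fixed
# world size looks smaller on screen the farther away it is (about
# size / distance under a pinhole model), so the projected pattern
# should shrink with distance to read as a receding ground plane.

def apparent_size(world_size, distance):
    """Approximate on-screen size of a tile at a given depth."""
    return world_size / distance

near = apparent_size(1.0, 2.0)   # close tile: looks big
far = apparent_size(1.0, 10.0)  # distant tile: should be drawn smaller
```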

Is there a Directions for Dummies anywhere for this product? I can't seem to adjust the tiling. X Tile, Y Tile, and Z Scale don't seem to do anything when I move the slider up or down... I'm certain it's user error (it almost always is); nevertheless, I need help!

Can you provide a bit more info on what you're doing? The first thing that comes to mind is making sure you're in Projection mode; UV mode and Projection mode have separate tiling sliders. It's also something that's only shown in renders; it won't do anything to what you see in the 3D view (this is a restriction of how shaders and the 3D view talk to each other).

I'm currently working with someone on Q&A-type documentation to add; hopefully that will help clear up some confusion :)

Well, I thought I'd start with something simple, so I applied the shader to a plane. I suppose I should have realized I would only see tiling changes in a render. That said, I am quite clueless about how to make sure I am in projection mode vs. UV mode...

Hi, I just bought this set, and every time I apply a shader it comes out very flat, even on hilly terrain, with no shadows or anything.
I can't find any documentation on the product or on how to adjust the shader for acceptable results.