Method to distort coordinates according to fake velocity field?

I'll preface this by saying I have a basic understanding of OSL and a solid understanding of vector math.

I am attempting to create a turbulent fluid texture that I will use for gas giant planet atmospheres (such as Jupiter's cloud belts and zones) and other fluid-y uses. I found a method implemented as part of a game. Here are some example images: http://imgur.com/a/9LipP

Cameron's texture generator is written in some combination of OpenGL and/or C++. I'm not familiar with either of those languages, but I understand the math concepts behind Bridson's method.

I'm trying to implement this technique in OSL for use in Blender. I have created a procedural velocity vector field based on Bridson's method and a function to generate a distribution of particles to move in the texture space according to the velocity field. I currently visualize these particles as white dots with user-defined radii.

I am looking for some guidance in OSL programming at this point in my project.

In a test script I've written, I am able to modify the positions of the particles according to a simple function like sin(x) which introduces a regular wave pattern in the particles' positions. However, when I use the same function to sample my procedural velocity field (vx,vy) the particles are distorted in shape and do not seem to trace the local velocity at each particle position. Instead, I am getting a disconnected sequence of irregular spots.

I've read Michel Anders' excellent book on OSL and the OSL documentation. I find no reference to sampling a point-like variable at a specific x,y. I need a function call something like VelocityField(particle_x, particle_y).

I fear this type of call isn't possible in OSL, but I hope there is some creative way to accomplish the same thing. As I understand it, the sampling of the texture is left up to the renderer, so the specific coordinates within the texture aren't known until the renderer executes the shader. But, as I said, I'm still learning the "OSL" way of programming.

I'm attaching a zip file that contains my blend file, my osl file, a copy of Anders' "haltonsequence.h" file, and an image that illustrates the velocity-tracing effect I'm currently trying to achieve (as a step along the way to distorted texture coordinates).

Perhaps I'm approaching the idea of warping the texture coordinates naively. I'm certainly open to other methods that mimic Cameron's gas giants' appearance or any other method for achieving the look of our Solar System's gas giants.

Hi. I'm the Steve Cameron you're referring to. If your aim is just to make textures, there is a standalone program specifically for that, gaseous-giganticus, which is in my space-nerds-in-space repository on github. The textures aren't actually created in the game -- they're done separately, but the code to do them is in the same repo as the game. Here's a video showing how to use it (it is linux only though): https://www.youtube.com/watch?v=8nx5yPpQh2M

From what you're describing, I expect you want to encode your velocity field(s) as a texture, that is, take the 3d vector at each point in your (presumably) 2d velocity field(s) and encode them into the pixel values of the texture with rgb mapped to xyz. Then you can sample your texture to get your velocity field values. I would strongly suggest getting this working _outside_ the gpu first, as debugging gpu code is pretty tough. Porting this to the GPU is beyond my current abilities, and I wrote the damn thing.

Now one thing you might not have realized about my program is that I am not really just distorting the texture with the velocity field. Instead, there are a bunch (e.g. 8 million, typically) of particles whose colors are determined by some arbitrary mapping to some arbitrary image (typically a blurred vertical stripe of some earth tones works best). A particle's color never changes. Then those particles are turned loose in the velocity field and move around (slowly) while (and this is important) leaving a slowly fading alpha blended trail. Well, that is to say, each iteration of the thing is something like:

1. Move particles
2. Alpha blend each particle into an accumulated image with some opacity (0.5, or something).
3. Fade the entire image towards black, or towards some neutral color, a bit.
4. go to 1.

(I say "the image", there are actually 6 images making the cube map).

So it is not the case that you have a starting texture, apply a distortion via a velocity field in one step, and bam, get a nice texture out. No, you run a little simulation with particles roaming around according to the velocity field leaving alpha blended trails over time. That's what gets you a nice texture. And eats up a bunch of time. Also, the textures tend to start out looking like crap, then they get better and better for awhile, and then after awhile, they start looking weird and terrible. There's a sweet spot where they look nice. Probably not what you wanted to hear, but, that's how it is.

To get a velocity field on a flat texture is pretty easy (see my "curly vortex" project on github). To do it on a sphere, and to do it *seamlessly*, is a bit trickier.

I have 6 velocity fields in a cube-map arrangement, and the "curl of the noise gradient" as it were is computed by the following method:

For every point in my velocity field (each element of my 6 2D arrays) do the following:

1. Sample noise values in a 3d noise field at 6 axis-aligned offsets from the location represented by that cell, and determine the gradient in x, y, z. That is to say, sample noise at:
(x-dx,y,z) and at (x+dx,y,z); subtract, and that's the noise gradient in x.
(x,y-dy,z) and at (x,y+dy,z); subtract, and that's the noise gradient in y.
(x,y,z-dz) and at (x,y,z+dz); subtract, and that's the noise gradient in z.
where dx, dy, dz are some small epsilon value.
Combine those three components into a 3d vector, and that's your noise gradient.
2. Project that noise gradient vector onto a plane tangent to the sphere at that point.
3. Rotate the projection 90 degrees clockwise (or counterclockwise -- doesn't matter, just be consistent) around an axis passing through the center of the sphere and the point of interest (i.e., rotate 90 degrees within the plane tangent to the sphere at that point). Store this 90-degree-rotated vector in the cell in your velocity field -- that is your velocity field value (you might need to scale it a bit). I think in the slide deck I may have said to rotate one way if it's uphill and the other way if it's downhill, but this is not right -- just rotate one way and be consistent about which way.

The usual method of computing the curl as it is typically described in math books won't work, as it is axis aligned, and for each point on your sphere the tangent plane you need the curl in is different. That's not to say there isn't some (unknown to me) hotshot mathematician's way of computing the curl of a spherical gradient with respect to the surface of the sphere that is better (more efficient) than what I'm doing (which is some crude vector manipulation using quaternions). My way had the advantage of being comprehensible *to me*, which is a huge advantage. Debugging this kind of thing is pretty hard, because when you get something wrong, it's really hard to tell from the images that come out of it what went wrong -- only that *something* is wrong. I spent about a month chasing down a single bug, the symptom of which was that huge seams showed up at the edges of each of the six velocity fields.

Sorry to the OP if this doesn't belong here, but I have tried to use the standalone program gaseous-giganticus without any luck.
Do I have to compile the program, or does it work out of the box?

I am using Linux Mint 17.1, and if I try to compile the program the console gives me the following error: fatal error: png.h: missing file

You have to compile the program. There are some dependencies. For png.h, it's likely that you need "libpng12-dev":

apt-get install libpng12-dev

You can "apt-cache search blah" to search for what packages match "blah", then 'apt-get install blah" to install them.

I don't know of a nice way to go from ""blah.h" not found" to "apt-get install package-containing-blah.h", other than a bit of educated guessing about what the name of the package might be and searching with "apt-cache search".

You probably also want to do "apt-get install build-essential" to get the compiler and headers and so on installed, if you haven't already.

gaseous-giganticus has few dependencies; libpng may be the only sort of non-standard one (i.e. not part of libc). If you want to view what the program creates as a sphere (and not just as 6 square images) then you'll need some program to view them on a sphere. mesh_viewer is also in the space-nerds-in-space codebase -- it has a lot more dependencies than gaseous-giganticus does. The following is a (possibly incomplete) list of dependencies:

Some of those aren't strictly needed for mesh_viewer (e.g. openscad, stgit, libttspico-utils, probably)

So the 6 textures that gaseous-giganticus produces for a cubemap are arranged in a particular way (by a convention that I, uh, just made up). mesh_viewer is aware of this convention, but other programs, not so much. So using those textures with other programs may require some image manipulation to get things into the orientations such programs expect (i.e. rotations by multiples of 90 degrees in some way).
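Those orientation fixups boil down to rotating square images by quarter turns. As a minimal sketch (single-channel pixels and an invented function name; real cube-face data would be RGB or RGBA), a 90-degree clockwise rotation maps destination pixel (row, col) to source pixel (n-1-col, row):

```c
#include <assert.h>

/* Rotate an n x n image of single-channel pixels 90 degrees clockwise,
 * out of place: dst(row, col) = src(n-1-col, row). Illustrative only. */
static void rotate90_cw(const unsigned char *src, unsigned char *dst, int n)
{
    for (int r = 0; r < n; r++)
        for (int c = 0; c < n; c++)
            dst[r * n + c] = src[(n - 1 - c) * n + r];
}
```

Applying it twice gives a 180-degree rotation, three times a counterclockwise quarter turn, so this one routine covers all the multiple-of-90 cases.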

I have a mesh with triangular faces, and at the center of each face I have a speed value. My goal is to create water ripples consistent with the calculated speeds. The values are saved in a text file. The resulting texture will be used as a normal map. Can you give me some suggestions?
Thanks very much