Simplex noise in GLSL

Because my original Perlin noise thread went a bit off-topic after 20 replies, here's a new thread announcing the very first (to my knowledge) implementation of Perlin simplex noise in a fragment shader.

Simplex noise is better looking AND faster, and a true derivative can be computed without much extra work. I kept the classic noise in there too for comparison. The code is also cleaned up and commented some more.

As before, I'm developing on really low end hardware, so please post your frame rates if you have anything better than my GeForce 5600XT.

I scrapped the teapot, that code was CPU limited and didn't do the shader justice. Just start the program, look at the rotating noise-textured sphere and tell me what the FPS counter in the title bar says.

On a side note, that reference implementation Java code from Ken Perlin was REALLY hard to understand! I ended up reading his paper instead and doing everything from scratch.

Re: Simplex noise in GLSL

I'm not currently in front of my GeForce 6800 GT, but I will run the tests later today. I'd like to comment on Nvidia implementing their GLSL compiler alongside their Cg compiler: if you develop on Nvidia hardware and then try it on an ATI card, you always find things you overlooked. Looking at the language spec, Nvidia does not have a tight implementation; ATI agrees more closely with the actual spec.

Maybe this is a bit pedantic, but shouldn't we lobby Nvidia for an implementation that sticks tighter to the spec, so that cross-platform development is easier for us developers?

The advantage for Nvidia developers is that some Cg features and names are included, but I think I would still prefer a stricter implementation.

Re: Simplex noise in GLSL

Originally posted by paintor: I'd like to comment on Nvidia implementing their GLSL compiler alongside their Cg compiler: if you develop on Nvidia hardware and then try it on an ATI card, you always find things you overlooked.

We're using NVemulate to force strict compiler warnings, so we're able to catch any spec violations. It's really helpful and does the job (no errors on ATI cards).

Re: Simplex noise in GLSL

Sorry about that silly ATI crash. It's very easy to forget that promotion from "0" to "0.0" does not happen automatically in GLSL when Nvidia's compiler silently accepts it. I for one would appreciate it if Nvidia tightened up their GLSL conformance, or at least made their non-standard extensions to GLSL opt-in, rather than something you have to go to great lengths to avoid.
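For illustration (a minimal made-up example, not a line from the actual shader): GLSL 1.10 has no implicit int-to-float conversion, so a strict compiler must reject an integer literal where a float is required, even though constructors may still convert:

```glsl
float a = 0;        // error on strict compilers (ATI, 3DLabs); Nvidia silently accepts it
float b = 0.0;      // correct: use a float literal
vec3  v = vec3(0);  // fine: vec3(int) is a valid constructor, which performs the conversion
```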

I have corrected the bug. Now it should work also on ATI and 3DLabs cards.

Re: Simplex noise in GLSL

"bobvodka", could you please try the 2D noise, both simplex and classic? Code to do this is in main() in the fragment shader, but commented out.

The lines starting with "n=" are, in order: 2D classic, 2D simplex, 3D classic, 3D simplex. Comment out the 3D simplex version and try the 2D variants instead. You should be able to run at least the 2D noise on the ATI 9800, because it now has no dependent texture lookups at all. (Note that the code changed a few minutes ago.) If this does not run in hardware either, please tell me.
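The selection block in main() presumably looks something like this (the function and variable names here are my assumptions, not identifiers copied from the actual shader; check your local copy and uncomment exactly one line):

```glsl
// Pick one noise variant -- comment out the rest:
//n = noise(texCoord2D);                // 2D classic
//n = snoise(texCoord2D);               // 2D simplex
//n = noise(vec3(texCoord3D, time));    // 3D classic
n = snoise(vec3(texCoord3D, time));     // 3D simplex (the default)
```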

(When you pick another noise function, you will get warnings that some uniform variables are no longer found, but that's OK: the 2D noise is not animated and does not use the "time" variable, and classic noise does not use the "simplexTexture" sampler1D.)