Re: GLSL noise fail? Not necessarily!

Your benchmark result is 25% lower than mine on the same hardware and software (MacBook Pro, GF9400M, Mac OS X, 1280x800 fullscreen). Did you run the demo on a single screen? Mirroring, or just having a second display active, tends to slow down the display subsystem rather a lot on Mac OS X. An earlier post on page 4 of this thread contains my results.

Re: GLSL noise fail? Not necessarily!

Yes, I ran it on a single screen, same resolution, same graphics chip. The OS is Mac OS X Snow Leopard, but my MacBook is not the "Pro" version, so perhaps there are slight differences in CPU speed. I will try again, making sure there are no background applications that could slow down the system.

Re: GLSL noise fail? Not necessarily!

For some reason, that "Fragment shader compile error:" message shows up on many platforms, even though the info log you get when you ask what went wrong is an empty string. On some systems I have tried, the "error" reported is even "Shader successfully compiled", so the notion of when to signal an error seems to be rather hazy in many GLSL compilers.

Re: GLSL noise fail? Not necessarily!

I have not read the recent source code of the program, so this is only a suggestion: the program should not rely on the info log to detect shader compile failure. It should rely on the compile status instead.
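A minimal C sketch of that approach (illustrative names, not the demo's actual code; assumes a current OpenGL 2.0+ context, with the entry points provided by e.g. GLEW on platforms that need it): query GL_COMPILE_STATUS to decide success or failure, and treat the info log purely as diagnostic text.

#include <stdio.h>
#include <stdlib.h>
#ifdef __APPLE__
#include <OpenGL/gl.h>
#else
#include <GL/gl.h>
#endif

/* Returns 1 if the shader object compiled successfully, 0 otherwise.
   The info log is printed only as a diagnostic; the pass/fail decision
   is made from GL_COMPILE_STATUS alone. */
int shaderCompiledOK(GLuint shader) {
    GLint status = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &status);
    if (status != GL_TRUE) {
        GLint len = 0;
        glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &len); /* includes '\0' */
        if (len > 1) {
            char *log = (char *)malloc((size_t)len);
            glGetShaderInfoLog(shader, len, NULL, log);
            fprintf(stderr, "Shader compile error:\n%s\n", log);
            free(log);
        }
    }
    return status == GL_TRUE;
}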

Re: GLSL noise fail? Not necessarily!

That 0->0 mapping is not a problem. It is perfectly all right for a permutation to have one or even several fixed points (values that map to themselves), as long as they do not occur in too regular a pattern.

The permutation is a permutation polynomial: permute(x) is computed as (34*x^2 + x) mod 289. This is one of the two neat and original ideas in Ian's implementation (the other is the clever generation of gradients). You can read about permutation polynomials on Wikipedia; the idea is not new in mathematics, it is just new to this application. A proper journal article on this noise implementation is on its way, but please have patience.
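For reference, this is what that polynomial looks like in GLSL (a sketch of the general form; the actual implementation may use a different vector width or arrangement of the constants). Note that permute(0) = 0, which is exactly the fixed point discussed above.

// Permutation polynomial (34*x^2 + x) mod 289, written in Horner
// form, (34*x + 1)*x, to save one multiplication. Operates on a
// whole vector of lattice coordinates at once.
vec3 permute(vec3 x) {
    return mod((34.0 * x + 1.0) * x, 289.0);
}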

Re: GLSL noise fail? Not necessarily!

The 2D simplex noise was just optimized some more. I replaced a division with a multiplication and removed one multiplication and one addition by introducing two more constants. The speedup I see on my system (ATI HD4850) is about 5%.
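To illustrate the first of those rewrites (hypothetical code, not the actual simplex noise source; the constant 7.0 is made up): a division by a constant can be replaced with a multiplication by its reciprocal, which the compiler folds into a single constant.

// Before: one division per component.
vec3 scale_div(vec3 x) { return x / 7.0; }
// After: 1.0/7.0 folds to a compile-time constant multiply.
vec3 scale_mul(vec3 x) { return x * (1.0 / 7.0); }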

The level of hand-feeding you need to do to optimize GLSL code reminds me of C compilers from the early 1990s.

Re: GLSL noise fail? Not necessarily!

...The speedup I see on my system (ATI HD4850) is about 5%. The level of hand-feeding you need to do to optimize GLSL code reminds me of C compilers from the early 1990s.

I'm curious whether this was your general GLSL experience across ATI, NVidia, and Intel drivers, or just with the ATI drivers in particular.

With NVidia, I've been amazed at how much complexity and infrastructure you can stack on top, and yet how aggressively and effectively the compiler throws away what is unused and transforms the code into something very efficient.