Cool uses for vertex/pixel shader?

What are some cool applications of Vertex or Pixel shaders that you've seen?

I recently followed some advice given to me on this forum and made a particle system using vertex and pixel shaders. The results were incredible. My particles moved on a 2D plane and bounced around inside a bounding box. They had velocity and drag, and their alpha and size were controlled by how fast they were moving.
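For anyone curious what the per-particle step looks like, here's a rough Python sketch of the logic the pixel shader runs for each texel of the particle state texture (integrate, apply drag, bounce off the box, derive alpha/size from speed). The constants and names are just illustrative, not from my actual shader:

```python
DRAG = 0.98                      # velocity damping per step (illustrative)
BOX = (0.0, 0.0, 100.0, 100.0)   # bounding box: min_x, min_y, max_x, max_y

def update_particle(x, y, vx, vy, dt):
    """Advance one particle: apply drag, integrate, bounce off the box walls."""
    vx *= DRAG
    vy *= DRAG
    x += vx * dt
    y += vy * dt
    # Reflect velocity and clamp position when a wall is crossed.
    if x < BOX[0] or x > BOX[2]:
        vx = -vx
        x = min(max(x, BOX[0]), BOX[2])
    if y < BOX[1] or y > BOX[3]:
        vy = -vy
        y = min(max(y, BOX[1]), BOX[3])
    # Alpha and size scale with speed, as described above.
    speed = (vx * vx + vy * vy) ** 0.5
    alpha = min(1.0, speed / 10.0)
    size = 1.0 + speed * 0.1
    return x, y, vx, vy, alpha, size
```

On the GPU, this runs once per particle per frame with positions and velocities packed into render targets.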

In "CPU-land," with simple sprites and SpriteBatch, I could maintain about 10,000 particles on screen before the frame rate dropped below 30 fps (CPU at 100%). A decent amount. But with the GPU, I could get somewhere around 13 MILLION particles on screen without dropping below 30 fps. I'll make a video and a small writeup if anyone is interested. I used a few different tutorials on GPU particle systems to create my solution, so it's nothing too special.

So this got me thinking. What other applications are there that can take advantage of the fast and parallel nature of a GPU? Perhaps state machines for thousands of AI bots?

Re: Cool uses for vertex/pixel shader?

So this got me thinking. What other applications are there that can take advantage of the fast and parallel nature of a GPU? Perhaps state machines for thousands of AI bots?

AI bots are challenging, because you would have to read the data back to the CPU at some point in order for it to actually influence gameplay. Reading back data from GPU to CPU is tough to do without stalling the pipeline, which loses the parallelism that made the GPU fast in the first place.

The most successful GPU techniques tend to be those with a one directional data flow from CPU to GPU (maybe persisting information in rendertargets across multiple frames once it is on the GPU), and then from GPU to the screen, without ever flowing back to the CPU.
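That "persisting information in rendertargets" part is usually done with ping-pong buffering: keep two state textures, sample last frame's texture while writing this frame's, then swap, so the state never leaves the GPU. A minimal Python sketch of the pattern (the `step` function stands in for a full-screen shader pass):

```python
def step(state):
    """Stand-in for a full-screen pixel-shader pass (here: simple decay)."""
    return [v * 0.5 for v in state]

def simulate(initial, frames):
    """Ping-pong between two buffers: read one, write the other, swap."""
    read, write = list(initial), [0.0] * len(initial)
    for _ in range(frames):
        write = step(read)         # render pass: sample `read`, fill `write`
        read, write = write, read  # swap targets for the next frame
    return read
```

The key property is that `read` and `write` are never the same buffer within a pass, which is exactly the constraint the GPU imposes (you can't sample a render target you're currently writing to).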

Some examples:

Rippling water (run a physics simulation on the GPU)

Hair or cloth animation

Raindrops running down a window pane (use a cellular automaton to calculate how they move, then generate opacity, normal, and refraction maps by running 2D filters over the raindrop position texture)

Vegetation: trees sway in the wind

Clutter - use a vertex shader simulation to animate huge numbers of non-gameplay objects (things you cannot collide with that are only included for visual interest), for instance falling leaves, rats that scurry around the floor in a dungeon, schools of fish in an underwater game...
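The clutter case is especially cheap because it can be entirely stateless: the vertex shader computes each object's position as a pure function of time plus a per-instance seed, so there's no simulation state to store at all. A hypothetical sketch for falling leaves (all constants invented for illustration):

```python
import math

def leaf_position(seed, t):
    """Position of one falling leaf at time t, derived only from its seed."""
    fall_speed = 1.0 + (seed % 7) * 0.3          # each leaf falls at its own rate
    sway_phase = seed * 1.618                    # decorrelate the side-to-side sway
    x = (seed % 100) + math.sin(t * 2.0 + sway_phase) * 0.5  # sway around a home column
    y = 50.0 - (t * fall_speed) % 50.0           # fall, then wrap back to the top
    return x, y
```

Because nothing is stored between frames, thousands of leaves cost only vertex-shader ALU work, and the CPU never touches them after the initial seed upload.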

Re: Cool uses for vertex/pixel shader?

I'd do the simulation part basically the same way as a GPU particle system. Then when you come to render, instead of just drawing point sprites, use a GPU instancing technique to draw each particle as a 3D model. If your particle system leaves the current state in a texture, your instancing shader can use vertex texture fetch to look up the current transform from the appropriate part of that texture.
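To make the idea concrete, here's a rough Python model of that instancing step: the simulation leaves one entry of state per particle in a texture, and the instancing vertex shader fetches its instance's transform from it and applies it to the model's vertices. The data layout is hypothetical, not any real graphics API:

```python
# State "texture": one (x, y, scale) entry per particle instance.
state_texture = [(10.0, 20.0, 1.5), (30.0, 40.0, 0.5)]

# A tiny model to instance: the four corners of a unit quad.
MODEL = [(-1, -1), (1, -1), (1, 1), (-1, 1)]

def instanced_vertices(texture, model):
    """Emit transformed vertices for every instance, as the GPU would."""
    out = []
    for (px, py, scale) in texture:      # vertex texture fetch, per instance
        for (mx, my) in model:           # model vertices, per instance
            out.append((px + mx * scale, py + my * scale))
    return out
```

On real hardware the outer loop is the instance count of the draw call and the fetch is a tex2Dlod in the vertex shader, so the CPU issues a single draw for all instances.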

Re: Cool uses for vertex/pixel shader?

Lots of great ideas. I'd love to take a whack at some more advanced physics simulations. I can totally see how a particle system could easily lead to a cloth sim.

I understand what you're saying about the one-way path from CPU to GPU. But perhaps it would be good for AI that the user doesn't interact with. For example: in the GTA games, the car/pedestrian AI around you only exists within a few blocks of your character. If they go too far, they essentially disappear. In GTA3 it was even worse, as cars would disappear if you looked away :P But essentially, the feeling of a "living city" only exists around your character, and you lose a little bit of consistency.

What if you used the GPU to keep track of a million pedestrians and cars on the streets? When they move out of your character's "interaction bubble", they are offloaded to the GPU to update and keep track of. When the player sees them again, those particular agents can be taken off the GPU and handled on the CPU. Although there is still that GPU -> CPU issue, it would be as minimal as possible. Perhaps a texture lookup every few seconds (depending on how fast the player moves).

In GPU mode, it could apply some basic movement patterns to the agents, keeping track of their positions and attributes (since the user won't see these anyway). Then, when loaded into CPU mode, the agents would do more complex things like signaling for turns, talking to other agents, etc.
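The handoff itself would just be a distance test against the bubble each readback interval. A toy sketch of the partitioning step (all names and the radius are made up for illustration; the "GPU set" would really live in a state texture):

```python
BUBBLE_RADIUS = 50.0  # hypothetical interaction-bubble radius

def partition_agents(agents, player_pos):
    """Split agents into a CPU set (inside the bubble) and a GPU set (outside)."""
    px, py = player_pos
    cpu, gpu = [], []
    for (x, y) in agents:
        if (x - px) ** 2 + (y - py) ** 2 <= BUBBLE_RADIUS ** 2:
            cpu.append((x, y))   # full simulation: turn signals, dialogue, etc.
        else:
            gpu.append((x, y))   # simple movement patterns, updated in a shader
    return cpu, gpu
```

Only the agents that cross the boundary since the last check would need to move between the two sets, which keeps the GPU -> CPU traffic to a small, infrequent readback.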

Of course, the value this brings to a player in a game like GTA would be up for question, but this is just a thought experiment :)