I'm pretty new to shaders, so I'm a little bit stuck. What I'm trying to do *should* be easy: get the current pixel and multiply it by another color to put a blue hue over the entire screen (not including the HUD). The end goal is to give everything a night-time feel.

It seems like every example I find uses the SpriteBatch class from LibGDX or Slick to handle passing texture data to the shader. I already have a large portion of my rendering code in place, and I'd like to find a way to do something like this:

That would be in my main game loop. I have a simple shader working now, but all it does is change every pixel to a single color. All of the individual sprites are drawn in each of those methods in that loop. Sadly, it doesn't seem possible to just read the "current" pixel color value when the fragment shader runs...

Is there an easy way to use shaders without rewriting all of my existing rendering code and over-complicating the systems I already have in place? Another option might be to apply a shader to the entire buffer before rendering the HUD? I don't know -- any and all suggestions are welcome. More code is available upon request; I didn't want to over-complicate things since I already have basic shaders working.

You don't "pass" a texture to a shader. You bind a texture and pass texture coordinates for each vertex. The vertex shader passes the texture coordinates down the pipeline to each fragment, with the value interpolated between the vertices. In the fragment shader, you then need to sample from the currently bound texture at unit N by using the texture coordinates for the given fragment.
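A minimal vertex/fragment pair along these lines might look like the sketch below (the names a_texCoord, v_texCoord, and u_texture are illustrative, not from the original code):

```glsl
// vertex shader: forward the per-vertex texture coordinate down the pipeline
attribute vec2 a_texCoord;
varying vec2 v_texCoord;

void main() {
    v_texCoord = a_texCoord;
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
```

```glsl
// fragment shader: sample the currently bound texture at the interpolated coordinate
uniform sampler2D u_texture;
varying vec2 v_texCoord;

void main() {
    gl_FragColor = texture2D(u_texture, v_texCoord);
}
```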

Firstly: ARB is old school; use GL20 instead. Secondly, your rendering code shouldn't need to change dramatically. You just pass a color for each vertex, along with position and texture coordinates. The vertex shader then passes the color attribute along to the fragment shader like so:


attribute vec4 a_color;
varying vec4 v_color;

void main() {
    // pass input color to frag shader
    v_color = a_color;

    ... output gl_Position and pass texcoords along ...
}

(If you're using newer GLSL, then it's "in" and "out" syntax)

Then, in the fragment shader, you multiply the sampled texture color by the vertex color (which in your case will be blue).
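A fragment shader doing that multiply could be sketched like this (assuming varyings named v_color and v_texCoord and a sampler uniform named u_texture; the names are illustrative):

```glsl
varying vec4 v_color;
varying vec2 v_texCoord;
uniform sampler2D u_texture;

void main() {
    // tint the sampled texel by the per-vertex color (blue for the night effect)
    gl_FragColor = v_color * texture2D(u_texture, v_texCoord);
}
```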

So the way I'm reading this... when you bind a texture to a quad, it should automatically be handed off to the vertex shader where it can, in turn, be processed by the fragment shader?

Do you need to manually modify any uniforms in the shaders for the texture or anything?

Great tutorials, by the way. I was going off the LWJGL ones, which use the ARB stuff. I've done a decent amount of 2D game dev in the past but never really messed with shaders until now; it seems like you can do some really cool stuff with them.

First of all, the current color value of the screen is not accessible in the fragment shader, but...

When you just want to tint the screen a bit, you can simply render something transparent (alpha blending) over the whole screen. Or you could add a uniform to your normal shader that enables/disables the night effect for each rendered object.
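For the transparent-overlay approach, the resulting pixel is just standard alpha-blending math; here is a plain-Java sketch of the per-channel result (assuming the usual GL_SRC_ALPHA / GL_ONE_MINUS_SRC_ALPHA blend function; the color values are made-up examples):

```java
public class NightTint {
    // standard alpha blending: out = src * a + dst * (1 - a)
    static float blend(float src, float dst, float alpha) {
        return src * alpha + dst * (1.0f - alpha);
    }

    public static void main(String[] args) {
        float alpha = 0.4f;                  // 40% opaque overlay
        float[] tint  = {0.0f, 0.0f, 0.5f};  // dark blue night color
        float[] scene = {0.5f, 0.5f, 0.5f};  // current screen pixel (mid grey)
        for (int i = 0; i < 3; i++) {
            // each channel is pulled toward the blue tint
            System.out.println(blend(tint[i], scene[i], alpha));
        }
    }
}
```

The same math is what the GPU applies per pixel when the overlay quad is drawn with blending enabled.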

So I am messing around with the rendering method for individual sprites. I think I'm starting to get an idea of how things'll work, but I still have to bind the TexCoord to the vertex shader as an attribute, correct?

You don't give the texture ID to the shader. Instead, you give it the "texture unit" -- which is by default zero (GL_TEXTURE0). The shader will then sample from whatever texture is bound at the specified unit. (If your fragment shader declares a sampler uniform, you point it at a unit with glUniform1i; since uniforms default to zero, sampling from unit zero often works without any explicit setup.)


// specify the active texture unit
// NOTE: this is optional since GL_TEXTURE0 is the default state
glActiveTexture(GL_TEXTURE0);

// this binds the texture ID to the active texture unit,
// which we set to be "unit zero"
glBindTexture(GL_TEXTURE_2D, myTexID);

I keep thinking about this in terms of me setting an attribute from my Java code in the shader program...

You can pass information to the shaders in two ways: as a per-vertex attribute or as a per-batch uniform. Something like texture coordinates needs to be passed as an attribute (glTexCoord2f in the old fixed-function style) since it is different for each vertex.
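In GLSL the two kinds of input are declared differently; a sketch (the names are illustrative, with u_nightTint being one possible way to feed the blue tint in as a per-batch value):

```glsl
// per-vertex inputs: a different value for every vertex
attribute vec2 a_texCoord;
attribute vec4 a_color;

// per-batch inputs: set once from Java via glUniform* and constant across the draw call
uniform sampler2D u_texture;
uniform vec4 u_nightTint;
```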

Like I said -- read up on some tutorials and books about OpenGL. It looks like you are using old-school attributes (gl_MultiTexCoord0) and outdated libraries (SlickUtil), which I wouldn't recommend if you're trying to learn OpenGL in the 21st century.
