I'm trying to render some terrain. It works quite well, but I'm having trouble with normals. I just figured out how normals work and managed to calculate normals for my terrain. However, because a triangle only has 3 points (= 3 normals), the lighting appears really blocky, with rough edges:

I don't know if this is normal or caused by an error in my code. I'm calculating a normal for each vertex and drawing everything with a VBO (using a triangle strip).

So is this normal behaviour, and how can I smooth these edges? I have googled a lot, but the solution seems to involve some shader magic (with also a lot of choices about which algorithm to use). Thank you for any help.

The underlying problem is that your terrain is blocky. Why did you use discrete height levels?

Well, it was just my first try: I created a heightmap using Perlin noise and created some vertices to render it. What would be the best way of rendering random terrain then? Using more rectangles or something?

I think Riven meant that the height-map values seem to be rounded to discrete values. If the terrain weren't made of discrete height values, it wouldn't look that blocky. Increasing the resolution won't really fix the problem that easily. You're basically getting aliasing when the height goes x --> x --> x+1 --> x+1 --> x+1 and so on. You need to make it go more smoothly from one value to another: x --> x+0.25 --> x+0.5 --> x+0.75 --> x+1, or something like that. In short, don't round the values to ints. =S
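To illustrate the point, here's a minimal sketch (the `noise` function is a made-up stand-in for a real Perlin noise sampler, not anyone's actual code): the only difference between the blocky and the smooth terrain is an `(int)` cast on the sampled height.

```java
// Sketch: keeping heightmap values as floats instead of casting to int,
// so the terrain slopes smoothly between samples. noise() is a
// hypothetical stand-in for a Perlin noise sample in [0, 1].
public class HeightmapDemo {
    static float noise(int x, int z) {
        // deterministic pseudo-noise, just for demonstration
        return (float) (Math.sin(x * 12.9898 + z * 78.233) * 0.5 + 0.5);
    }

    public static void main(String[] args) {
        float scale = 10.0f;
        // blocky: heights snap to discrete integer levels
        float blocky = (int) (noise(3, 4) * scale);
        // smooth: keep the fractional part of the height
        float smooth = noise(3, 4) * scale;
        System.out.println(blocky + " vs " + smooth);
    }
}
```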

The vertex normals need to be computed from the surface normals of all the shared triangles. It'd be helpful if you showed a top-down wireframe view, as some patterns work better than others. It should look like squares with inscribed X's, or diamonds, at all levels.
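A minimal sketch of that idea (names are made up, not from any particular engine): sum the face normals of every triangle touching the vertex, then normalize the result.

```java
// Sketch: a smooth vertex normal as the normalized sum of the face
// normals of all triangles that share the vertex.
public class NormalUtil {
    // Return a unit-length copy of a 3-component vector.
    public static float[] normalize(float[] v) {
        float len = (float) Math.sqrt(v[0]*v[0] + v[1]*v[1] + v[2]*v[2]);
        return new float[] { v[0]/len, v[1]/len, v[2]/len };
    }

    // Average the face normals shared by one vertex (sum + normalize).
    public static float[] vertexNormal(float[][] faceNormals) {
        float[] sum = new float[3];
        for (float[] n : faceNormals) {
            sum[0] += n[0]; sum[1] += n[1]; sum[2] += n[2];
        }
        return normalize(sum);
    }
}
```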

Omg! Now I noticed I'm rounding the values to ints (the depth map is a float array, so I didn't get what the problem was). That's what I get for reusing my code, lol.

When removing the (int) cast, the terrain looks like this:

Thanks for the help so far =) Much better, but there are still "squares with inscribed X's", like Roquen says. What would be the best way to fix these (like creating some kind of circle, or blurring it until there are no X's)?

Does it have any effect when you normalize the normal in the fragment shader instead of the vertex shader? I just tried both, but I don't see much difference; I don't even see a difference if I comment the normalizing out. Normalizing is just making sure the vector is between 0 and 1, am I right? Since I'm normalizing the vectors before passing them to the shader, I guess it has no use anyway.

A normal always has a length of 1. That's an assumption that must hold for the dot products to return correct values. Even if you normalize the normal in the vertex shader, after linear interpolation over the surface of a triangle it won't have a length of one. You can visualize the "valid" normal values as a sphere with a radius of 1. When we linearly interpolate between two normals, we won't be following the surface of the sphere (that's called slerp and is a lot more expensive); we'll be taking a shortcut straight through the inside of the sphere between the two normals. That's why you need to normalize it again. Granted, the difference isn't very big when the normal changes slowly over the surface, but there is a big difference when the normal changes quickly from vertex to vertex, since the interpolation took a shortcut almost straight through the sphere.

I hope that explains it. Also, normalize() is blazingly fast, since it's optimized to be called for every pixel. You also shouldn't have to normalize your vertices' normals, since they should already be normalized when you upload them to your VBO.
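The shortcut-through-the-sphere effect is easy to check numerically. This little sketch (plain Java, just to demonstrate the math) linearly interpolates halfway between two perpendicular unit normals and shows the result is shorter than 1:

```java
// Sketch: linear interpolation between two unit normals produces a
// vector shorter than 1, which is why the fragment shader re-normalizes.
public class LerpDemo {
    public static float length(float[] v) {
        return (float) Math.sqrt(v[0]*v[0] + v[1]*v[1] + v[2]*v[2]);
    }

    // Component-wise linear interpolation, like the GPU does between vertices.
    public static float[] lerp(float[] a, float[] b, float t) {
        return new float[] {
            a[0] + (b[0] - a[0]) * t,
            a[1] + (b[1] - a[1]) * t,
            a[2] + (b[2] - a[2]) * t
        };
    }

    public static void main(String[] args) {
        float[] n1 = {1, 0, 0};
        float[] n2 = {0, 1, 0};
        float[] mid = lerp(n1, n2, 0.5f);
        System.out.println(length(mid)); // about 0.707, not 1
    }
}
```

The faster the normal changes between vertices, the shorter (and thus wronger) the interpolated vector gets before re-normalization.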

Thank you, I get it now. "You also shouldn't have to normalize your vertices' normals since they should already be normalized when you upload them to your VBO." That was the part about renormalizing I didn't get, because I already send them normalized, but the explanation about interpolation made it very clear.

Even if your vertex normals are similar it often helps to re-normalize in the fragment shader. The difference isn't that much when you're doing nice smooth diffuse lighting, but for something like specular highlights you'll only get the nice, tight, sharp highlights if you re-normalize.

It was the nicest seamless texture I could find; while programming the shader, it has gone through many different shades of green along the way. Already fixed it: I had diffuse set to 2.0 instead of 1.0.

Now I need to get multitexturing to work and add water. What should I do first...

You might be interested in techniques like texture splatting and soft water edges then? =S

Indeed. However, it seems most engines use huge alpha maps to determine each texture for multi-textured terrain. As a first try, I'm going to pass a byte as an attribute with each vertex, containing the terrain type. I hope it's possible to interpolate between the types some way; otherwise I'll try the alpha map (a 128*128 image for each chunk doesn't seem too bad).

I've got another question; to go with the flow of the story, I'll post it in my old thread. The terrain rendering is coming along nicely, but I'm having a problem with interpolating in the shader.

* The dirt & grass images are stored in the same texture, hence why I'm using +0.5 (take part 2 of the texture).
** mix can also mess up textures with some combinations; I need to find some replacement for that one too, but I guess that's easier than my first problem.

Also, that's not how you usually do texture splatting. Let's say you have 4 different terrain types: 0 is grass, 1 is dirt, 2 is snow and 3 is rock. What if you want to do a transition from dirt to rock, or snow to grass? With your code it's only possible to do smooth transitions between adjacent types, which limits its use case a lot. Instead, I'd recommend using 4 different values, one for each type. Bytes should be enough for it (256 different values). In the fragment shader, just sample all 4 of them (should be cheaper than branching anyway) and blend them based on the 4 values. The sum of the values should equal 1.0 at all times, of course; you might have to ensure that manually, since the values are interpolated. Anyway, the point is that this will give you fine control over how the terrain types are blended.
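A CPU-side sketch of that blending step, assuming four per-vertex weights (grass, dirt, snow, rock); the class and method names are made up for illustration. The key detail is re-normalizing the interpolated weights so they sum to 1 before blending the samples:

```java
// Sketch of weight-based splat blending: one weight per terrain type,
// renormalized so the weights sum to 1, then used to mix the samples.
public class SplatBlend {
    // weights: interpolated per-vertex splat weights (may not sum to 1)
    // samples: the value sampled from each terrain texture
    public static float blend(float[] weights, float[] samples) {
        float sum = 0;
        for (float w : weights) sum += w;
        float out = 0;
        for (int i = 0; i < weights.length; i++) {
            out += (weights[i] / sum) * samples[i];
        }
        return out;
    }
}
```

In a real shader this would be a `dot()` of a weight vector with the four texture samples, but the arithmetic is the same.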

That's clear. Since I already had a float array, I'm just using a float for simplicity while testing. Like you're saying, I'm indeed planning to use some bytes for multiple textures.

Assuming 1 < t < 2, dist will be between 0 and 0.6. If t <= 1, then the texture doesn't get mixed. Shouldn't dist range from 0 to 1?

You are correct. When I leave this out, they are smoother, but still really pointy. Multiplying by 0.6 exaggerates this effect, so it's easier to show.

Multiplying by 1.5 gives the best effect, but it's still not good enough.

If you scale the boundary between transitions, you still need to normalize the "mix" range. Given the lack of information, there is no way to know the origin of the problem if that is not it. But regardless of what the original issue is, there will be extra artifacts if that problem is not fixed.
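One way to read "normalize the mix range": whatever scaling you apply at the transition boundary, the blend factor handed to mix should still cover the full 0-to-1 range and saturate outside it. A hedged sketch in plain Java (the names are made up; in GLSL this would just be `clamp(t - 1.0, 0.0, 1.0)`):

```java
// Sketch: map the terrain-type value t in [1, 2] to a blend factor
// clamped to [0, 1], mirroring GLSL's clamp(); values outside saturate.
public class MixRange {
    public static float clamp(float x, float lo, float hi) {
        return Math.max(lo, Math.min(hi, x));
    }

    // 0 at t <= 1 (pure base texture), 1 at t >= 2 (pure second texture)
    public static float mixFactor(float t) {
        return clamp(t - 1.0f, 0.0f, 1.0f);
    }
}
```

Scaling by 0.6 or 1.5 instead means the factor never reaches (or overshoots) 1, which shows up as the hard or pointy edges described above.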

So each vertex has a value for the texture (float terrain) that gets passed to the fragment shader (float t). If t > 1, then the second texture gets mixed onto the base texture. As an experiment I'm trying to merge multiple textures into one (so many textures next to each other); I'm just mentioning this to explain the "missing" textures. It works like it should, so don't worry about it.

However, the sampling of the second texture gives very hard edges (dist = (2-t) * 1.6; gives a slightly smoother effect, but still not good enough). I think this has the same problem as with the normals, since the triangle only has 3 values to interpolate between. So is there any way to interpolate nicely between those values in the fragment shader, like with the normals (from very blocky to perfectly round)?

I didn't get the part about normalizing; c is the value passed to gl_FragColor.
