Texture in OpenGL ES 2 looks pixelated

I suspect it's because I need to add retina support or something. I tried adding [self setContentScaleFactor:2] in my GLView, but it doesn't work; it just makes the screen purple.
I've also set the min and mag filters to GL_LINEAR.

I'm sure it's just something stupid that I'm missing... The texture's dimensions are powers of two...

Something weird: my mainScreen bounds are 320 X 460. Is that supposed to be like that? It runs now, but only in the top-left quarter of the screen.
Is changing the content scale factor the only thing I'm supposed to do?

BTW, when I'm running the simulator with the normal iPhone selected as the device, it shows a picture of an iPhone 4, so I thought it would simulate retina support, but it doesn't; and when I choose iPhone (Retina) it shows an iPad, but it's actually simulating a retina iPhone :| Not a big deal, but I'm wondering if it's like that for everyone or if it's just me.

(Aug 26, 2011 09:29 AM)vunterslaush Wrote: OK, fixed it. I changed glViewport and multiplied the width and height by 2.
Does that make any sense, or is my programming way off?

Multiplying by 2 will be correct for the specific case you're writing for at the moment, but isn't particularly futureproof, and would mean you'd have to use different code depending on whether you're running on a retina display or not. That's fine if you're only writing for the iPhone 4, but if you want your code to run on an iPad or an older iPhone (or a newer one with a potentially different resolution), you may want a more general approach. I do it like this:
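(The original code listing from this post isn't reproduced above; as a sketch of the general idea, the framebuffer size can be derived from the view's point size and its content scale factor rather than a hard-coded factor of 2. The function below is a hypothetical plain-C illustration; on iOS the inputs would come from the view's bounds and contentScaleFactor.)

```c
#include <assert.h>

/* Hypothetical sketch: derive pixel dimensions from point dimensions
   and the display's scale factor, instead of hard-coding "* 2".
   On iOS, pointWidth/pointHeight would come from the view's bounds
   and scale from contentScaleFactor (1.0 non-retina, 2.0 retina). */
typedef struct { int width, height; } PixelSize;

static PixelSize framebufferSize(float pointWidth, float pointHeight, float scale) {
    PixelSize size;
    size.width  = (int)(pointWidth  * scale + 0.5f);
    size.height = (int)(pointHeight * scale + 0.5f);
    return size;
}

/* The viewport then uses the computed size, so the same code works on
   any display density:
       glViewport(0, 0, size.width, size.height); */
```

This way the retina case falls out of the scale factor automatically instead of being a special case in the drawing code.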

Another comparison screenshot, maybe? The UIImageView in your previous one looks like simple linear magnification to me, so I'd expect GL to do just as good a job of it...

(Aug 26, 2011 09:10 AM)vunterslaush Wrote: BTW, when I'm running the simulator with the normal iPhone selected as the device, it shows a picture of an iPhone 4, so I thought it would simulate retina support, but it doesn't; and when I choose iPhone (Retina) it shows an iPad, but it's actually simulating a retina iPhone :| Not a big deal, but I'm wondering if it's like that for everyone or if it's just me.

Yeah, the simulator's window doesn't look like an iPhone in retina mode for some reason. It should be the correct size for a retina display, though, whereas the iPad simulator window is somewhat smaller with a wider aspect ratio.

(Aug 26, 2011 10:32 AM)ThemsAllTook Wrote: Multiplying by 2 will be correct for the specific case you're writing for at the moment, but isn't particularly futureproof, and would mean you'd have to use different code depending on whether you're running on a retina display or not. That's fine if you're only writing for the iPhone 4, but if you want your code to run on an iPad or an older iPhone (or a newer one with a potentially different resolution), you may want a more general approach. I do it like this: ...

Yeah, I looked at your code, thank you very much! Actually I multiplied it by [UIScreen mainScreen].scale; or should I use what you suggested?
I guess what you did is more compatible, since my solution assumes the width/height ratio stays the same... Well, I don't plan on developing for the iPad because I don't have one, but I'll keep that in mind.

(Aug 26, 2011 10:32 AM)ThemsAllTook Wrote: Another comparison screenshot, maybe? The UIImageView in your previous one looks like simple linear magnification to me, so I'd expect GL to do just as good a job of it...

Of course: Screenshot
BTW, on second thought, I enlarged the GL texture a bit so it would be more similar to the image view version, and it does look pretty much identical, except for that black outline sort of thing. How can I fix that?

BTW, ThemsAllTook, thanks a lot for everything!!! You are helping me big time, I owe you!

(Aug 26, 2011 10:59 AM)vunterslaush Wrote: Screenshot
BTW, on second thought, I enlarged the GL texture a bit so it would be more similar to the image view version, and it does look pretty much identical, except for that black outline sort of thing. How can I fix that?

Aha, you've hit another classic problem. Short version: Your texture appears to be using premultiplied alpha; use glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA) and the black halo should disappear.
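(Editorial illustration, not from the original post: the per-channel math behind glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA) can be sketched in plain C. With a premultiplied source, the source term is added at full weight because the alpha scaling already happened at load time; for a single layer it produces the same result as conventional GL_SRC_ALPHA blending of a straight-alpha source.)

```c
#include <assert.h>
#include <math.h>

/* One color channel of GL_ONE, GL_ONE_MINUS_SRC_ALPHA blending:
   the source value is assumed to be premultiplied by its alpha. */
static float blend_premultiplied(float srcPremul, float dst, float srcAlpha) {
    return srcPremul + dst * (1.0f - srcAlpha);
}

/* The same channel with GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA and a
   straight (non-premultiplied) source, for comparison. */
static float blend_straight(float src, float dst, float srcAlpha) {
    return src * srcAlpha + dst * (1.0f - srcAlpha);
}
```

For a single blended layer the two agree; the premultiplied form is what avoids the dark halo once texture filtering has mixed transparent and opaque texels.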

Longer version: A bitmap image with premultiplied alpha has its R, G, and B components scaled by the A component, and has to be composited differently in order to be drawn without semi-transparent portions looking darker than they should. For example, 50% transparent red would be 0x7F00007F premultiplied, rather than 0xFF00007F unpremultiplied. This has some advantages, though; when pixels are interpolated due to image scaling, transparent color values mingle with opaque ones in a way that can cause black or white halos (depending on a few things) around edges in a nonpremultiplied image. Premultiplied images (when composited correctly) categorically avoid this problem, because the math works out correctly when transparent pixels blend with opaque ones.
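(Editorial illustration of the paragraph above: premultiplying an RGBA8 pixel just scales R, G, and B by A/255. The function is a hypothetical sketch, but the 50%-transparent-red example is the one from the text.)

```c
#include <assert.h>
#include <stdint.h>

/* Premultiply one RGBA8 pixel (packed as 0xRRGGBBAA):
   scale the R, G, and B channels by A/255, leave A unchanged. */
static uint32_t premultiply(uint32_t rgba) {
    uint8_t r = (uint8_t)(rgba >> 24);
    uint8_t g = (uint8_t)(rgba >> 16);
    uint8_t b = (uint8_t)(rgba >> 8);
    uint8_t a = (uint8_t)(rgba);
    r = (uint8_t)(r * a / 255);
    g = (uint8_t)(g * a / 255);
    b = (uint8_t)(b * a / 255);
    return ((uint32_t)r << 24) | ((uint32_t)g << 16) | ((uint32_t)b << 8) | a;
}
```

Note that fully transparent pixels come out as 0x00000000 regardless of their original color, which is exactly why interpolated edge texels stop dragging stray color into the blend.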

As for why your pixels are premultiplied, I presume you're loading the texture with UIImage or some other Apple API? Apple's image loading functions will typically premultiply your data for you, though in some cases you can disable it. If you want to load the image without premultiplying, you can (maybe? haven't done it in a while) change the options you use to load the image, or use libpng directly.

(Aug 26, 2011 11:51 AM)ThemsAllTook Wrote: As for why your pixels are premultiplied, I presume you're loading the texture with UIImage or some other Apple API? Apple's image loading functions will typically premultiply your data for you, though in some cases you can disable it. If you want to load the image without premultiplying, you can (maybe? haven't done it in a while) change the options you use to load the image, or use libpng directly.

Yeah, I'm using UIImage and CGBitmapContext. (Everything is copied and pasted from here)
Should I bother with it? Why do I care if my images are premultiplied, as long as they're shown correctly? What are the benefits?

One last question, which is probably more related to my previous thread:
If I manually sort my objects/sprites by depth and then call glDrawWhatever in that order, can I ditch the depth test and the depth buffer?

(Aug 26, 2011 12:15 PM)vunterslaush Wrote: Should I bother with it? Why do I care if my images are premultiplied if they are shown correctly? What are the benefits?

In the vast majority of cases, it doesn't matter; for loading images you want to display with OpenGL, premultiplied alpha is almost always what you want. The only situations I can think of where you'd really need nonpremultiplied alpha are if you're encoding data into a bitmap format for something other than display (say, a terrain map of some sort) and need all four channels to do so, or if you're editing the image and you need to preserve colors that might be destroyed by 0 alpha. Premultiplication is technically a lossy operation, so it's good to be aware of what it is and does, but for simply displaying the image, the lossiness isn't relevant.
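(Editorial illustration, not from the original post: the lossiness mentioned above is easy to demonstrate with a single-channel round trip. Both helper functions are hypothetical sketches; integer division quantizes low-alpha values, and alpha 0 destroys the color entirely.)

```c
#include <assert.h>
#include <stdint.h>

/* Premultiply a single 8-bit channel by an 8-bit alpha. */
static uint8_t premul(uint8_t c, uint8_t a) {
    return (uint8_t)(c * a / 255);
}

/* Attempt to undo the premultiplication. With a == 0 the original
   color is gone; with small a it comes back heavily quantized. */
static uint8_t unpremul(uint8_t c, uint8_t a) {
    return a ? (uint8_t)(c * 255 / a) : 0;
}
```

This is the sense in which premultiplication is lossy; for plain display it never matters, but it does if you're round-tripping the pixels as data.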

(Aug 26, 2011 12:15 PM)vunterslaush Wrote: One last question, which is probably more related to my previous thread:
If I manually sort my objects/sprites by depth and then call glDrawWhatever in that order, can I ditch the depth test and the depth buffer?
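(Editorial illustration of what the question describes: a painter's-algorithm draw order amounts to sorting sprites back to front before issuing draw calls. Sprite is a hypothetical type, only the sort is shown, and larger depth values are assumed to be farther from the camera.)

```c
#include <assert.h>
#include <stdlib.h>

/* Hypothetical sprite: a depth value plus whatever draw state it needs. */
typedef struct { float depth; int textureID; } Sprite;

/* Descending by depth, so the farthest sprite is drawn first. */
static int compareByDepth(const void *a, const void *b) {
    float da = ((const Sprite *)a)->depth;
    float db = ((const Sprite *)b)->depth;
    return (db > da) - (db < da);
}

/* Sort back to front, then draw in array order with blending enabled
   and no depth test. */
static void sortBackToFront(Sprite *sprites, size_t count) {
    qsort(sprites, count, sizeof(Sprite), compareByDepth);
}
```

For 2D sprites with alpha blending, this back-to-front ordering is typically necessary for correct blending anyway, since blending and depth writes don't mix well.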

New questions:
I've started a new project from Apple's OpenGL template, and it automatically supports both retina and non-retina displays?
And another question: I edited the shaders to support textures, and I'm playing around with it to check that everything is OK, and of course it isn't. I have two quads with the same texture (a grayscale shaded ball). One sets all its vertex colors to green; the other ball is red, but all the alpha values of its corner vertices are 0, so I expected the red ball not to show at all. However, this is what I get: Screenshot

I'm using glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA) and glEnable(GL_BLEND), these are my shaders:

I added that in the vertex shader and it worked! Thanks!
Just so I understand: is that OK? Since I'm premultiplying the alpha when I load the textures, is the line you wrote the standard thing to do to make everything work? Or am I doing something wrong that I should change?