Simple OpenGL 2D test - nothing displaying

I'm sure there is something simple I'm missing here, but I can't seem to spot it. I'm just trying a simple program to draw a triangle to the screen, but the display just shows black. When I put some simple debug in the drawView method, I can verify that 'drawView' is getting called, but I don't see anything drawn to the screen in the iPhone simulator.

Here's the code I'm using.

First, I'm basing the project off of the GLGravity example. In AppController, the 'setupView' method looks like this:

At a glance, I don't see anything obvious except that your vertices are just a couple of pixels away from each other, while your projection matrix is presumably set to the size of your screen in glOrtho, which is many pixels. IOW, what you're drawing may be a tiny triangle on a big screen. Try spreading your vertices out more, or use glScalef(100.0f, 100.0f, 1.0f), or something like that.

Hmm... interesting. First, what did you mean by 'glEnable2d' being a typo? Everything in the 'code' blocks in my original post is copied directly from the code.

But one thing I did notice was that there was a gl error being generated I hadn't noticed before.

I tracked it down, and something isn't right with the glOrthof call, because I'm getting an OpenGL 0x0501 (GL_INVALID_VALUE) error after that line.

I think the real problem is that the vPort array isn't being set correctly for some reason. After the call to glGetIntegerv(GL_VIEWPORT, vPort), vPort[2] and vPort[3] are both zero. That doesn't seem right...

My glEnable2D() method is in a separate .c file, but that shouldn't matter....

First, check for errors. The API arguments look good to me, so let's assume there aren't any.

Next, inspect the state. AnotherJake's conclusion is reasonable-- this is likely a problem with your vertex transformation. So, what's the state that could interact and cause the problem?

Here, you need to understand how transformations work, which the documentation explains in great detail, with examples in the wiki. To summarize:

Each vertex position you submit is treated as a 4-component XYZW vector, defined in Object space, which is measured in whatever arbitrary units you want to use.

It is multiplied by the concatenated ModelView and Projection matrices to transform from Object space to Clip space.

After clipping, the resulting vertex positions are scaled and biased by the Viewport, to produce final window positions.

That's a very generalized transformation system, designed for 3D graphics. For a 2D game, here's how you'd typically use this system:

You set up an orthographic projection, covering the whole screen. For example from [0,0] to [320,480].

You set up the Viewport, covering the whole screen again.

You specify vertex positions, using screen pixels as the coordinate system. For example a square from [10,10] to [20,20].

The positions get squished down to [-1, 1] during transformation, for clipping purposes.

After clipping, they get multiplied by the Viewport scale to produce (in the special case of orthographic projection) the same values you fed in-- [10, 10] to [20, 20].

If you compare these steps with the code you posted, there are two problems. Your input vertex positions are very small, like AnotherJake said. And you never called glViewport, so if you check the state, it's probably zero (unless some other code snippet you didn't post has already set it). As the documentation says, the viewport is initially zero, but is set to the width/height of the first window the context is attached to. That is a window-system convention, and on the iPhone there are never any windows, only FBOs. So it's up to you to set the viewport.

You can always do all of the math involved here manually to double-check the results you'll get-- you provided all of the input data (matrices, viewport, and vertex positions), so you can multiply it yourself for debugging.

SaxMan Wrote:My glEnable2D() method is in a separate .c file, but that shouldn't matter....

The gl prefix is reserved for the OpenGL API namespace. It's not technically "wrong" that you used it, but it's poor coding practice, and it forced us to "presume" you had made a typo. Without the gl prefix we can instantly tell that it's your own custom function, or at least not an OpenGL call. So it's highly recommended that you avoid the "gl" prefix for your own functions. The only time it's acceptable is when you're actually writing a replacement for a function not included in the API you're using (which can be necessary from time to time with OpenGL ES, but is still unusual).

Now, as I'm tracing this bug, I'm running into a very strange problem. Part of the idea behind my design is to separate as many (if not all) of the OpenGL calls into separate .c files, to make my OpenGL code as portable as possible -- the hope being that it wouldn't take much effort to take just the OpenGL .c files, move them between platforms, and re-compile.

That being said, something strange seems to be happening to argument passing as I attempt to do this. Take this simple C function, for example:

If I put the 'testWidthAndHeight' function in the AppController.m file, everything works fine, and I see 320 and 480 printed to the debug console. If, however, all I do is move that function to a separate .c file (graphicsFW.c), I get the following output:

Heh... I was going to suggest making sure you flush / swap your buffer, which seemed to slip between the cracks sometimes when I was starting out. Guess you're taking care of everything already... Good luck!