And if you are putting this code in the glView class, then it would just be:

Code:

CGPoint touchLocation = [touch locationInView:self];

Also, just an FYI: if you're making an OpenGL view app, you'll probably want to use their template that (now) goes through a view controller. There may be things down the line that you'll want to do that you can't do easily without manipulating the view controller of your view (e.g., presenting a video).
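For what it's worth, conceptually all locationInView:self does is express the window-space touch point in the receiving view's coordinate space. A minimal C++ sketch of that idea, under the simplifying assumption that the view has no transform (the names here are illustrative, not UIKit API):

```cpp
#include <cassert>

struct Point { float x, y; };

// Express a window-space point in a view's local coordinate space by
// subtracting the view's origin. UIKit's real conversion also accounts
// for view transforms, which this sketch ignores.
Point windowToView(Point windowPoint, Point viewOriginInWindow) {
    return { windowPoint.x - viewOriginInWindow.x,
             windowPoint.y - viewOriginInWindow.y };
}
```

For example, with a view pushed down 20 points by the status bar, a window-space touch at (100, 120) comes out as (100, 100) in view space.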

I've tried [touch locationInView:self] (self is glView), and also [touch locationInView:nil]. Same result. I also tried creating a separate view, and adding it to my window, just to capture the touches. Same results!

Not so keen to get video into my game. In fact, it's mostly C++; I try to avoid Objective-C wherever possible.

Can you recreate it in a stripped down version; i.e., only the app delegate class, gl view class and touch code that registers an NSLog? If so, can you provide this code to examine?

As for a view controller, it's completely up to you. It won't slow down your rendering code any more than it is now and you can leave all the engine and rendering code in C++ even if you switch to the view controller setup. In my experience, if you don't stick with Apple's templates, you'll almost always have some headaches in the future with other things not quite working right. I *think* Game Center would also require you to work with view controllers, for instance, though I'm not sure on that since I haven't yet integrated GC into anything. It is of course your choice though!

(Jul 28, 2011 11:10 AM)alerus Wrote: Can you recreate it in a stripped down version; i.e., only the app delegate class, gl view class and touch code that registers an NSLog? If so, can you provide this code to examine?

Add the following line in MyView.m (at line 167):
NSLog(@"moved : %f %f",position.x,position.y);

Compile and run the app on a device.

(Jul 28, 2011 11:10 AM)alerus Wrote: As for a view controller, it's completely up to you. It won't slow down your rendering code any more than it is now and you can leave all the engine and rendering code in C++ even if you switch to the view controller setup. In my experience, if you don't stick with Apple's templates, you'll almost always have some headaches in the future with other things not quite working right. I *think* Game Center would also require you to work with view controllers, for instance, though I'm not sure on that since I haven't yet integrated GC into anything. It is of course your choice though!

Yeah, I think I'll create a new project using their current OpenGL template and log out the touch positions, see if I get more joy with that!

Add the key UIStatusBarHidden with a value of true to your Info.plist as well to hide the status bar, so you can test behind it.

My range is from 0 to 470: it begins 20 points from the edge on the status bar side, and the point 470 is about 10 points from the home button edge. So I guess that's the usable range. However, this still gives 480 points in a space of 460 pixels. So wtf?

This prints out correct non-negative coordinates. When trying to click toward the left side I was able to get the x-coordinate down to a value of 1.0; I could not get it to zero, perhaps because you can't actually click on 0. Similarly, the furthest on the right I could get was 319 and the furthest y-coordinate for the bottom was 479. The furthest to the top I could get was 20, because above that is the status bar which disables the clicking.
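Those observed bounds can be captured in a quick predicate for sanity-checking logged touches. A C++ sketch, where the limits (x in 1..319, y in 20..479, with the 20-point status bar blocking the top) are taken from the observations above and are assumptions to re-measure on your own device:

```cpp
#include <cassert>

struct Point { float x, y; };

// True if a touch point falls inside the empirically observed tappable
// region on a 320x480 screen with the status bar visible. The bounds are
// measured values from this thread, not anything guaranteed by UIKit.
bool inObservedTouchArea(Point p) {
    const float statusBarHeight = 20.0f;  // touches above this never arrive
    return p.x >= 1.0f && p.x <= 319.0f &&
           p.y >= statusBarHeight && p.y <= 479.0f;
}
```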

If we're getting different results that's quite bizarre. Can you try that code I listed exactly in the classic project just to make sure?

Oh, and I did try your moved code as well, but didn't get negative numbers then either :-/

I also tried on the device with similar results. It's much more difficult to hit the bounds (I believe because on the device it's doing an averaging of all the points being touched), but still no negative numbers.

I get the same values with your output as with mine. I can only get values on initial tap from 0 to 470, and then at the 0 side I can drag another -6 points to the edge. It won't detect a tap in that region, so your code won't show anything below 0.
And it works normally on the simulator. This problem is only occurring on the device itself.

"Apple is aware of the problem, and a bug report has been filed. This is possibly related to the OS's tendency to register touches several pixels above the physical contact point. It's not always possible to see values at the very extremes of the device due to a lack of sensors beyond the edges of the screen.

For the time being, you could work around the issue by offsetting the touches back to the expected range. Make sure to key that offsetting code to the version of iOS on which they determined the bounds, since that will change when/if Apple fixes this bug."
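One way to sketch that offsetting workaround in C++: linearly remap the empirically measured touch range back onto the expected screen range, clamping at the ends. The observed bounds here (20..470 along the long axis, mapped back to 0..479) are taken from the measurements earlier in the thread and would need re-measuring per iOS version, as the quote suggests:

```cpp
#include <cassert>

// Remap a coordinate from the empirically observed touch range
// [observedMin, observedMax] onto the expected range
// [expectedMin, expectedMax], clamped. The observed bounds should be
// keyed to the iOS version they were measured on.
float remapTouch(float value, float observedMin, float observedMax,
                 float expectedMin, float expectedMax) {
    float t = (value - observedMin) / (observedMax - observedMin);
    if (t < 0.0f) t = 0.0f;
    if (t > 1.0f) t = 1.0f;
    return expectedMin + t * (expectedMax - expectedMin);
}
```

So a raw reading of 20 maps to 0, and a raw reading of 470 maps to 479.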

In practice I've found it best not to put small touch areas near the edge of the screen, and/or to let them detect touches that are "close enough" (i.e., adding an invisible touch-detection area around the button).
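That "close enough" detection can be sketched in C++ as hit-testing against the button's rect grown by a slop margin on every side. The margin size is a design choice, not anything dictated by UIKit:

```cpp
#include <cassert>

struct Rect { float x, y, w, h; };

// True if (px, py) falls within the rect expanded by `slop` points on
// every side -- an invisible touch area around the visible button.
bool hitTestWithSlop(float px, float py, Rect r, float slop) {
    return px >= r.x - slop && px <= r.x + r.w + slop &&
           py >= r.y - slop && py <= r.y + r.h + slop;
}
```

For example, a 20x20 button in the bottom-right corner at (300, 440) with a 10-point slop will still catch touches that land a few points past the visible edge.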