When in first-person camera mode, if I look at the edges of the screen while moving the camera around, the world seems to stretch. It only happens within about a 2-inch border around the screen, and it makes me rather dizzy.

Just seeing if anyone else experiences this, or knows how to get rid of it. Maybe it's my math? I don't think so, as I got my knowledge through a proven, successful book/course.

I use the standard 1.0f for the near plane. I've been changing that, but don't really notice a difference at all. 1, 1000, even -10 doesn't make a bit of difference. Pretty weird.

But yes, the FOV was, I think, the main cause. 60 degrees seemed best, even though I'm still not satisfied. I tried 45, but that was like looking through a scope (like in Halo, how it zooms in).

How do games do it then? I've never noticed this before. Maybe when you have a crosshair and aren't paying attention to the sides you don't notice it. Currently I'm just looking around, but it still shouldn't be this bad.

What I'm concerned about is making a spaceship type game. Warping the sides would suck.

View and projection matrices are held in my Camera class. The view matrix is updated relative to the camera by extracting its look, up, and right vectors. The projection matrix is set up using the D3DXMatrixPerspectiveFovLH() function, with a near plane of 1.0f and a far plane of 1000.0f; the whole thing is basically just default settings.

Maybe it looked zoomed in because I'm used to having it wider. You go from being used to the normal FOV while walking around to a narrower FOV instantaneously. It did look like I was zooming in with a scope, lol.

But I think I'm just not used to it. I'll keep messing around and try to get something that works. But I guess it's only the camera, and that's always fixable/tweakable, so I can worry about that later.

I use the standard 1.0f for the near plane. I've been changing that, but don't really notice a difference at all. 1, 1000, even -10 doesn't make a bit of difference.

Of course not: that doesn't change the projection, only the near clipping plane (and the distribution of z values in the z-buffer).

It's a completely natural phenomenon; (video) cameras with wide-angle lenses show the same effect. What changing the FOV does is basically crop the image around the center and enlarge that to fill the screen. So a smaller FOV reduces the effect, because you move the areas that show it off-screen.

How do games do it then? I've never noticed this before. Maybe when you have a crosshair and aren't paying attention to the sides you don't notice it. Currently I'm just looking around, but it still shouldn't be this bad.

All FPS games have this problem. Of course it varies depending on the FOV that the individual game uses. How noticeable it is may also vary depending on other factors. In other types of games you may not notice it so much (although all perspective projection based cameras have the same effect).

I played some Counter Strike tonight, specifically looking for the eyeglass effect. It was there... amazing. I never noticed it before, probably because there is so much happening at once that you don't take the time to notice things like that.

Another thing affected by this is fog. In a game like America's Army, where a lot of the levels have dense fog, it is a common 'trick' to scout around by looking with the side of the screen, because it lets you see a great deal further and get the drop on the enemy. If they just used radial fog, this wouldn't be an issue...