I’ve got an issue where my camera jitters when I rotate left or right. It’s random, so pinpointing the problem has been difficult. Using PIX I can see that there is a drop in frame rate when the jittering occurs. It’s not a constant jitter; it’s like a frame or two randomly takes longer to render. Right afterwards there is a jump in frame rate according to PIX, but since I lock my FPS to 60 in my render loop I don’t notice the upswing.

What is even more peculiar is that I don’t have to render anything to reproduce the issue. Just the camera in the world (no mesh, no sky, etc.), and when I rotate left or right the frame rate randomly dips. Without VSync and without the FPS lock, the jittering (lag) is even more noticeable. I don’t see an issue moving forward or backward, only rotating.

I have a damping mechanism in place for my camera movement. I thought it might be the culprit, so I removed everything in my camera class except the rotation calculations. I’m not at my desk, so this is pseudo code:

Likely the jitters are most obvious when rotating the camera, but not directly caused by the camera itself. They often occur when you’re not using a valid enumerated swap chain description (DXGI_MODE_DESC / FindClosestMatchingMode(), etc.). Does the debug runtime/layer give you any warnings about performance penalties?

@eppo - I'm rendering in windowed mode in DX9 using Discard as the swap effect and only the implicit swap chain.

How do you get your user-input values in order to rotate the camera? Is it possible that the user-input code is causing your frame-rate issues?

@Hodgman - I think you're on to something here. The movement of my camera is additive, meaning the end user can constantly feed input in. My render loop bails out to the window's message queue whenever there are items that need to be processed. This would explain the delay / lag / frame-rate drop.


My render loop checked for Windows messages and bailed out of the loop until the message queue was empty, then picked back up again. This normally resulted in one frame being dropped every so often, depending on the end user's key input. The random jittering (stuttering) resulted from measuring the frame duration inside the loop without taking the frames dropped to Windows message processing into account when moving objects. Since I've changed the timestamp to include the time it takes to clear the message queue, I'm able to smooth out the animation so it's not noticeable.
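The effect of that timestamp change can be shown with a deterministic toy simulation (the frame times, message-pump costs, and function names below are invented for illustration, not taken from the actual code):

```cpp
// Simulates a render loop whose message pump occasionally stalls.
// With includeMessageTime == false, the frame timestamp is taken
// AFTER draining messages, so the camera doesn't advance for the
// time spent in the pump -- the animation falls behind the wall
// clock and stutters. With true, the delta time spans the whole
// iteration and the pause is absorbed smoothly.
float simulate(bool includeMessageTime) {
    long clock = 0;      // fake wall clock, in milliseconds
    long lastStamp = 0;  // timestamp of the previous frame
    float yaw = 0.0f;
    const float speed = 1.0f;  // units per millisecond

    for (int frame = 0; frame < 6; ++frame) {
        // Every third frame, a burst of key messages costs 10 ms.
        clock += (frame % 3 == 0) ? 10 : 0;

        if (!includeMessageTime)
            lastStamp = clock;  // buggy: restart timer after the pump

        clock += 16;  // rendering always takes 16 ms
        yaw += speed * static_cast<float>(clock - lastStamp);
        lastStamp = clock;
    }
    return yaw;  // how far the camera actually turned
}
```

After six frames the wall clock has advanced 116 ms; the fixed loop turns the camera 116 units, while the buggy loop only turns it 96, and that missing motion is exactly the stutter.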

In most cases the ultimate solution would be to use DirectInput to handle peripheral input instead of the message queue. In my situation, DirectInput is not an optimal choice.