Hi guys, would really appreciate an answer to this as it's driving me mental. The v-sync bug in D3 is widespread, and there are lots of different fixes for it depending on what computer you have.

The issue is that the fps suddenly drops from 60 to 30, a stutter occurs, and then it goes back to normal. But here's my question: when I set it to "full screen windowed" instead of "full screen" (with v-sync enabled), the fps is solid at 60 and the game runs great with no tearing. As far as I can tell there is no difference between full screen windowed and full screen, so I'm wondering why this is happening and what the difference actually is. Would really appreciate an answer as it seems bizarre. Thanks!

In windowed mode the game's own vertical sync simply can't run; the actual synchronization is then handled by Windows instead of by Diablo itself. Only in exclusive full screen can Diablo take those things over from the window manager.

"Full Screen Windowed" means it is actually windowed, but the window has been stretched to fill the entire screen, which looks exactly like Full Screen at first glance, but it's not. Your fps is also capped at 60 while windowed because, on the desktop, the framerate is matched to the screen's refresh rate, which is 60 Hz in your case.

I'm not sure what bug you are referring to exactly, but the fact that your fps goes from 60 to 30 is "normal". When your video card can render at least 60 fps, it gets capped at 60; when it can't, even a dip to 59 fps makes v-sync snap the rate down to the next divisor of 60, which is 30. Drop below 30 (say 29 fps) and you get capped at 20, and so on. So to solve this, you can either lower the visual settings so that your computer can always render 60 fps, or you disable v-sync and you'll get exactly the framerate the video card can handle (30+ fps), but with tearing.
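The snapping-to-divisors behavior can be sketched in a few lines. This is a toy model, not anything from the game: it assumes a 60 Hz display and classic double-buffered v-sync, where a frame that misses a refresh has to wait for the next one.

```python
import math

def vsync_capped_fps(render_fps, refresh_hz=60):
    """With double-buffered v-sync, a frame that isn't ready in time must
    wait for the next refresh, so the effective rate snaps down to the
    nearest divisor of the refresh rate: 60, 30, 20, 15, ..."""
    refreshes_per_frame = math.ceil(refresh_hz / render_fps)
    return refresh_hz / refreshes_per_frame

for fps in (60, 59, 40, 30, 29):
    print(f"card renders {fps} fps -> displayed at {vsync_capped_fps(fps)} fps")
# card renders 59 fps -> displayed at 30.0 fps
# card renders 29 fps -> displayed at 20.0 fps
```

This is why a card that is only one frame short of 60 suddenly looks twice as slow.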

The only video cards that can still run between 30 and 60 fps with v-sync "on" are Nvidia cards using the newer drivers with the adaptive v-sync feature. If you are currently using that, then maybe that's the problem. I haven't tried the feature myself, but you can disable it in the Nvidia Control Panel and see if it's "smoother".

Ok, thanks, that's what I was looking for. So basically running it windowed forces it to run at 60 fps, matching the monitor's refresh rate (which is basically v-sync then?), which results in no tearing. Still makes no sense to me that the in-game v-sync can't do that without the fps drops though...

Do you know what "v-sync" actually means? The question you just asked is solved with a little bit of thought. "V-sync" is short for "vertical synchronization", which in practice means: "wait for a complete screen refresh before presenting the next image".

Without v-sync the program simply updates the graphics card's framebuffer whenever it wishes. The screen, however, is always reading at its own refresh rate, drawing the image line by line. Now you can already see what might happen: because those two operations aren't synchronized, the image can change halfway through a refresh, so the screen ends up showing parts of two different images at once, an effect called "tearing".

When you enable vertical synchronization, the program only presents a new frame once the screen has finished drawing the current one. That screen redraw happens (mostly) at a constant 60 times per second, so you limit the program to that speed.
However, if your PC can't keep up with that, the framerate has to drop to a number that divides evenly into 60: 30 is the first mark. Small example: suppose the PC can only render 40 fps. That means 0.025 seconds per frame, while your screen refreshes every 0.0167 seconds.

Without synchronization, the data being drawn would change during the second refresh of the screen, because the game has advanced by one tick: tearing.
V-sync prevents this: rather than swapping mid-refresh, it simply shows the same frame for two full refreshes, which is what reduces the fps to 30.
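The arithmetic in that 40 fps example can be checked directly (all numbers taken from the example above: a 60 Hz screen and a card that renders 40 fps):

```python
import math

refresh_hz = 60
render_fps = 40

frame_time = 1 / render_fps      # 0.025 s to render one frame
refresh_time = 1 / refresh_hz    # ~0.0167 s per screen refresh

# A 0.025 s frame is still being rendered when the second refresh starts,
# so with v-sync the finished frame is held on screen for 2 full refreshes:
refreshes_held = math.ceil(frame_time / refresh_time)
effective_fps = refresh_hz / refreshes_held

print(refreshes_held, effective_fps)  # 2 30.0
```

So a 40 fps card under v-sync displays exactly like a 30 fps one, which is the drop the original poster is seeing.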