Hey guys, this might not be true for all of you, but for those seeing this behavior in the editor: do you have your Scene view and Game view next to each other? I've noticed that when I go from a split view to a single Game view, the Gfx.WaitForPresent disappears.

I thought I should chime in here too, because I've also been struggling with this issue for a while and would love an explanation of why it happens. This Gfx.WaitForPresent thing seems to eat up all available performance. I need to keep a constant 75 fps for my program, but I can only get about 60-62 fps in Unity. The really weird part is this: say I'm at 62 fps and Gfx.WaitForPresent is using about 33% of the processor according to the profiler. If I disable much of the detail and the scripts taking up the rest of the CPU time, all that happens is Gfx.WaitForPresent jumps to 80% and no speedup is experienced. It really seems like a VSync-type issue to me, but I have VSync disabled in Unity. Could we get a new, detailed explanation of what Gfx.WaitForPresent actually does and why people are having these problems? Thank you.

The same happens here in 4.5.1f3, leading to stutters in the camera movement. I've made a plot of the value of Time.deltaTime in Excel. Here's a screenshot:

I've marked the normal deltaTime in green. It seems that in some cases the frame is presented too late, which leads to a double frame time (marked red). After that there are often some frames that are quicker, balanced by frames that are slower right after. Those come in sets (marked yellow) that average out to the normal deltaTime.
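
For anyone who wants to reproduce this kind of plot, here is a minimal sketch of a logging script (the component name and CSV path are my own choices, not anything from Unity; attach it to any GameObject and open the CSV in Excel):

```csharp
using System.IO;
using UnityEngine;

// Hypothetical helper: logs Time.deltaTime each frame to a CSV file
// so frame-time spikes (the red "double frame" cases) can be plotted.
public class DeltaTimeLogger : MonoBehaviour
{
    StreamWriter writer;

    void Start()
    {
        // persistentDataPath is writable on all platforms.
        string path = Path.Combine(Application.persistentDataPath, "deltaTime.csv");
        writer = new StreamWriter(path);
        writer.WriteLine("frame,deltaTime");
    }

    void Update()
    {
        writer.WriteLine($"{Time.frameCount},{Time.deltaTime:F6}");
    }

    void OnDestroy()
    {
        writer?.Close();
    }
}
```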

Hi guys, we had stuttering problems in our project, with Gfx.WaitForPresent spiking in the profiler. We found that disabling VSync in the project and forcing VSync in the NVIDIA Control Panel fixed the problem (Catalyst if you have an AMD card). Now our game runs smooth as hell. It looks like there is a problem with video card drivers and the VSync implementation in some applications (it happens in a lot of games, not only with Unity).
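
If you want to try this, the project-side half can also be done from script rather than the Quality Settings panel; a minimal sketch (class name is mine, the API call is standard Unity):

```csharp
using UnityEngine;

// Minimal sketch: turn off Unity's own vsync so whatever is forced
// in the driver control panel (NVIDIA / Catalyst) is the setting in effect.
public class DisableEngineVSync : MonoBehaviour
{
    void Awake()
    {
        QualitySettings.vSyncCount = 0; // 0 = don't sync to vblank in Unity
    }
}
```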

Forcing VSync from Catalyst doesn't seem to work in the editor, so I'm running at 4000 FPS and it does still seem to stutter. I'll test with a published exe.

Edit: The published exe runs very smooth full screen. The vsync still doesn't catch on though.

Edit2: If I enable VSync in Unity, the VSync does work and it still runs smoothly, full screen and windowed. If I publish to a web player, the stutter returns. So for now the issue only seems to arise in the editor (which is not really important) and in the web player, in this case in Firefox.

I am having this problem with Unity 5 as well. I am also using a dual-monitor setup. The game stutters when running in the editor (VSync is turned off). Builds of the game show the same behaviour, even when running fullscreen on my primary monitor. I can even see the lagging on the animated Unity splash screen. If it helps, I am using a GeForce GTX 770 with the latest drivers.

Same problem here. I have VSync off, and I tried playing and disabling everything in the scene one by one; Gfx.WaitForPresent was still there, using like 96% and a good amount of ms, even after I disabled everything except the camera... Is this like "system idle time"?

Well, the problem was solved for me (by chance) by deactivating Gizmos inside the Game window.
Gfx.WaitForPresent still appears, though less often, and the frame rate went from ~23 FPS to ~950 FPS (since VSync is disabled).

What my experiment shows, as you can see in the attached image, is that the CPU usage graph in the Unity Profiler reflects all of the tasks on your computer, not only the work in your Unity project.
The name of the spike in the red graph is 'Gfx.WaitForPresent', as all of you said.

Importantly, if you look at the Profiler graph on another device (another computer, a mobile, whatever), each device will show you a different CPU usage graph!

I didn't know Unity's graph was affected by all the other CPU tasks on my device (some people may have known or guessed this, but without seeing it demonstrated like this); I had thought it showed the performance of my project only.

So Unity should give us a profiler that shows the performance of our project only, not every process on the device!


I seriously doubt Unity is profiling CPU tasks outside of itself; it's just that when you are using other applications, the CPU/cores are obviously being shared, so tasks in Unity will take even longer than normal. Hence the change when you are using Photoshop in your images.

Indeed, this is why you should not judge absolute performance from the profiler when running the project in the Unity editor. Instead it's better to run the project as a build on the target platform in development mode and have it auto-connect to the profiler, to avoid much of the editor overhead. It should also go without saying that all other applications should be closed.

That is not to say profiling the active project in the Unity editor is useless, as it depends on what you are trying to measure, but you should be aware that other applications running in the background will affect it.

I got it, thanks for the tip.
Still, I want to see only the CPU usage of my project, without the other tasks on my computer.
Is there a way to remove their graphs, so that only the performance of my project is shown?
I don't think the Unity Profiler has that yet.

If we could see project-only performance, it would be much easier to tune the project efficiently; above all, we wouldn't need to re-check the project on different devices over and over, and could get a stable, consistent graph.

So I've requested that feature from Unity (though it may be hard to build a profiler that shows only the project's CPU performance).

I don't see how that would be possible, since it's the OS that divides up the slice of time each active application gets, and I very much doubt that information is exposed by the OS. All Unity is doing is measuring the time it takes to enter/leave various methods; the fact that those take longer because multiple applications each get a small portion of the available time is not something Unity can be aware of.

Hmm, come to think of it, yeah, I agree.
It's probably impossible for Unity's profiler to divide out the OS's per-process time; all it can do is show us the totals, not classify them.

It's a hard problem, but since Task Manager can show CPU usage classified per program, maybe there is some way to isolate just the project's share. I honestly don't know how,
but hopefully some clever people can make it happen.

Guys, if you check my comment earlier, this issue happens even on Android devices; the game's FPS drops from 60 to 15...

I don't think this has to do with the OS or CPU. It was the same project, only ported to a newer version of Unity, and then, behold, a performance drop on the same settings. The only way to avoid it on mobile is to set the game to ugly mode (i.e. a vertex-lit camera).

While I'm not totally convinced there isn't an existing problem that Gfx.WaitForPresent is illustrating, in and of itself it is not causing performance problems. All the information we have been given indicates that it is the amount of time the CPU spends not working, waiting instead for the GPU to become ready to accept the next frame's data. This makes perfect sense in most cases where you are GPU bound rather than CPU bound, but there are edge cases where it seems a bit weird.

You mention dropping from 60 to 15 fps, but are you talking about tests on the same device? I.e. in one version of Unity it ran at 60fps on your Android device, but in another version it only runs at 15fps? It almost sounds as if you are comparing performance between running in the editor and running on a mobile device, which will obviously be very different.

Indeed, stating that you have to drop the quality settings down dramatically suggests to me that the device in question is GPU bound, and thus you would see WaitForPresent because the GPU is unable to render a frame in the time it takes the CPU to finish one. In your case my first few checks would be disabling v-sync (since 15fps is an even divisor of a 60Hz refresh rate), and disabling potentially costly effects such as AA/MSAA. Though it might be worth checking the GPU in the profiler first to see if that gives any hints as to where the bottleneck is.

Ultimately, though, I'll repeat what I've said before. If you have a situation where the WaitForPresent duration is excessive, weird, or plainly makes no sense, then report a bug and send your project to Unity. It's the only way UT are going to be able to determine if there is an underlying issue that WaitForPresent is showing up.

Without hard stats (a screenshot of the profiler's CPU + GPU views, ideally running as a standalone build linked to the profiler) it's impossible to say. A few points:

1. Does Unity actually support SLI?
2. Rendering to the Oculus obviously increases demand on the GPU, potentially doubling the number of objects being drawn, increasing commands sent from CPU to GPU, etc.
3. You mention not using 'direct to rift' mode and problems with v-sync. It is my understanding from playing other games on the DK2 that in non-direct mode the native refresh rate of your monitor will override the Rift's refresh rate. Indeed, after my main monitor's PSU blew and I had to switch to another monitor, I could no longer run at 75Hz for Elite Dangerous, so when playing that game I had to disable my monitor to avoid judder.

Saying WaitForPresent takes 85% of frame time is meaningless on its own. Firstly, it's not actually consuming time on your CPU (as far as I understand it); it's just showing the period during which the CPU is no longer working and is waiting purely for the GPU to catch up (e.g. to present the latest frame, though since GPUs usually render a few or more frames ahead, it's more convoluted than that). So really you shouldn't be looking at the CPU stats but at the GPU stats to get an idea of whether something is wrong.

If you still believe there is a problem, submit a bug to Unity. Talking about it is great for general awareness, but it's not going to get UT to fix any issues.

You mention dropping from 60 to 15 fps, but are you talking about tests on the same device?


Of course I meant it was the same mobile device!
It was a Galaxy Nexus: same device, same project, a direct drop in performance.
And I am very aware of the limitations of mobile devices, avoiding stuff that is commonly known to be performance hungry (complex shaders, heavy scripts, etc.); that is why I was able to achieve around 50 to 55 fps in my old project, maintaining a good balance between high fps and good graphics.
But this issue kept appearing in the profiler, along with the drop in fps on mobile, down to 15fps. The only way to make it go away was to use the lowest options in all settings.

In that case log it as a bug, as it sounds like you have a good reproducible project for the problem. Explain the problem in detail, and take your time doing so, as honestly I'm still unsure what the problem is from your description. It sounds as if performance degrades over time and you have to toggle quality settings to get it back. It's important, as the more assistance you can give Unity QA, the more likely they can find and fix the problem.

I'm having this same issue, publishing to a Note 4. The game runs smooth as silk for a while but eventually, sure enough, drops from 60+ fps to 15 or so, and WaitForPresent dominates the profiler. My game is not really GFX intensive. What's odd is that this happens regardless of any interaction with the game: I could just leave it for 20 mins, come back to it, and the frame rate has dropped.

I spent the better part of a week trying to get perfectly smooth motion for my main character. Aside from the movement implementation details (Physics vs Translates, Lerps vs Sinusoids, and so on) I was always experiencing the dreaded hiccup, which seemed to be in perfect sync with frames that had high WaitForPresent.

For me what fixed the issue was a combination of

Application.targetFrameRate = 60;

Quality Settings -> Shadows -> Disable

V Sync Count = 0 (Don't Sync)

I haven't spent much time testing after it got fixed so I can't really be sure what made an actual difference, but it might provide some places to search if you're feeling desperate. If I get some spare time I'll see if I can sum up my findings / compare differences.
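
For reference, the three changes above can all be applied from one startup script; a sketch under the assumption that shadows are disabled per-light rather than through Quality Settings (class name is mine, the API calls are standard Unity):

```csharp
using UnityEngine;

// Sketch of the three changes combined. Disabling shadows per-light here
// mirrors the Quality Settings -> Shadows -> Disable step; adjust to taste.
public class FrameRateFix : MonoBehaviour
{
    void Awake()
    {
        QualitySettings.vSyncCount = 0;       // V Sync Count = 0 (Don't Sync)
        Application.targetFrameRate = 60;     // cap the frame rate ourselves instead

        // Turn off real-time shadows on every light in the scene.
        foreach (Light l in FindObjectsOfType<Light>())
            l.shadows = LightShadows.None;
    }
}
```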

I mean, if it's shadows and materials that are the issue I guess I have some more optimization to do. But it's really not a GFX intensive game, and if that were the case why would the issue not present itself on the M8?

Well, sure enough it was shadows. Running at 100-200fps now. I have NO idea why real time shadows on literally like 5 cubes would kill the GPU, but then I know nothing about shader and GPU architecture!

I'm running on a solid smartphone (2GB RAM, quad-core @1.3GHz, Mali-400 GPU) which should be able to handle a game with one cube and a player. Well, when the player has a shadow, it can't.

I am fully aware it's the typical case of "you're doing it wrong", but until I figure it out I can at least build the rest of my game systems at a constant rate of 50-ish fps.

Though I might add that even with shadows turned off and everything optimized for mobile, the frame rate does drop from 100-110 to 65-85 after a while. This is probably that 'thermal throttling', right?

I was getting this problem as well, but I found what the issue is on my end. It's 2 things for me: the Scene tab and Free Aspect Resolution in the Editor. I was getting 120 - 150 fps from the Stats readout, and I can predictably get 800 fps now every time. Here's what I had to do:

Note - Simply closing the Scene tab did not work! I think Unity is still drawing that window even though it isn't visible. Setting it to Wireframe did the trick for me, but maybe a different mode might be even faster.

1) Set the Scene tab display mode to Wireframe view.
2) If you are like me, you had the Scene tab and the Game tab docked next to each other. Undock the Game tab so it is a free-floating window and scale it up so it covers the Scene tab window completely.
3) In the upper-left portion of the Game view, click the Free Aspect down arrow to show more options. At the bottom of that window there is a (+) where you can enter a custom resolution. Set it to Fixed and choose a resolution your monitor supports AND that will fit inside the Game view window. I have a 4K monitor, so I can easily have a 1920x1080 frame in view. I believe it's important to pick a resolution that isn't being dynamically scaled.

Once I did this I got all my FPS back. Gfx.WaitForPresent is still in the profiler list, but it's at the bottom where it should be, at 0%.


Does this look like your set-up? Because the issue still exists for me.

I think I may have solved my issue. I went into my GPU's settings (Nvidia 765m), and cranked all the settings up to maximum performance. Gfx.WaitForPresent is gone, and my framerate is through the roof.

WaitForPresent, which should be the Device.Present as suggested, is actually a good thing... it means you are GPU bound, and the CPU at a given point (usually 3 frames ahead) must wait for the GPU to finish. (It also could mean that you need to optimize your rendering.)


Everyone seemed to ignore this answer.

It explains why turning off VSync sometimes fixes it: Because the CPU was waiting for the GPU.
It explains why turning off VSync sometimes doesn't fix it: Because the GPU might be still taking longer than the CPU.
It explains why random stuff like disabling shadows sometimes fixes it: Because it reduces the amount of work for the GPU.

Some people seem to have understood that WaitForPresent on the CPU indicates a GPU performance problem, but others seem to have totally missed it.
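
A quick way to act on this: since WaitForPresent on the CPU usually means the GPU is the bottleneck, dropping the render resolution and watching whether frame times fall is a cheap test. A sketch (the keybinding and halving factor are arbitrary choices of mine):

```csharp
using UnityEngine;

// Rough GPU-bound test: halve the screen resolution at runtime.
// If frame times improve noticeably afterwards, the GPU (not the CPU) is
// the bottleneck, and WaitForPresent is just reporting the wait for it.
public class GpuBoundTest : MonoBehaviour
{
    void Update()
    {
        if (Input.GetKeyDown(KeyCode.F9))
        {
            Screen.SetResolution(Screen.width / 2, Screen.height / 2, Screen.fullScreen);
        }
    }
}
```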

Then how can it be fixed? Gfx.WaitForPresent is taking up 90% for me in a new scene with a few cubes and lights. Disabling VSync and disabling shadows both have no effect. I'm getting frame-rate stuttering, and it wasn't an issue in 4.6.
