Troubleshooting Tips for IntelliSense Slowness

Hi, my name is Andy Rich and I’m a QA on the C++ front-end compiler. The IntelliSense system in Visual Studio 2010 comes with far greater power, flexibility, and accuracy, but these improvements come at the cost of greater complexity. The goal of this article is to assist you in troubleshooting this complex system, and give you a peek under the hood at how it works (and what to do when it doesn’t).

The problem is usually PCHs

Having spent a lot of time helping customers with slow IntelliSense, I have found that their performance issues are almost always related to PCH being disabled. For large C++ translation units (and most of the ones you care about will be large), IntelliSense PCH is vital to ensuring fast IntelliSense. Getting your PCH settings right is also vital to having fast builds – so getting this right can potentially be a boon on two fronts. I have previously written a blog post on the PCH model and how to configure it within the IDE: http://blogs.msdn.com/vcblog/archive/2010/01/26/precompiled-header-files-in-visual-studio-2010.aspx. This post focuses on what to do when you’ve followed those steps and things still aren’t working for you.

Start with the error window

In VS2010 RTM, errors in your PCH will prevent the IntelliSense compiler from creating a PCH. This is something we have addressed in SP1, but even still, the error window can be a good place to start investigating performance issues.

One of the new features in VS2010 is “red squiggles” for C++ – these diagnostics are provided by the IntelliSense compiler, and they are also surfaced in the Error List window. If this window is not visible, you can bring it up using View->Error List or the hotkey chord “Ctrl+\, E.” Look specifically for errors in header files, starting at the top of the Error List. With VS2010 RTM, any errors (even ones the compiler can typically recover from) will prevent your PCH from being built and cause severe IntelliSense slowness. (This is addressed in SP1, which I discuss in a later section.)

The IntelliSense compiler is not the build compiler

It is important, here, to call out that the IntelliSense compiler is different from the build compiler. We have made every effort to give these two compilers parity. (For more information about how this works with C++/CLI please check this blog post.)

However, there are still differences, and occasionally a file that compiles without error using our build compiler will not compile properly with our IntelliSense compiler. Often, this is because the IntelliSense compiler has a stricter interpretation of the C++ standard than the build compiler. In these cases, you can usually work around the problem by fixing the error reported by the IntelliSense compiler. (In most cases, the build compiler will happily accept the more conformant code required by the IntelliSense compiler.)

Additionally, if you are targeting an architecture other than x86, you may notice that the IntelliSense compiler is always operating in x86 mode. This can produce errors that are very difficult to work around, and while these errors will not prevent you from working with most code, they can cause PCH generation to fail as mentioned above.

If you are unable to find a code workaround for your problems, there is one further stopgap measure that can help: the compiler macro __INTELLISENSE__, which is only defined when using the IntelliSense compiler. You can use this macro to guard code the IntelliSense compiler does not understand, or use it to toggle between the build and IntelliSense compiler.

Context is important

This is a good opportunity to discuss context in our IntelliSense engine. The IntelliSense engine provides accurate results by always having as correct a view as possible of the source file being compiled. This is fairly straightforward in the case of .cpp files: these are natively compiled and understood by the compiler. However, the situation is less clear for .h files, as these files are compiled only in the context of an associated .cpp file.

In previous releases of Visual C++, header files were only parsed by the IntelliSense parser and included in the NCB once, based on the single context they were compiled in. An older post by Jim Springfield discusses this so-called “multi-mod” problem in greater detail. We address this issue in Visual C++ 2010 by having all header files compiled in the context of your current .cpp file, so that this highly contextual information can be more accurate.

However, what is the proper recourse for the IntelliSense engine when an .h file is active in the editor? It cannot compile the .h file by itself – this would not be the correct context. The .h file is almost certainly included by multiple .cpp files – which one should be compiled to get the proper context for the .h file?

In Visual C++ 2010, we introduced a bit of technology called the include graph. This allows us to know, for an .h file, all of the .cpp files that have included that .h file, either directly or indirectly. This gives us all of the possible contexts for the .h file, but we still have very little idea which .cpp file is the one you want.

Ideally, this is something that would be configurable by the user, but this seems heavyweight for IntelliSense. What we settled on was looking through your most recently used .cpp files (controlled by the “TU cache” setting in Tools->Options->Text Editor->C/C++->Advanced) and seeing if any of those were reported by the include graph as being a valid context for your .h file. If so, we use that context. If no such context is available, we must fall back on choosing an arbitrary context for the .h file.

Verify the PCH is being built

Let’s get back to diagnosing IntelliSense issues. Assuming your header files are free of IntelliSense errors, we should look into verifying that your PCH is being built. The most foolproof way of accomplishing this is to actually look on your hard drive for the iPCH file to ensure it is being built.

Browse to your solution directory and find the “ipch” directory underneath it. In there, you should find one directory per project, and within those directories, the iPCH files themselves. Looking at the timestamps of the files can be informative, but for me, proof positive is to delete the iPCH files (you’ll probably need to close your solution first) and ensure they are recreated when IntelliSense is executed on the .cpp file in question.

If you aren’t seeing the iPCH file being generated, this is a good time to go back and review the PCH options blog post and ensure your settings are really configured correctly.

Unless you’re using Makefile projects

One huge caveat is in the case of makefile projects. By and large, the settings in your makefile project are opaque to the Visual Studio project system, and therefore, by extension, to the IntelliSense system. In these cases, your include directories may not be correct, macros may not be defined, and any compiler switches you are using in your makefile (including those that control PCH!) will not be in effect.

For these cases, we have added an extra configuration section to makefile projects. Right-click your project, choose Properties, and go to Configuration Properties->NMake. The “IntelliSense” subsection in here is for options that are specific to your IntelliSense compiler. These options will be passed ONLY to the IntelliSense compiler, and should be of the same format that you would pass to the build compiler. You should ideally set these according to the same options used in your makefile. In particular, preprocessor definitions, include search path, and forced includes are important to have right. For our purposes, of course, you should also have your PCH options included in the “Additional Options” section.

As a quick and dirty workaround for PCH, you can often just specify “/Yu” with no parameters, and the IntelliSense engine will create a default PCH for you. But in the long run, you will have better overall performance and fewer issues if you mirror your build system’s PCH settings here.
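For instance, for a makefile that builds with a precompiled header through stdafx.h, the NMake IntelliSense section might be filled in along these lines (all values here are hypothetical placeholders – use whatever your own makefile actually passes to the compiler):

```
Preprocessor Definitions      WIN32;NDEBUG;_CONSOLE
Include Search Path           ..\include;..\third_party\boost
Forced Includes               stdafx.h
Additional Options            /Yustdafx.h
```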

Goto-Definition is a very special case

Goto-Definition (GTD) is one of the most complicated operations performed by our IntelliSense engine, and one of the most common to suffer IntelliSense slowdown. The big issue with Goto-Definition is that, typically, the definition of the function is not contained in the translation unit currently being parsed. The declaration is naturally required by the compiler – the prototype in your .h file – but the .cpp file that provides the implementation of this prototype is often not in your current TU; often, it isn’t even in your current project! (And in some cases, it is buried in a static lib or DLL, and no actual code for the definition is possible.)

At a high level, Goto-Definition is implemented like this:

1. Generate a qualified name for this type (this requires a QuickInfo request at the GTD source point).

2. Search the browse database for all definitions that could match this qualified name.

3. For each matching definition found, perform a QuickInfo operation to see if the target qualified name matches the source.

4. If you find a matching definition, stop (don’t keep processing the list).

5. If you never find a matching definition, show all of the candidates from step 2.

The operation that tends to take a long time is step 3. In a previous blog, I discussed how our preparse model can negatively impact performance. Step 1 is typically not a problem because we have nearly always already generated a preparse for your current source file. In Step 3, however, we are going to a new, unrelated file; and this file typically does not have a preparse generated. The gating factor on the speed of these operations is the preparse, and the only good way to speed the preparse up is with PCH. So getting your PCH working (as mentioned above) is probably the most important thing you can do for performance.

Using Task Manager to pinpoint issues

Sometimes, it can help to pull up Task Manager, as this will provide some insight into which piece of our complicated IntelliSense/browsing system is causing the problem. When you perform a long-running Goto-Definition, you can take a look at which process is consuming CPU cycles. If it is devenv.exe, the problem is more likely in the browsing system (a database query, most likely). This is usually due to some kind of complexity of your solution, and is something we’re interested in finding out more about when you encounter it.

If you find that the process eating up resources is vcpkgsrv.exe, then the problem is in the IntelliSense compiler – and once again, most likely to be a long-running preparse (which is best solved by having PCH turned on and working).

Rich Logging options in Visual C++ 2010

Visual C++ 2010 has some additional logging options which can help pinpoint problems. To turn this logging on, go to Tools->Options->Text Editor->C/C++->Advanced and change the options under “Diagnostic Logging”. To enable full logging of all events (which can be a LOT of logging data), set Enable Logging to True, Logging Level to 5, and Logging Filter to 255, as in the screenshot below.

You can view the output of this logging in the Output window (you may need to change the “Show output from:” dropdown to “Visual C++ Log”). Briefly, I will go through what the log looks like for a typical QuickInfo operation.

The first thing to understand about IntelliSense operations is the “>>” marker, which indicates the IntelliSense engine has placed an item on the worker queue. For QuickInfo, the work item is called “CQuickInfoWorkItem”. In this scenario, you can see it took 1ms for the QuickInfo work item to be created and queued, and an additional 3ms for the item to be pulled off the queue, processed, and the result returned. (This was nearly instantaneous because a preparse for this translation unit had already been built.)

The most helpful part of an IntelliSense log is usually looking at the translation unit the IntelliSense compiler has chosen to satisfy your IntelliSense request. If the IntelliSense compiler is loading this TU for the first time, you will also get output that indicates the command-line options this file is being compiled with, which can sometimes be helpful in diagnosing problems, especially the /Fp parameter (which indicates where the ipch for this file is located).

Note that, in the case of Goto-Definition, because the compiler will probably need to compile multiple translation units in order to provide an answer, you may see multiple “Translation unit:” info statements for a single operation. (Also, if you have red squiggles on, you will see an additional workitem fired off as a result of navigating to a new source file, in order to check for compilation errors.)

Performance Mitigations in Visual Studio 2010 SP1

We have added three improvements/mitigations in Visual Studio 2010 SP1 that are designed to provide an improved IntelliSense experience. These are:

Improved database queries resulting in better class view performance

Long-running Goto-Definition operations now have a cancel dialog

IntelliSense PCH will be created even if there are errors in the header

Of these improvements, the third should give the most immediate benefit to our users. Previously, we would only create an iPCH file if the PCH header compiled without any errors. In some scenarios (especially with non-x86 code) it was impossible to get the PCH header completely error-free, resulting in very poor IntelliSense performance, as the preparse could not take advantage of PCH speedups. (This was most noticeable during Goto-Definition, when it was more likely that the ‘target’ TU did not previously have a preparse created.)

With this feature, we have added a few special diagnostics in cases where we were still unable to generate an iPCH. These are mostly the result of project misconfigurations, missing files, and other such catastrophic errors. The text of the error should say something like “An IntelliSense PCH file was not generated.” If you see these diagnostics in SP1, it is almost certain you will suffer poor IntelliSense performance until the error in the diagnostic is resolved.

Getting additional help

I am always interested in hearing specific feedback about poor IntelliSense performance. My hope is that with the additional diagnostic information provided in this blog post, you can help us pinpoint the performance issues you are having, which component the issues are coming from, and get closer to the actual root cause of the issues. Supplying this additional information (as much as is available) in your connect bug will help us to understand and address these problems.

We all remember that deleting the old .ncb file could fix problems. I have found something similar in VS2010, and I think the following tip might be useful to others.

Sometimes closing a single window would take minutes or even cause a freeze – the IDE would go white and unresponsive, and killing and restarting the IDE would not help. I found the solution was to set the IDE to write browsing and IntelliSense files to a "Fallback location" directory and then to delete that folder when this problem happens. I found this folder would grow to many gigabytes for my solution. No wonder the IDE is slow if it has to read all that.

I think this mostly happens when upgrading to new versions of big third-party libraries. We link against lots of big third-party libraries like boost and upgrade regularly. The new library has to be parsed by IntelliSense, but I think the old database also lingers on. Deleting the parsing files solves this. Also, even if you get rid of the old parsing database, the new library will not be parsed unless you do a "rescan solution". It took me a while to figure that out.

It seems with today's computers it's no longer memory or the CPU that is the bottleneck when it comes to compiling with VS2010, but the hard drive. Maybe the ncb solution was faster because it was in memory.

Thanks

Asbjørn

29 Mar 2011 12:06 PM

Do you have any suggestions on how to make intellisense non-glacial for projects where I can't set up PCH?

I've tried looking but I can't seem to find any non-intrusive way of enabling PCH, and since this is a cross-platform open source project I can't make non-trivial changes just because my IDE of choice is quirky.

I don't really care about build times, I'd just like intellisense to be a bit more snappy. Is there a solution?

That said, in terms of usability the new IntelliSense in VS2010 is miles beyond the one in VS2008. Now if only it weren't an order of magnitude slower... :)

Lars Viklund

29 Mar 2011 12:23 PM

When you use "x86" in the text, do you just mean x86_32 or x86_64 as well?

Purging old incorrect entries would be quite welcome, as it's quite easy to get stale data, particularly if you have a habit of copying project directories or changing version control branches a lot in the same directory. There's nothing more confusing than doing a Go To Definition and ending up in a completely different directory tree.

I have resorted to obliterating all IS data whenever I check out another branch or update any of my third party libraries, as it will inevitably do the wrong thing otherwise.

I don't have any problems with Intellisense speed, just its accuracy. It produces squiggles and error list entries all over my code base, even though the thing builds fine; "go to definition" even works on words it's squiggled as errors saying they don't exist.

"In Visual C++ 2010, we introduced a bit of technology called the include graph. This allows us to know, for an .h file, all of the .cpp files that have included that .h file, either directly or indirectly."

Can we access this through the IDE? It would be enormously useful when deciding which headers to include in your PCH.

@Asbjørn is your project set to reference the per-user property sheet? If so, does it work if you set up precompiled header & force include settings there?

Asbjørn

29 Mar 2011 10:12 PM

@Andrew: Thanks for the tip, however I have reasonable control over the solution/project files, it's the source code I can't change too much.

However having to verify that the code compiles without PCH before committing to the repository seems to severely negate the benefits of a faster intellisense. If I set up to only create the PCH file but never use it, will intellisense benefit from it then?

Asbjørn

29 Mar 2011 10:36 PM

Not to spam, but intellisense does indeed seem to benefit from a PCH even if there are no users.

I set up a local my_pch.h file where I included "everything under the sun" and a my_pch.cpp file which includes it, and set it to create the PCH on that file only. No other changes to the project was made. No other files in my project refers to my_pch.h.

Now intellisense is just as fast as in vs2008 and the source code unchanged.

Which makes me wonder, why can't the IDE do this by itself?

PleaseFixYourBugs

30 Mar 2011 1:14 AM

Thanks for the article, however, it doesn't help a whole lot. Here's how your advice applies to our team (we develop desktop apps, 99% of our code is C++), if you are interested:

"Turn on the PCH." - Done years ago in every project. We have been using PCH files since they were invented. IntelliSense is still slow, with performance gradually dropping the longer you are using the same session of Visual Studio.

"Check if the IntelliSense compiler fails to create a PCH." - Sure, checked, it can. Many of us are regularly cleaning the build folders manually specifically to "reset" the build process and check whether or not PCH files and various other things are being rebuilt correctly. No problem.

"Guard code that fails to compile in the IntelliSense compiler with the __INTELLISENSE__ macro." - As I said, our PCH files are being built fine. That said, if they weren't, we *wouldn't* alter our code to please the IntelliSense compiler. The code is complex enough as it is, adding macros for things like IntelliSense is a no-go.

"Tell the IntelliSense what compilation options you are using if you have a Makefile project." - Sure, of course, we are doing that.

"It should be better in SP1." - We don't see this at all. The option to cancel a long GTD operation isn't what we want, we want the operation itself to be faster.

You see? Nothing in the article seems to help.

I understand that you can't do anything regarding the poor performance of IntelliSense on our code without having more knowledge of what kind of code that is, etc, but, frankly, I doubt our code is all that different from what other teams have. Did you try using IntelliSense on the codebase for, say, Office? Yes, there probably is a lot of work involved in redoing the project files, but I think this would be a worthwhile exercise. In our experience, right now, IntelliSense becomes unbearably slow when a project grows out of its infancy. A couple hundred files, and more often than not, IntelliSense starts struggling. A thousand files, and half of the time it is faster to actually look up the definitions of function prototypes manually rather than with IntelliSense. Trying to use IntelliSense on a day to day basis in a large and evolving project like Office is likely the only way to really fix it.

Now, if I had a choice, I would first fix problems with the debugger, but...

Karen

30 Mar 2011 2:15 AM

I think IntelliSense often feels slow because the editor will not autocomplete while it is re-parsing a source file.

e.g. If I add a new function in the header, then switch to the cpp file, the reparse will begin. Until the reparse completes there'd be no IntelliSense. In Visual Assist X, they let you autocomplete with the OLD database while the parsing is in progress. Yes, it's sometimes inaccurate, but being able to complete a function/variable name that existed before the last edit is very, very useful.

Sergejs

30 Mar 2011 3:29 AM

In our project we have some 200+ MB of generated source cpp/h files and we are periodically regenerating them. As a result, sometimes we need to wait 10-20 minutes after VS 2010 starts so it can finish parsing all the files for Intellisense. We are not using PCH.

Is there a way to disable Intellisense for these generated files? Putting their content inside #ifndef __INTELLISENSE__ ... #endif didn't help - Intellisense still parses all the generated files on VS 2010 start-up if I delete the .sdf file.

XAML

30 Mar 2011 6:46 PM

xaml aint cxx but man, it is so slow editing anything beyond a trivial xaml page - aintnosense drags it all waaaaaay down. It needs to be taken out back and shot in the head - twice.

jamome

30 Mar 2011 7:31 PM

I also find the VS2010 UIX sluggish, and noticeably more so than VS2005. I didn't start using 2010 in time for the user satisfaction survey last year, but after plugging away with 2010 now for months, I can safely say 10 is *NOT* the next 6.

Hopefully VS2010 SP2 will ship and improve performance and quality. If not.... I guess I will have to (again) decide if I'm $pending to upgrade to VSNext, or ditching VS for another editor (technology stack). grrrr

Christoph

31 Mar 2011 4:45 AM

On my machine (Core2 Quad, 8GB RAM, Win2008R2), VS2010 SP1, latest patches etc., even a freshly built iPCH for a moderately complex solution (9 projects, iPCH 200MB, devenv.exe needing 200MB RAM with 500MB peak, vcpkgsrv.exe needing 40MB RAM with 250MB peak) would need about 2 minutes (!) for a GTD query. Even subsequent queries for the same symbol would sometimes show the same behaviour after leaving the machine idle for an hour, although mostly not. For most of these 2 minutes, devenv would show very little CPU and memory activity (with vcpkgsrv being idle), and only in the very last few seconds would vcpkgsrv utilize 100% CPU (of one core). A full-blown diagnostic only shows one thing that is suspicious regarding [ClassView] (assuming the numbers are tick counts), although I never use class view (the window is not open):

If true, the GetList2 alone would already take 108 seconds. In this form, it is really next to useless for me and my team will jump for joy when we make the switch to VS2010. GTD (and related) is widely used here. Any ideas? Thanks.

petke

1 Apr 2011 10:29 AM

To add to my first post: it seems deleting the "Fallback Location" folder files was not the fix for the problem where closing all documents hangs the IDE. Deleting the .sdf file that is created in the same folder as your solution is. I don't know why this file is not also redirected to the fallback folder.

Quite frankly, the IDE is embarrassingly buggy. That's a shame, as the compiler is very good. I was the one who drove the company I work at to upgrade to VS2010 instead of 2008. I have apologised for this reckless decision of mine. But there is no going back now.

We have not yet installed SP1. I'm hoping that will solve some of these IDE issues.

PleaseFixYourBugs

2 Apr 2011 12:12 AM

@petke:

Don't get your hopes high with SP1. We have moved back from using straight VS2010 to using just the compilers and using VS2008 as the IDE, but we have, of course, tried VS2010 SP1 to see what changed. We were thoroughly disappointed. Apart from the fix for the Find in Files box that was previously growing a bit every time you opened it, and the offline Help viewer, which is still *worse* than what we had in VS2005 / VS2008, but still better than what there is in plain VS2010, we couldn't find a single problem that was bothering us in plain VS2010 that VS2010 SP1 has fixed. The problems I am talking about have to deal with the debugger (shows wrong values of variables, doesn't show anything, botches the call stack, etc, happens several times a day) and the IDE (slow, slow, slow, very slow, slows progressively the longer you use it, doesn't suggest any values during autocompletion, goes to a wrong place via Go To Definition, very slow again, etc, happens constantly). In the end, since, as I mentioned, some time ago we made a decision to use VS2008 as the IDE and we had some experience living like that for some time, we just couldn't help but notice that VS2010, even with SP1, is still *miles* behind VS2008 as a usable, everyday tool for a C++ developer. So... don't get your hopes high.

Marcello

2 Apr 2011 12:08 PM

I don’t think it’s reasonable to expect performance improvements as long as the IDE is based on managed code.

A product compiled to managed code will always be slower than its native-code counterpart: managed code will never have the performance of native code. Therefore, VS2010 will always look slower than VS2008/VS2005.

The managed code + WPF combination is a metaphor incapable of expressing in full the complexity of a system such as Visual Studio.