SQL Server Performance Tuning for Stored Procedures

When a stored procedure is first executed (and it does not have the WITH RECOMPILE option), it is optimized and its query plan is compiled and cached in SQL Server's procedure cache. If the same stored procedure is called again, SQL Server will use the cached query plan instead of creating a new one, often saving time and boosting performance. This may or may not be what you want.

If the query in the stored procedure is exactly the same each time, and the query plan is the same each time, then this reuse is a good thing. But if the query within the stored procedure is dynamic (for example, the WHERE clause changes from one execution to the next), reuse may not be a good thing: running a cached query plan against what is essentially a new query means the query may not be properly optimized, and its performance can suffer greatly.

If you know that your query's plan will vary each time the stored procedure is run, add the WITH RECOMPILE option when you create the stored procedure. This forces the stored procedure to be recompiled on every execution, ensuring the query is optimized with the correct plan each time it runs. Yes, this circumvents the reuse of cached query plans, hurting performance a little, but that is more desirable than reusing incorrect plans. [6.5, 7.0, 2000, 2005] Updated 12-6-2005
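For example, a procedure whose ideal plan varies widely with its input might be created like this (the procedure and table names here are illustrative, not from the original article):

```sql
-- Illustrative only: force a fresh plan on every execution.
CREATE PROCEDURE dbo.spGetOrders
    @Status INT
WITH RECOMPILE
AS
SELECT * FROM dbo.Orders WHERE Status = @Status
GO
```

On SQL Server 2005 you can also add OPTION (RECOMPILE) to an individual statement instead, which recompiles just that statement rather than the whole procedure.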

*****

Many stored procedures accept multiple parameters. This in and of itself is not a bad thing. But what can often cause problems is when the parameters are optional and the number of parameters varies from one execution of the stored procedure to the next. There are two ways to handle this problem: the slow-performance way and the fast-performance way.

If you want to save development time, but don't care about your application's performance, you can write the stored procedure generically so that it doesn't care which parameters it gets. The problem with this method is that you may end up unnecessarily joining tables that don't need to be joined for the particular combination of parameters submitted in any single execution of the stored procedure.
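The generic approach typically looks something like this (an illustrative sketch; the table and parameter names are invented):

```sql
-- Illustrative "catch-all" procedure: every parameter is optional,
-- so one query (and one plan) must cover all combinations.
CREATE PROCEDURE dbo.spSearchOrders
    @CustomerID INT = NULL,
    @Status     INT = NULL
AS
SELECT o.*
FROM dbo.Orders o
WHERE (@CustomerID IS NULL OR o.CustomerID = @CustomerID)
  AND (@Status     IS NULL OR o.Status     = @Status)
GO
```

Convenient to write, but the optimizer cannot produce a plan that is efficient for every combination of supplied parameters.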

Another, much better-performing way, although it will take you more time to code, is to include IF…ELSE logic in your stored procedure and create a separate query for each possible combination of parameters that may be submitted. This way, you can be sure your query is as efficient as possible each time it runs. [6.5, 7.0, 2000, 2005] Updated 12-6-2005
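Reworking the hypothetical catch-all procedure above in this style, each branch gets a query tailored to exactly the parameters supplied (again, the names are illustrative):

```sql
-- Illustrative rewrite: one tailored query per parameter combination.
CREATE PROCEDURE dbo.spSearchOrders
    @CustomerID INT = NULL,
    @Status     INT = NULL
AS
IF @CustomerID IS NOT NULL AND @Status IS NOT NULL
    SELECT * FROM dbo.Orders
    WHERE CustomerID = @CustomerID AND Status = @Status
ELSE IF @CustomerID IS NOT NULL
    SELECT * FROM dbo.Orders WHERE CustomerID = @CustomerID
ELSE IF @Status IS NOT NULL
    SELECT * FROM dbo.Orders WHERE Status = @Status
ELSE
    SELECT * FROM dbo.Orders
GO
```

Note that branching inside a single procedure still leaves you exposed to the plan-caching problem described in the next tip, which is exactly what the delegator pattern below addresses.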

*****

Here’s a way to handle the problem of not knowing which parameters your stored procedure might face. The root of the problem is the query plan that SQL Server pre-compiles for the stored procedure. As you know, one of the biggest reasons to use stored procedures instead of ad-hoc queries is the performance gained from this pre-compilation. The catch is that SQL Server will only generate a query plan for the path taken through your stored procedure when you first call it, not for all possible paths.

Let me illustrate this with an example. Consider the following procedure (pre-compilation doesn’t have a huge effect on the simple queries used here, but they serve to illustrate the point):
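A minimal version of spTest consistent with the description that follows (the authors and publishers tables are from the pubs sample database) might look like this:

```sql
-- Branches to one of two queries based on the parameter.
CREATE PROCEDURE spTest
    @query BIT
AS
IF @query = 0
    SELECT * FROM authors
ELSE
    SELECT * FROM publishers
GO
```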

Suppose I make my first call to this procedure with the @query parameter set to 0. The query-plan that SQL Server will generate will be optimized for the first query (“SELECT * FROM authors”), because the path followed on the first call will result in that query being executed.

Now, if I next call the stored procedure with @query set to 1, the query plan that SQL Server has in memory will not be of any use in executing the second query, since the query-plan is optimized for the authors table, not the publishers table. Result: SQL Server will have to compile a new query plan, the one needed for the second query. If I next call the procedure with @query set to 0 again, the whole path will have to be followed from the start again, since only one query-plan will be kept in memory for each stored procedure. This will result in sub-optimal performance.

As it happens, I have a solution, one that I’ve used a lot with success. It involves the creation of what I like to call a ‘delegator’. Consider again spTest. I propose to rewrite it like this:
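A reconstruction of the rewritten procedures, consistent with the description that follows:

```sql
-- Each single-query procedure keeps its own optimized plan.
CREATE PROCEDURE spTestFromAuthors
AS
SELECT * FROM authors
GO

CREATE PROCEDURE spTestFromPublishers
AS
SELECT * FROM publishers
GO

-- The delegator holds no queries itself; it only routes the call.
CREATE PROCEDURE spTest
    @query BIT
AS
IF @query = 0
    EXEC spTestFromAuthors
ELSE
    EXEC spTestFromPublishers
GO
```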

The result of this restructuring is that there will always be an optimized query plan for spTestFromAuthors and spTestFromPublishers, since each holds only one query. The only procedure getting re-compiled over and over again is the delegator, but since it doesn’t actually hold any queries, that won’t have a noticeable effect on execution time. Of course, re-compiling a plan for a simple ‘SELECT *’ from a single table won’t give you a noticeable delay either (in fact, the overhead of the extra stored procedure call may be bigger than the re-compilation of “SELECT * FROM AnyTable”), but as soon as the queries get bigger, this method certainly pays off.

The only downside to this method is that you now have to manage three stored procedures instead of one. This is not much of a problem, though, as the different stored procedures can be considered one single ‘system’, so it is logical to keep all of them together in the same script, which is just as easy to edit as a single stored procedure would be. As far as security is concerned, this method shouldn’t give you any extra headaches either: since the delegator is the only stored procedure called directly by the client, it is the only one you need to manage permissions on. The rest will only be called by the delegator, which will always work as long as those stored procedures are owned by the same user as the delegator.

I’ve had great success using this technique. Recently I developed a (partial full-text) search engine for our reports database, which resulted in a stored procedure that originally ran for about 20 seconds. After employing the above technique, the stored procedure took only about 2 seconds to run, a ten-fold improvement in performance! [6.5, 7.0, 2000, 2005] Contributed by Jeremy van Dijk. Updated 6-21-2004