
Update: You are correct in saying it's not the size of the data or the connection speed. It rang a bell in my foggy memory and I remembered where I had seen it: Slow in the Application, Fast in SSMS? (An Issue With Linked Servers). It's not parameter sniffing; it is the statistics themselves that are missing (due to permissions), causing a bad query plan to be used:

You can see that the estimates are different. When I ran as sysadmin,
the estimate was 1 row, which is a correct number, since there are no
Orders in Northwind where the order ID exceeds 20000. But when I ran
as a plain user, the estimate was 249 rows. We recognize this
particular number as 30% of 830 orders, or the estimate for an
inequality operation when the optimizer has no information.
Previously, this was due to an unknown variable value, but in this
case there is no variable that can be unknown. No, it is the
statistics themselves that are missing.
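The 249-row figure is easy to reproduce: with no usable statistics, the optimizer falls back to a fixed 30% selectivity guess for an inequality predicate, applied to the table's 830-row cardinality.

```sql
-- Default inequality selectivity with no statistics is 30% of the
-- table's cardinality: 0.30 * 830 rows = 249 rows.
SELECT CAST(830 * 0.30 AS int) AS estimated_rows;  -- 249
```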

As long as a query accesses only tables in the local server, the
optimizer can always access the statistics for all tables in the
query; there are no extra permission checks. But this is different
with tables on a linked server. When SQL Server accesses a linked
server, there is no secret protocol that is only used for inter-server
communication. No, instead SQL Server uses the standard OLE DB
interface for linked servers, be it other SQL Server instances, Oracle,
text files or your home-brewed data source, and connects just like any
other user. Exactly how statistics are retrieved depends on the data
source and the OLE DB provider in question. In this case, the provider
is SQL Server Native Client which retrieves the statistics in two
steps. (You can see this by running Profiler against the remote
server). First the provider runs the procedure
sp_table_statistics2_rowset which returns information about which
column statistics there are, as well as their cardinality and their
density information. In the second step, the provider runs DBCC
SHOW_STATISTICS, a command that returns the full distribution
statistics. (We will look closer at this command later in this
article.) Here is the catch: to run DBCC SHOW_STATISTICS, you must be
a member of the server role sysadmin or of one of the database roles
db_owner or db_ddladmin.

And this is why I got different results. When running as sysadmin I
got the full distribution statistics which indicated that there are no
rows with order ID > 20000, and the estimate was one row. (Recall that
the optimizer never assumes zero rows from statistics.) But when
running as the plain user, DBCC SHOW_STATISTICS failed with a
permission error. This error was not propagated, but instead the
optimizer accepted that there were no statistics and used default
assumptions. Since it did get cardinality information, it learnt that
the remote table has 830 rows, whence the estimate of 249 rows.
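One way to confirm this on your own system is to run the same DBCC command the provider issues, connected to the remote server as the application's login (the table and statistics names here are illustrative):

```sql
-- On the remote server, connected as the login the linked server uses:
DBCC SHOW_STATISTICS ('dbo.Orders', 'PK_Orders');
-- Before SQL Server 2012 SP1, this requires sysadmin, db_owner or
-- db_ddladmin; a plain user gets a permission error, and the
-- linked-server provider silently falls back to default estimates.
```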

Whenever you encounter a performance problem where a query that
includes access to a linked server is slow in the application, but it
runs fast when you test it from SSMS, you should always investigate if
insufficient permissions on the remote database could be the cause.
(Keep in mind that the access to the linked server may not be overt in
the query, but could be hidden in a view.) If you determine that
permissions on the remote database are the problem, what actions could
you take?

You can add the users to the role db_ddladmin, but since this gives them the right to add and drop tables, it is not recommended.
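For reference, granting that role membership on the remote database looks like this (the login name is illustrative):

```sql
-- Remote database: grants statistics access, but also DDL rights,
-- which is why this option is discouraged.
USE Northwind;
EXEC sp_addrolemember 'db_ddladmin', 'app_user';
-- On SQL Server 2012 and later you could write instead:
-- ALTER ROLE db_ddladmin ADD MEMBER app_user;
```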

By default, when users connect to a remote server they connect as themselves, but you can set up a login mapping with
sp_addlinkedsrvlogin, so that users map to a proxy account that has
membership in db_ddladmin. Note that this proxy account must be an SQL
login, so this is not an option if the remote server does not have SQL
authentication enabled. This solution too is somewhat dubious from a
security perspective, although it is better than the previous suggestion.
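A mapping of that kind could be set up on the local server like this (the server name, logins and password are placeholders):

```sql
-- Map a specific local login to a SQL login on the remote server
-- that is a member of db_ddladmin there.
EXEC sp_addlinkedsrvlogin
     @rmtsrvname  = N'REMOTESRV',       -- name of the linked server
     @useself     = N'False',           -- do not connect as yourself
     @locallogin  = N'DOMAIN\AppUser',  -- local login to map
     @rmtuser     = N'stats_proxy',     -- SQL login on the remote side
     @rmtpassword = N'<strong password>';
```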

In some cases you can rewrite the query with OPENQUERY to force evaluation on the remote server. This can be particularly useful if
the query includes several remote tables. (But it can also backfire,
because the optimizer now gets even less statistics information from
the remote server.)
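An OPENQUERY rewrite of the earlier example might look like this (names are illustrative); the query text is sent as-is and evaluated entirely on the remote server:

```sql
SELECT *
FROM OPENQUERY(REMOTESRV,
    'SELECT OrderID, OrderDate
     FROM   Northwind.dbo.Orders
     WHERE  OrderID > 20000');
-- Note: OPENQUERY takes a literal string, so you cannot pass
-- parameters into it directly; the query must be built as text.
```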

You could of course use the full battery of hints and plan guides to get the plan you want.
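For instance, T-SQL's REMOTE join hint forces the join to be performed on the right-hand (remote) server, which can help when the local input is small; the table and column names below are illustrative:

```sql
SELECT a.AccountID, o.OrderDate
FROM   dbo.Accounts AS a
INNER REMOTE JOIN REMOTESRV.Northwind.dbo.Orders AS o
       ON o.CustomerID = a.CustomerID;
-- REMOTE asks SQL Server to perform the join on the remote table's
-- server rather than pulling all remote rows across and joining locally.
```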

Finally, you should ask yourself whether that linked-server access is needed. Maybe the databases could be on the same server? Could data
be replicated? Some other solution?

It returns about 140k records, but since it works just fine when the date value is hardcoded, I can't think of an I/O or network issue that would affect the parameterized version so extremely. My gut says the query is being passed to the remote server and the remote optimizer somehow chooses a bad query plan when it can't make sense of the parameter. But reindexing and purging the cache/buffers should correct that (I assume). I'll look into timeouts to see if we can make it at least return. Thanks
– Bob Probst, Jan 21 '12 at 2:50

Excellent answer, and it explained exactly the problem I was having, thanks. I would add that, according to MSDN, from SQL Server 2012 SP1 onwards users with SELECT permission will be able to access DBCC SHOW_STATISTICS, which will improve read-only performance on linked servers without having to compromise security.
– Steve Pettifer, May 28 '14 at 8:22

I have two ideas that might help. I'm also going to tell you that I've had mixed to bad luck with performance running queries against linked servers. So my first recommendation is to avoid it if you can.

My first idea is to install the sproc onto the SQL Server 2000 box, having it reference the local server. You can then execute the sproc remotely.

exec linkedserver.database.dbo.Get_Accounts

If you can go this route, it should improve performance tremendously.
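Note that calling a procedure through a linked server only works if the `rpc out` server option is enabled for it on the calling server; the linked-server name below follows the example above:

```sql
-- On the calling server, allow outgoing remote procedure calls
-- to the linked server.
EXEC sp_serveroption @server   = N'linkedserver',
                     @optname  = 'rpc out',
                     @optvalue = 'true';

EXEC linkedserver.database.dbo.Get_Accounts;
```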

My second idea is to get at the estimated query plan when running the stored proc. Is it showing you what is taking so much time? One potential problem is that the account you are using on the linked server might not have enough authority to get at the table statistics (you need more authority for the linked server than you do for the local server). And that can make queries unbelievably slow. You can read more about that particular problem here.

I suspect in your case above SQL Server is just pulling the whole table from the remote server then running the query locally (I have seen this happen many times in the past). I prefer to be explicit (either by using OPENQUERY or by creating a SP on the remote server) so there is no chance of confusion.

As this is a resource issue, the memory pool outside SQL Server that is used to load external drivers and the CLR might be close to its limit. The default is 256 MB. To get around this, I suggest you go to SQL Server Configuration Manager, open the advanced tab, and add the -g option to the end of the startup parameters, e.g. -g1024, then restart the SQL Server service. I usually do this, as we use a high number of linked servers.
http://msdn.microsoft.com/en-us/library/ms190737.aspx