Dynamic SQL Performance

I have a query in a stored proc that takes a list of selected IDs from the front end, which acts as a filter. I pass a varchar containing comma-delimited IDs. There should never be more than 100 IDs in the list; when this filter feature is used in the app, usually only 5-25 IDs are selected.

Now my question: is it better practice to parse the string into a temp table and then use the temp table in the query, or to construct the query as a string and execute it dynamically? I am developing on SQL Server 2000, but production is still SQL Server 7.0.

Example:
--parse IDs into #userIDs using CHARINDEX in a loop
select * from tbl_user where userID in (select userID from #userIDs)
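For reference, here is a minimal sketch of what that CHARINDEX parsing loop might look like. The variable names (@idList, @pos, @id) and the sample input are illustrative, not taken from the original proc; the constructs used (CHARINDEX, SUBSTRING, a #temp table) are available on both SQL Server 7.0 and 2000.

```sql
-- Hypothetical parser: walk the comma-delimited string, peeling off one ID
-- at a time and inserting it into a temp table.
declare @idList varchar(8000)
declare @pos int
declare @id varchar(10)

set @idList = '5,12,37'          -- example input from the front end

create table #userIDs (userID int)

while len(@idList) > 0
begin
    set @pos = charindex(',', @idList)
    if @pos = 0
    begin
        -- last (or only) ID in the list
        set @id = @idList
        set @idList = ''
    end
    else
    begin
        -- take everything before the comma, then drop it from the list
        set @id = left(@idList, @pos - 1)
        set @idList = substring(@idList, @pos + 1, len(@idList))
    end
    insert into #userIDs (userID) values (convert(int, @id))
end

select u.* from tbl_user u
where u.userID in (select userID from #userIDs)
```

With only 5-25 IDs the loop cost is negligible; the join/IN against the temp table is where the optimizer does the real work.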

In that case your testing is not really representative of the production system, since SQL Server 2000 has many advantages over and is superior to SQL Server 7.0. For example, the TABLE datatype (table variables), which can be an effective alternative to temp tables, was only introduced in SQL Server 2000.
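To illustrate the version gap: the following table-variable version (names are illustrative) compiles and runs on SQL Server 2000 but fails with a syntax error on 7.0, which is exactly the kind of breakage to watch for when developing on 2000 and deploying to 7.0.

```sql
-- SQL Server 2000 only: the TABLE datatype does not exist in 7.0.
declare @userIDs table (userID int)

insert into @userIDs (userID) values (42)

select * from tbl_user
where userID in (select userID from @userIDs)
```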

Have you checked the execution plans in SQL 7.0 for these queries? Also update statistics on the relevant tables to get optimum performance.

If you opt for dynamic SQL, you should use sp_executesql rather than EXEC so the execution plan can be cached and reused. As satya says, you shouldn't develop on 2000 and then deploy to 7.0; lots of code could break once it reaches the production server.
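A minimal sketch of the sp_executesql pattern (available since SQL Server 7.0); the statement text and @id parameter here are illustrative. Because the SQL text stays constant and only the parameter value changes, the plan can be cached, whereas an EXEC of a string with concatenated literals tends to compile a fresh plan for each distinct ID list.

```sql
-- Parameterized dynamic SQL: the plan for @sql can be reused across calls.
declare @sql nvarchar(4000)
set @sql = N'select * from tbl_user where userID = @id'

exec sp_executesql @sql, N'@id int', @id = 42
```

One caveat: an IN list itself cannot be passed as a single parameter this way, so for the comma-delimited list you would still either concatenate the IDs into the string (losing plan reuse) or parse them into a temp table as above.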

Difficult to say which method is better. I sometimes use temp tables for this, mainly where there are lots of IDs and the query uses those IDs intensively.
But also be aware of potential concurrency problems if you intend to use temp tables and the proc is part of a transaction.