How could I change the sproc to defend against SQL injection? If I use parameterized queries and the user doesn't submit a value for one of the filters in the WHERE clause, I don't want that filter to be considered.

The good thing is that you don't have to do conditionals to build the parameter string. You can declare them all and simply not pass in values for the ones that aren't used in your built-up @statement.
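A minimal sketch of that idea, assuming a hypothetical Orders table (the table, column, and parameter names here are illustrative, not from the original post): the WHERE clause is built conditionally, but the parameter definition string passed to sp_executesql always declares every parameter, and the ones whose filters were never appended simply go unused.

```sql
-- Build up @statement conditionally; the parameter list stays fixed.
DECLARE @statement nvarchar(max) = N'SELECT * FROM dbo.Orders WHERE 1 = 1';
DECLARE @customerId int = NULL;          -- user left this filter blank
DECLARE @status varchar(20) = 'Open';    -- user selected this one

IF @customerId IS NOT NULL
    SET @statement += N' AND CustomerId = @customerId';
IF @status IS NOT NULL
    SET @statement += N' AND OrderStatus = @status';

-- Declare every parameter regardless; unused ones are simply ignored.
EXEC sp_executesql
    @statement,
    N'@customerId int, @status varchar(20)',
    @customerId = @customerId,
    @status = @status;
```

Because user values only ever travel as parameters, nothing the user types is concatenated into the SQL text.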

To my knowledge, fully parameterized T-SQL is the ONLY way to guarantee protection from SQL injection (at least without discarding valid user input that happens to contain keywords like CAST, EXEC, or DECLARE).

This has always been a sticking point for me; it always seemed to lead to "dirty" solutions. This is an approach I came up with to accommodate a flexible user interface. There are about eight different fields in the original that can be selected beyond the date range, and the user can select zero to many items from each field.

This passes all of the selections into the proc in a table type parameter as name/value pairs. If you needed to select two different departments, for instance, you would pass in two records with the same parameter name but different parameter values.
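A sketch of what that table type and call might look like (the type name, column sizes, and proc name are my assumptions, not the original poster's code):

```sql
-- Hypothetical name/value-pair table type used as the proc's filter input.
CREATE TYPE dbo.ParamList AS TABLE
(
    ParameterName  varchar(50)  NOT NULL,
    ParameterValue varchar(100) NOT NULL
);
GO

-- Selecting two departments: two rows with the same name, different values.
DECLARE @paramValues dbo.ParamList;
INSERT INTO @paramValues (ParameterName, ParameterValue)
VALUES ('StartDate',  '2023-01-01'),
       ('EndDate',    '2023-12-31'),
       ('Department', 'Sales'),
       ('Department', 'Marketing');

EXEC dbo.GetReport @paramValues;
```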

There haven't been any performance issues, but the primary table is only about 60,000 rows at this point. I do like that it allows for flexibility with a minimum of fuss, and no dynamic SQL.

-- Get start and end date; these are the only required params.
DECLARE @startDate datetime = (SELECT CAST(ParameterValue AS datetime)
                               FROM @paramValues
                               WHERE ParameterName = 'StartDate');
DECLARE @endDate datetime = (SELECT CAST(ParameterValue AS datetime)
                             FROM @paramValues
                             WHERE ParameterName = 'EndDate');
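The optional filters can then be applied with EXISTS checks against the same table parameter: a field is skipped entirely when no rows carry its name, and matching any one of several passed-in values satisfies the filter. A sketch with hypothetical table and column names:

```sql
SELECT t.*
FROM dbo.MainTable t
WHERE t.EventDate BETWEEN @startDate AND @endDate
  -- Department filter: applied only if any 'Department' rows were passed in;
  -- otherwise the NOT EXISTS branch lets every row through.
  AND (NOT EXISTS (SELECT 1 FROM @paramValues p
                   WHERE p.ParameterName = 'Department')
       OR EXISTS (SELECT 1 FROM @paramValues p
                  WHERE p.ParameterName = 'Department'
                    AND p.ParameterValue = t.Department));
```

One such predicate pair per optional field covers the "zero to many selections" case without any dynamic SQL.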

Sorry MarbryHardin, but your solution is totally awful from a query-optimization standpoint, and you are guaranteed to get suboptimal performance. Read Gail's blog posts. Also, scale your table up to 6M or 600M rows and look at the query plans you get when you run your code. And watch how the plan gets reused with parameter values that really need a different plan.

Oh, and your table variable isn't going to do you any favors either. I can give you a one column, one row table variable example that will give you a bad plan where the same in a temp table gives you the correct plan.
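The table-variable complaint is about cardinality estimation: without a statement-level recompile, the optimizer uses a fixed low row guess for a table variable, while a temp table gets real statistics. A quick way to see the difference for yourself (run with "Include Actual Execution Plan" and compare estimated vs. actual rows):

```sql
-- Table variable: the plan's estimated row count stays tiny
-- even though 100,000 rows were loaded.
DECLARE @tv TABLE (id int PRIMARY KEY);
INSERT INTO @tv (id)
SELECT TOP (100000) ROW_NUMBER() OVER (ORDER BY (SELECT NULL))
FROM sys.all_columns a CROSS JOIN sys.all_columns b;
SELECT COUNT(*) FROM @tv;

-- Temp table: statistics are created on load, so the estimate
-- tracks the actual row count and joins get sensible plans.
CREATE TABLE #tt (id int PRIMARY KEY);
INSERT INTO #tt (id) SELECT id FROM @tv;
SELECT COUNT(*) FROM #tt;
DROP TABLE #tt;
```

That estimation gap is why the same one-column data can produce a bad plan in a table variable and a correct one in a temp table.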

I know what you're saying, but I did a quick test loading 1.2 million rows into a test table, which is about 10x what the production table should ever hold based on usage and maintenance. Execution went from sub-second to under 2 seconds, and that was with unfavorably structured data. Not ideal, but there are other optimizations that could be made.

And I will say that Gail's approach would be fine, if it fit. It does not allow for matching on an unknown number of values per field.
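For readers without the context: "Gail's approach" here presumably refers to the catch-all query pattern from Gail Shaw's blog posts, sketched below with hypothetical names. Each optional field gets one scalar parameter, short-circuited with an IS NULL check, and OPTION (RECOMPILE) lets each call compile a plan for the parameters actually supplied. Its limitation is exactly what's stated above: each parameter carries at most one value per field.

```sql
CREATE PROCEDURE dbo.SearchOrders
    @customerId int         = NULL,
    @department varchar(50) = NULL
AS
SELECT o.*
FROM dbo.Orders o
WHERE (@customerId IS NULL OR o.CustomerId = @customerId)
  AND (@department IS NULL OR o.Department = @department)
OPTION (RECOMPILE);  -- recompile per call so the plan fits the supplied filters
```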

And you'll have to come saw my foot off to get me to write dynamic SQL.