Database Administrators Stack Exchange is a question and answer site for database professionals who wish to improve their database skills and learn from others in the community.

In our current prod environment, when the ETL launches, it first checks an environment variable at the OS level which tells it where the "config" database and table are, so it can get the necessary values/parameters for running.

In our test environment, though, we will have test and QA on the same physical box, using separate instances of SQL Server. The feedback I am getting is that since they are on the same physical box, there is only one env variable, so we would have to change it depending on whether the ETL was running in test or QA. This is less than desirable.

I understand not wanting to hard code anything into the packages but there has to be a way to accomplish such a thing with multiple environments on the same physical box. I can't justify splitting out onto separate physical hardware for each env.

I should mention, this is SQL Server 2008 R2.
– mnDBA, May 16 '12 at 14:17

Do you have a DBA or administrative database that this information can be centrally stored in? That will alleviate your issue. You could also use registry keys, but that would get a bit trickier.
– Thomas Stringer, May 16 '12 at 14:34

We do, and that is what the env variable points to. The goal being, I believe, to not have to hard code that info about the location of that DB into the packages.
– mnDBA, May 16 '12 at 14:36

Of course now that I think about it, is there really a difference between hard coding the connection info for that config DB in an env variable versus in the package? I am thinking no.
– mnDBA, May 16 '12 at 14:39

Well, I think there is a difference, as the package should be ignorant of the variable's value. That, I'd guess, is the benefit of using an env var as opposed to an SSIS variable: hard-code the value in the package and the package needs modification for every implementation.
– Thomas Stringer, May 16 '12 at 14:43

5 Answers

We do this. The environment variable is set at the user level, so the user that runs the SQL Agent jobs is different for each environment. On the dev environment our Agent user is something like SQLDev, and on the QA environment it is something like SQLQA.

If you are not running from jobs (which I highly suggest, except on dev while doing actual development), then yes, you have to change the environment variable to the correct one. We have created command-line scripts to do that easily and placed them on our desktops, so that we can quickly switch the environment variables.
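A minimal sketch of such a switcher script, assuming a hypothetical variable name `ETL_CONFIG_SERVER` and instance names (neither is from the original answer; substitute your own):

```bat
@echo off
rem switch-to-qa.bat - point the ETL at the QA config database for this user.
rem setx writes to the *user* environment (not the machine environment),
rem so each Agent service account can keep its own value on the same box.
rem Note: setx affects future processes only, not the current cmd session.
setx ETL_CONFIG_SERVER "MYBOX\QAINSTANCE"
echo ETL_CONFIG_SERVER now set to MYBOX\QAINSTANCE for this user.
```

A matching `switch-to-test.bat` would do the same with the test instance name.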

1) Instead of an env variable you could use an XML file (.dtsConfig) to point to the SQL Server config database for each environment. If you use the built-in SSIS deployment utility you can manually override the connection string at deployment time. Not a great solution, but a solution.
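A sketch of what such a .dtsConfig file might look like, assuming a connection manager named `ConfigDB` and hypothetical server/database names (all placeholders, not from the original answer):

```xml
<DTSConfiguration>
  <!-- Override the ConfigDB connection manager's connection string.
       The Path must match the connection manager name in your package. -->
  <Configuration ConfiguredType="Property"
                 Path="\Package.Connections[ConfigDB].Properties[ConnectionString]"
                 ValueType="String">
    <ConfiguredValue>Data Source=MYBOX\QAINSTANCE;Initial Catalog=ETLConfig;Integrated Security=SSPI;</ConfiguredValue>
  </Configuration>
</DTSConfiguration>
```

Each environment keeps its own copy of this file with its own connection string; the package itself never changes.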

2) If you go the scripting route, you may not need the env variable at all. You could just override the connection string using /CONN.
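For example, overriding the connection from the command line with DTExec's /CONNECTION option (package path, connection manager name, and connection string below are hypothetical placeholders):

```bat
rem Run the package, overriding the "ConfigDB" connection manager at run time.
rem No environment variable is involved; each environment's script passes
rem its own connection string.
dtexec /FILE "D:\ETL\LoadWarehouse.dtsx" /CONNECTION ConfigDB;"Data Source=MYBOX\QAINSTANCE;Initial Catalog=ETLConfig;Integrated Security=SSPI;"
```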

As mentioned by @SQLMD, I have also used a .dtsConfig file for this purpose. This has the added benefit of allowing you to modify the configuration without changing the SSIS package itself. The drawback is that you'll need to specify the same location on each server to hold the configuration file (such as "d:\SSISConfigs\") in the SSIS package configuration.

One solution, albeit kinda sucky, is to convert the Agent jobs from SSIS steps to CmdExec steps and call DTExec directly. If you do this, you can set the environment variable in the batch file before the package is executed.

Wrap it with a batch file that sets the environment variables and then kicks off the job. This lets you set up the wrapper file on a per-environment basis and hold as many environments on the machine as you want.
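A per-environment wrapper along those lines might look like this (file name, variable name, and paths are hypothetical, not from the original answer):

```bat
@echo off
rem run-etl-qa.bat - QA wrapper; a sibling run-etl-test.bat would set
rem the test instance instead, so both environments coexist on one box.
rem "set" scopes the variable to this process and its children only,
rem unlike a machine-wide environment variable.
set ETL_CONFIG_SERVER=MYBOX\QAINSTANCE
dtexec /FILE "D:\ETL\LoadWarehouse.dtsx"
```

Because the variable is process-scoped, running the QA and test wrappers at the same time does not cause them to step on each other.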