Updating Dynamics GP data directly in SQL

I would like to start by saying that I do not normally advocate updating Dynamics GP data directly in SQL Server. Most of the things you need to do should be handled either inside the Dynamics GP application or with integration tools like Integration Manager or eConnect. This is especially true of transactional data and anything to do with dates.

However, there are some cases involving isolated master data where the easiest way to make the change is directly in the SQL table. Often this is due to the volume of data and the inability of tools like Integration Manager or eConnect to handle the task. Some obligatory warnings:

Always make a backup of your data before updating it

If you’re not 100% confident in what you’re doing, ask for help

Test first either in a test company, or on just 1 or 2 records if there is no test company available

If something you’re updating relies on other data that needs to be set up, verify all the necessary data exists (for example – if you’re updating Payment Terms for customers, make sure all the new Payment Terms are already set up in GP)

Now that we’ve gotten all the disclaimers and warnings out of the way, how do you actually do this? Something like a request to set the sales tax options for all inventory items to be ‘base on customers’ is pretty straightforward and I am not going to discuss that here. What is more complicated is the following scenario: update the Payment Terms for each customer based on a list in Excel. That’s a real life example that I’ve had to do more than once. You will have a list in Excel of each Customer ID and new Payment Terms that you want changed. How do you accomplish this?

One way to do this is to import the data from Excel into a new SQL table, then use that table in an update query. That’s certainly viable, but often creating new tables is a daunting process for users and sometimes importing data from Excel to SQL is not as easy as it looks. Also I am not crazy about cluttering up the database with this type of stuff, as often these tables are not deleted afterwards. My preferred approach is to create what I refer to as a ‘table on the fly’ inside of your SQL update query. Here is how I do this:

In your Excel file, create a new column that will turn the data into SQL code. For example, your Excel file might have the Customer IDs in column A and the new Payment Terms in column B.

Add the following formula into column C (the double quotes are Excel string delimiters; the single quotes inside them become part of the SQL text):

="select '"&A2&"' customer, '"&B2&"' terms union"
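For example, for a row with AARONFIT0001 in cell A2 and NET 30 in cell B2 (sample values I made up for illustration, not from the original post), the formula produces:

```sql
select 'AARONFIT0001' customer, 'NET 30' terms union
```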

Fill the formula down through all the rows of data in Excel, so each row produces its own select statement.

Now you can copy all the contents of that column into SQL, but don’t forget to remove the “union” from the last line. You can test it in SQL to make sure it will work.
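With the trailing “union” removed from the last line, the pasted column becomes a self-contained query you can run as-is (the customer IDs and terms below are made-up sample values):

```sql
-- Each line came from one row of the Excel file; the unions
-- stitch them into a single two-column result set.
select 'AARONFIT0001' customer, 'NET 30' terms union
select 'ADAMPARK0001' customer, 'NET 15' terms union
select 'BAKERSEM0001' customer, '2% 10/NET 30' terms
```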

As I mentioned before, you should validate your data to make sure that your new payment terms are already set up in GP. You can either do that inside the GP application itself, or check the SY03300 table, which is probably more efficient and will also catch any typos.
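Here is a sketch of both the validation and the update, assuming the standard GP customer master table RM00101 (CUSTNMBR, PYMTRMID) and Payment Terms Master table SY03300 (PYMTRMID); the customer IDs and terms in the derived table are sample values:

```sql
-- 1. Validation: any rows returned here are terms from the Excel list
--    that do not exist in GP yet (or are misspelled).
select t.terms
from (select 'AARONFIT0001' customer, 'NET 30' terms union
      select 'ADAMPARK0001' customer, 'NET 15' terms) t
left join SY03300 s on s.PYMTRMID = t.terms
where s.PYMTRMID is null

-- 2. The update itself, using the same 'table on the fly' as the source.
update c
set c.PYMTRMID = t.terms
from RM00101 c
join (select 'AARONFIT0001' customer, 'NET 30' terms union
      select 'ADAMPARK0001' customer, 'NET 15' terms) t
  on c.CUSTNMBR = t.customer
```

Only customers whose IDs appear in the pasted list are touched; everyone else is left alone by the join.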

Good point, thanks. I would be very curious what the performance difference is with updating everything vs. checking what to update. I suspect for anything but huge data sets it won’t be anything substantial.

I would strongly advise you AGAINST adding inventory items directly in SQL. There is a lot of data validation that would need to happen when items are created. There are also a number of related tables that get updated, so this is better done either directly in the Dynamics GP user interface or with tools created specifically for this, like Integration Manager or eConnect.

Hi Victoria… I’m trying to add vendors in GP (v.9) but come up with the following error: “This record has been created since your attempt to create it. Changes won’t be saved”. How would I get past this? I can add vendors in SQL, but this is not ideal; we want to be able to do this within GP.

I have seen this happen when you are using Master Triggers from Professional Services Tools Library and have not set up the new company in there or you have other/custom triggers on your PM tables in SQL. Hope that helps.

Great post Victoria,
Users, heed the warnings Victoria gives, especially about the test company/environment. If you don’t have a test company available, STOP! and ask for help. Even the super SQL minions won’t run any script with update, delete, or insert commands without some form of testing first. I have SQL scripts that I have used for close to ten years that I still run in a test company, just to make sure nothing weird has happened. It just takes seconds to confirm a SQL script’s results in a test company before you run it in production.

The result will tell you how many records will be updated. If the number of results is what you expect, then highlight and run the query without the begin and rollback part. It’s just a confidence booster knowing how many records are going to be updated.
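The pattern this commenter describes might look like the following (table and column names assume the customer Payment Terms example from the post; the customer ID and terms are sample values):

```sql
-- Wrap the update in a transaction that is always rolled back;
-- the "(n rows affected)" message shows how many records WOULD change
-- without actually changing anything.
begin tran

update c
set c.PYMTRMID = t.terms
from RM00101 c
join (select 'AARONFIT0001' customer, 'NET 30' terms) t
  on c.CUSTNMBR = t.customer

rollback tran
```

Once the row count matches the number of rows in your Excel list, rerun just the update statement on its own to commit the change.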

Thanks for the feedback. I am not a big fan of macros personally, have seen too many issues and have seen people spend a huge amount of time on troubleshooting them. Also have spent time helping fix data when they go wrong. 🙂

I use the table method for a customer who regularly updates their price lists (currency amount). The update requires changes to GP tables as well as a SalesPad custom price table. It works great – I just import the price template to a custom table in SQL, then run the scripts to update all the necessary tables in the company databases. What would take them hours or possibly days to update manually takes me about 15 minutes.