I am currently testing our upgrade process for moving all of our databases from SQL Server 2012 instances to SQL Server 2016. The SSIS projects that use the ZappySys PowerPack tools are failing with a particularly unhelpful message. In addition, attempts to open those projects with SQL Server Data Tools 2010 fail – with an equally unhelpful message. This article records what I did to remedy this situation.

Initial Error Messages

Having installed SQL Server 2016 and copied the databases, logins, jobs etc. from our 2012 instance, I then copied the SSIS packages using the process I documented in this article. When testing the jobs on the new 2016 instance that used these SSIS packages, the jobs would fail.
The Integration Services Catalog (ISC) report showed a series of error messages – the first being: ZS Advanced File System Task:Error: Failed to load task “ZS Advanced File System Task”, type “”. The contact information for this task is “Advanced File System Task; ZappySys LLC (c) All Rights Reserved; SSIS PowerPack; http://www.zappysys.com/support”.
As that error message didn’t tell me anything useful, I attempted to import the project into SQL Server Data Tools 2010 using the process previously documented here. When the tool attempted to import the package it failed with the following message:

Picture 1: Error message from VS 2010

Failure to open Project

Because the error message from ISC mentioned the ZappySys PowerPack I then contacted ZappySys, who responded quickly and suggested upgrading to the latest version of the software. This was a painless experience but changed nothing – the errors were the same.

Installing Data Tools

Searching for related articles turned up this one by Koen Verbeeck (t|b), which details the versions of SQL Server Data Tools required for various versions of SQL Server.

When I tried one of these links within Internet Explorer a security alert appeared, which I assumed was blocking the installation process:

Picture 2: IE message

IE message box

Ticking the checkbox to prevent this message from appearing again allowed the installation to complete.

Using SQL Data Tools 2015

With the correct version of Data Tools installed for SQL Server 2016 I can now open the SSIS project, although there are still errors relating to the ZappySys controls:

Picture 3: Data Tools Error Message

At least I can edit the project

Removing the ‘corrupt’ ZappySys controls and adding them back in corrects these issues, as well as restoring a ZappySys connection that had disappeared completely.

Picture 5: The re-inserted control

Controls back in place

Another way of Copying the Project Between Instances

Having now installed the appropriate Data Tools, I have discovered that if I import the project into Data Tools 2015 from the SQL Server 2012 instance and then deploy it to SQL Server 2016 (using Data Tools), the project is also fine – without any of the issues I had experienced.

Conclusion

Deploying via ISC works perfectly well if the versions of SQL Server involved are all the same. However, there appear to be issues for some projects when deploying between different versions. Using the correct version of Data Tools (for the higher version of SQL Server) appears to cater for this.

SSISDB and Catalog – Part 5 – Importing a Package into SQL Server Data Tools

In the past I’ve used the ‘add existing package’ option when opening an existing SSIS package within SQL Server Data Tools (SSDT). That was before the appearance of the SSISDB and the Integration Services Catalog. This option doesn’t work when your dtsx package is within the SSISDB, and the solution is quite simple – just not where I had been looking.

Working on the principle that there are probably other people out there who have missed the obvious (as I had), this is a short article describing the steps required to do this.

Example

When you need to import a project from the SSISDB into SSDT there is now an ‘Integration Services Import Project Wizard’ option within the ‘New Project’ dialog. This can import from an ‘ispac’ file or directly from a specified Integration Services Catalog:

The initial Import dialog

Select this option and change the name, solution and location details at the foot of the screen (if you need to). Then press OK and a screen explains what this wizard will do. Press ‘Next’ and the subsequent screen will ask for either the location of an ispac file or the server that you want to import the package from. In this case select ‘Integration Services Catalog’, select the server and then the location of the required package within the catalog:

Provide the Server name and Path of the package

Click ‘Next’ and after a brief wait a screen will provide a summary of the action it will take.

Summary of actions to be taken

Click on the ‘Import’ button and the specified package will now be imported into a new project:

The SSISDB package Imported

The project can now be edited and deployed back to the Integration Services Catalog when required.

SSIS – Importing CSV Files that have different number of columns

Background
Recently I had to write an SSIS package that would import a series of csv files that had different numbers of columns within them. One file might have 15 columns, another 17 and another have something different again.

To add to the fun, the first nine columns of all of these files are ‘fixed’ – we know what each column is for – as are the last three. Yes, the last three columns have a known purpose, but they could be columns 13, 14 and 15, or 15, 16 and 17, or suchlike, depending upon how many columns the file has.

Figure 1: CSV File with 14 columns

Figure 2: CSV file with 12 columns

In the following article I’m going to describe the process I went through to solve this problem. The source code, data and SSIS package used in this article (obviously, not the actual project and data used by my employer) will be included at the foot of the article.

Initial Investigations
Of course, this isn’t the first time somebody has had to import a file with an unknown number of columns, so a little time with search engines found a handful of people who had worked out their own solutions.
There appeared to be two types of solution to this task – BIML and C# code to edit the project.

BIML – I have never used BIML, but from the examples I saw it looked like it could be used to create an SSIS project based upon the first csv file opened. This would work if all of the files had the same number of columns, but they don’t: if the first file processed had 15 columns then it would create a project to process 15 columns, and if the next file had 16 columns then that file wouldn’t be processed correctly.

C# – The C# examples I found could probably have been reworked to provide the solution I needed, but with a certain amount of complexity I would rather avoid. It has been a few years since I’ve written any meaningful C# code, and if I’m going to struggle with it, what about any DBA who follows in my footsteps (or myself in a few months’ time)?

Now what?

So, if I can’t (or won’t) use these two solutions, what can I do?
Several years ago I worked with the Technical Director of a company, one Joe Gylanders. When database theory was surfacing he was writing his own database engines to understand and implement the concepts, and he had a way of simplifying issues that I was quite envious of (and entertained by). Just never ask him for documentation.
Once, when working with him to import some particularly horrendous customer data into a new system I had to admit defeat with the import process I was attempting to hammer into a decent shape. He asked me what I wanted to do with this data and I promptly told him what columns were related to product details, what columns should be related to sales, what columns should be related to location and why certain inconsistencies were making it so very difficult to work out what data needed to be split to which table, from within the import process.
His reply (paraphrased somewhat) was “No, you don’t need any of that. You need to get that data into the database. Once it’s in there we can slice and dice it any way we want. The priority is to get the data into the database.”
So, with that in mind – what do I really want? If I get each record from the csv file into the database, even as one column, can I use that?
Of course I can.
What I would have would be one column with data in it, separated by commas. Could we split that data based on that information?
My regular prowling of the SQL Server Central website has meant that for quite some time I’ve been aware of the ‘DelimitedSplit8K’ table-valued function, created and updated by Jeff Moden and many others. Given a string and the delimiter within that string, it returns a table containing each of the delimited values, along with a number showing which element each is within that string. Therefore, not only can I split the data into its individual elements, I can also identify the first nine entries and the last three – the ‘fixed’ columns.
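As a quick illustration (this is the publicly available DelimitedSplit8K, not code from the project), the function returns an ItemNumber/Item pair for each element:

    SELECT ItemNumber, Item
    FROM dbo.DelimitedSplit8K('a,b,c,d', ',');

    -- ItemNumber  Item
    -- ----------  ----
    --          1  a
    --          2  b
    --          3  c
    --          4  d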

Creating the Project
Create the test database and a ‘staging’ table that the data will initially be stored in. This first table will have one column that contains the entire csv row and another column that contains the name of the file it was extracted from (just because I like to do such things – I can backtrack to the original data if I need to).
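A minimal sketch of what that first staging table might look like (the names are illustrative – the actual script is in the download at the foot of the article):

    CREATE TABLE dbo.SurveyStaging
    (
        CsvRow     VARCHAR(8000) NOT NULL,  -- the entire csv row, unsplit
        SourceFile VARCHAR(260)  NOT NULL   -- name of the file the row was extracted from
    );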

Now we create a second staging table which will take the divided data.
Note that I’ve included additional columns to convert some columns from their character format to the required format for the final table. ‘SurveyDate’ will be a varchar column at this point but I want it as a date column. I could have just converted it ‘in flight’ but I prefer to have a ‘before and after’ picture of the transformation. That way, if there are any issues I can see the original data in the staging table and what it was transformed to. I’ve done a similar thing for ‘UploadedAt’, ‘Rating1’, ‘Rating2’ and ‘Rating3’.
Because I only know what the first nine and last three columns will consistently be, I have catered for up to 12 additional columns between these two groupings, with ‘VariableCol1’ through to ‘VariableCol12’.
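Abbreviated heavily, the second staging table might look something like this (again, illustrative names and a trimmed column list):

    CREATE TABLE dbo.SurveyStagingSplit
    (
        SurveyDate      VARCHAR(50)   NULL,  -- as imported
        SurveyDateConv  DATE          NULL,  -- the 'after' picture of the conversion
        UploadedAt      VARCHAR(50)   NULL,
        UploadedAtConv  DATETIME      NULL,
        -- ...the rest of the nine 'fixed' starting columns...
        VariableCol1    VARCHAR(8000) NULL,
        -- ...VariableCol2 through VariableCol11...
        VariableCol12   VARCHAR(8000) NULL,
        Rating1         VARCHAR(20)   NULL,
        Rating1Conv     TINYINT       NULL
        -- ...Rating2 and Rating3, each with a converted copy...
    );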

There are four phases to this project:
1. Get the data into a single-column table,
2. Split the data from the single-column table where individual columns can be stored,
3. Convert any data to more appropriate types,
4. Move the parsed and cleansed data to the final table.

To perform this there needs to be an SSIS package that can import the initial data, call the stored procedures that manipulate the data and then move it to the final table.

Figure 3: SSIS Package

The ‘real’ work is executed within the process ‘Parse survey data’. This executes a stored procedure ‘usp_ParseSurveyData’, which uses ‘DelimitedSplit8K’ and ROW_NUMBER to extract the individual ‘fields’ from the CSV data, along with a number to identify which element each was within that CSV data:
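The original procedure isn’t reproduced here, but the heart of it might look something like this sketch (using my illustrative staging tables from above, and a windowed MAX rather than ROW_NUMBER to find the last element of each row):

    WITH Split AS
    (
        SELECT st.SourceFile,
               st.CsvRow,
               s.ItemNumber,
               s.Item,
               MAX(s.ItemNumber) OVER (PARTITION BY st.SourceFile, st.CsvRow) AS LastItem
        FROM dbo.SurveyStaging AS st
        CROSS APPLY dbo.DelimitedSplit8K(st.CsvRow, ',') AS s
    )
    SELECT ItemNumber,
           Item,
           CASE WHEN ItemNumber <= 9 THEN 'fixed (start)'
                WHEN ItemNumber > LastItem - 3 THEN 'fixed (end)'
                ELSE 'variable'
           END AS ColumnGroup
    FROM Split;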

From the example shown above I now know that there are 14 elements within this row. The first nine and elements 12, 13 and 14 are known data, while elements 10 and 11 are unknown data which still needs to be stored in the final table.

Figure 5: Row stored in final table (NULL columns not displayed)

Conclusion
Personally, I like to make SSIS projects as simple as I can – with the more complex processing within the database, where I can utilise T-SQL fully. This project is an example of that, where SSIS was used to move data around and call stored procedures for the heavier work.
There are probably people out there who solved this issue in a very similar way but I couldn’t find any articles. Therefore I have produced this in case anybody else out there had the same issues.

SSISDB and Catalog – Part 4 – Parameters

With the advent of the Integration Services Catalog, parameters for SSIS packages no longer need to be controlled using a configuration file (usually with the extension ‘.dtsconfig’). Now the parameters can be stored within the Integration Services Catalog (and therefore within the SSISDB database) and altered far more easily within an agent job.

This example uses SQL Server 2012.

Example

As a demonstration, create a simple SSIS package that takes a flat file source and imports it into a database table.

Basic SSIS Package

I’ve created two connections – ‘Input csv file’:

Flat File Connection

And ‘Target Table’:

DB Connection

Nothing exciting in those. ‘Input csv file’ provides the details of the csv file I’m using for this simple test and ‘Target Table’ provides the details for the table that will receive the csv file’s contents.

Of course, these connections are effectively hard-coded. Unless a method is implemented to change these, it will always have to be the same input file, in the same location, copying data to the same table in the same database on the same server.

Older versions of SSIS used to make use of an XML configuration file, which would be associated with the SSIS package within the agent job that controlled it. This worked quite well but did mean that there was a separate document that needed to be associated with the SSIS package. If it was inaccessible, lost or corrupted then this would cause issues with the package execution.
Now the settings are stored within the SSIS package.

Creating Parameters

Within Data Tools there is now a tab entitled ‘Parameters’. Go to this and click on the ‘Add parameter’ icon for each new parameter required.

Parameter tab within SSIS Project

In this example I have created three – the full pathname of the input csv file (called ‘Source_File’), the target database (‘Target_DB’) and the name of the server that the database is within (‘Target_Server’). In this case all of the parameters are of type ‘string’. The value is the value I want to use at this time. The ‘sensitive’ column controls whether the parameter will be encrypted within the catalog and ‘Required’ dictates whether the parameter must be supplied or not. The ‘Description’ column is used to add a description that should prove helpful to anybody that needs to understand the purpose of this parameter.

Associate parameters to controls

Now that the parameters have been created they need to be associated with the required controls. Within the Connection Manager, right-click on the required connection and select ‘Parameterize…’.

Parameterize Wizard

This produces a wizard, where you select the property that you wish to set via a parameter. Then you have the option to ‘Use existing parameter’ – select this and the drop-down list will show all of the parameters created earlier in the ‘Creating Parameters’ section of this article. Select the required parameter and press ‘OK’.

Setting the ServerName property

In this example above, I have associated the parameter ‘Target_Server’ with the property ‘ServerName’ of the ‘Target Table’ connection.

Setting the InitialCatalog Property

And I have also associated the parameter ‘Target_DB’ with the ‘InitialCatalog’ property of the ‘Target Table’ connection.
Notice that a symbol appears alongside the connection, signifying that there are parameters in use within this.

Setting the ConnectionString Property

Setting the parameter ‘Source_File’ as the parameter for the property ‘ConnectionString’ of the connection ‘Input csv file’ means that I now have parameters in use for the source file, target server and target database.

Testing within the editor

Now go back to the ‘Parameters’ tab and change one of the values.

Changing a Parameter

In this case I’ve changed the ‘Target_Server’ value to a new value. Having altered the parameter you must now save the project for this change to take effect.
Having saved the project you will find that the value within the connection that uses the ‘Target_Server’ has now changed.

Showing the Changed Value

Changing Parameters for a Deployed Package

Once a project is deployed to the Integration Services Catalog the parameters can be accessed by right-clicking on the project (or .dtsx) name and selecting ‘Configure’. The ‘Parameters’ tab gives you the ability to set the parameters that were created earlier, by clicking on the ellipsis at the end of each parameter line.

Setting Parameters via ISC
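For anybody who prefers T-SQL, the same change can be scripted via the catalog’s stored procedures (the folder and project names below are placeholders):

    EXEC SSISDB.catalog.set_object_parameter_value
         @object_type     = 20,             -- 20 = project scope, 30 = package scope
         @folder_name     = N'MyFolder',    -- placeholder
         @project_name    = N'MyProject',   -- placeholder
         @parameter_name  = N'Target_Server',
         @parameter_value = N'NewServerName';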

The parameters can also be set within the agent job that will be created to execute this package, via the ‘Configuration’ and then ‘Parameters’ tab of the associated step.

Parameters via Agent Job

Editing the Value

Final Parameter Settings

Now I can change the parameter values within the agent job, instead of relying upon a separate dtsConfig file. This also means that when I move a project from one server to another it is a far simpler task to alter the parameters, as they are contained within the project.

SSISDB and Catalog – Part 3 – Copying a Package Between Servers

When using the Integration Services Catalogs, moving an SSIS package (dtsx) from one server to another is a fairly straightforward task.

Test Environment

In the following example both SQL2014_INST1 (Instance One) and SQL2014_INST3 (Instance Three) have the required Integration Services Catalog installed. However, Instance One has an SSIS package that is not present on Instance Three.

Creating an export file

The project can be exported to a file by selecting the project, right-clicking and selecting ‘Export…’. This requires you to specify a location and filename for the resultant ‘ispac’ file – the file type used for project deployments within the Integration Services Catalog.

However, if the target server (Instance Three) can see the source server (Instance One) then it is also possible to retrieve the project directly from the SSISDB of the source server, so the export might not be required.

Preparing the target environment

At this point Instance Three has no folders created within the Integration Services Catalog and will require one before the project can be imported. The name doesn’t have to be the same as the name of the folder used on Instance One. Right-click on the ‘Integration Services Catalog/SSISDB’ node and select ‘Create Folder…’.

Importing the project to the target

Now that the folder exists it is possible to import the package – on Instance Three, right-click on the ‘Projects’ folder and select ‘Deploy Project…’.

The wizard requires either the location of the ispac file, created in the earlier steps, or the server with the Integration Services Catalog that has the required copy of the project.

Either:

Or:

After the source has been selected the next page of the wizard requests the target location.

Once the target has been selected a summary screen is displayed, whereupon you can press the ‘Deploy’ button.
If all is well, a screen will display the result as a series of ‘Passed’ remarks.

The project now also exists on the target server, Instance Three.
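As an aside, the folder creation and deployment can also be scripted, should you want to avoid the wizard. A sketch, assuming the ispac file was exported to C:\Temp and using placeholder folder and project names:

    -- Create the target folder (the name needn't match the source server's)
    EXEC SSISDB.catalog.create_folder @folder_name = N'Imported';

    -- Read the exported ispac file and deploy it
    DECLARE @ProjectBinary varbinary(max) =
        (SELECT BulkColumn
         FROM OPENROWSET(BULK N'C:\Temp\MyProject.ispac', SINGLE_BLOB) AS src);

    EXEC SSISDB.catalog.deploy_project
         @folder_name    = N'Imported',
         @project_name   = N'MyProject',
         @project_stream = @ProjectBinary;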

SSISDB Error “The version number of the package is not valid”

The full message will also contain “The version number cannot be greater than the current version number”.

Behaviour

I have an SSIS package installed in an SSISDB which is on a server that runs SQL Server 2012 (Server A) and it needed to be copied to another 2012 instance (Server B). My workstation has SQL Server 2014 and I connect to the 2012 instance on the target server using the tools installed on my workstation.

I connected from my workstation to Server A and exported the SSIS package into the required ‘ispac’ file, then changed my connection to point to Server B and imported the ispac file. So for both the export and the import I was using SQL Server 2014 tools to connect to SQL Server 2012 instances.

The export and import ran without any issues, but when the associated job that ran the SSIS package on Server B executed, it failed. In this case the important part of the error report was: “Package migration from version 8 to version 6 failed with error 0xC001700a. The version number in the package is not valid. The version number cannot be greater than the current version number.”

Cause

Exporting the SSIS package by using SQL Server 2014 (even though it is connecting to a 2012 instance) appears to have changed the ‘PackageFormatVersion’ property within the dtsx file, from ‘6’ to ‘8’. You can see this by opening the dtsx file in an editor such as Notepad++. Editing that value back to ‘6’ merely corrupts the package, so there are probably further changes made within the file.
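Incidentally, the format version that each deployed package reports can be seen within the catalog itself, as the catalog.packages view includes a package_format_version column:

    SELECT pr.name AS project_name,
           pk.name AS package_name,
           pk.package_format_version
    FROM SSISDB.catalog.packages AS pk
    JOIN SSISDB.catalog.projects AS pr
      ON pr.project_id = pk.project_id;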

Solution

Connect to the required instance using a version of SQL Server Management Studio (SSMS) that matches the instance’s version in order to complete this export/import. In my case I could remote onto the required servers and use the SSMS for SQL Server 2012 that was installed there. The export and import worked as expected and this time the SSIS package ran without issue.

SSISDB and Catalog – Part 2 – Adding the SSIS Package

In the previous article I covered the creation of the SSISDB – a new feature within SQL Server 2012 that is used to store all of the required information for an SSIS package.

This article will cover the creation of a basic SSIS package that will require parameters to move data from a table on a database from one server to another. This package will be stored in the SSISDB and a job created to execute it.

Creating the Test Project

To create the test database I merely used the ‘New Database…’ wizard and left the defaults alone. I have a database named ‘TestDB_INST1’ on the server ‘SQL2014_INST1’ and a database ‘TestDB_INST2’ on the server ‘SQL2014_INST2’.

Within each database is a very basic table that the SSIS package will use to copy from one database to the other. The details of the table are unimportant – it is the creation of the SSIS package, its import into the SSISDB and the job creation that I’m concerned with here.

So, creating an OLEDB source and destination with connections for the source and target databases:

I’ve also added a TRUNCATE TABLE task for the target, so when I switch servers around I’ll have an empty target table every time:

In the past I’ve used dtsConfig files to control the parameters for an SSIS package. Because I haven’t yet worked out how the later 2015 Data Tools do that, I’m still going to create the dtsConfig file. This is done by going through the menu – ‘SSIS’/’Package Configurations…’ – and selecting ‘Enable Package Configurations’. This executes the all-too-familiar wizard for selecting which properties you want to control via an external file (in this case I’ve left it at the default of ‘SSISConfig.dtsConfig’). The properties I have selected here are ‘InitialCatalog’ and ‘ServerName’ for both of the connections, allowing me to specify the SQL Server instance and database for each connection.

An XML file is created with these properties and in the past I would create a job to execute the SSIS package and set an entry in the ‘Configurations’ tab of the job to list the dtsConfig file. Now, the import wizard for the SSISDB will use this file to create parameters in the format that it requires.

Importing the Test Project into SSISDB

To import the SSIS package into the SSISDB, within SSMS expand the ‘Integration Services Catalog’ tree to the ‘Projects’ level. Right-click on the ‘Projects’ node and select ‘Import Packages…’, which will launch the wizard for this task.

The first screen wants to locate the SSIS package that you intend to import. In my case I saved it in the C:\Temp folder beforehand, to save any long-winded hunting.

The next page locates the dtsx file and you can confirm which one to import (if your project has several).

Then it requires a location to place the files it generates for the import, a name for the project, a protection level and a description (the description isn’t mandatory).

The next page (‘Update Execute Package Task’) is empty in this example and the page after that locates the dtsConfig file for this dtsx package.

Having located the dtsConfig file it then shows the parameters that it has extracted from that and allows you to select which you require.

Following that is a screen that gives you the opportunity to change the values for the specified parameters. These can be changed later in a variety of ways, so don’t worry that this is your one chance to set them.

A summary screen follows and then a progress screen. Once completed it has created the files required to import into the SSISDB, which it then continues to do with the second stage of this wizard.

It now processes the import file, from the location you specified earlier.

Specify the server you want to import this package into and the path within the SSISDB for that server.

Another summary screen appears before the import is complete. Now you can see the package detailed in the required location under ‘Integration Services Catalog’.

At either the ‘Copy Table’ node or ‘Package.dtsx’ I can right-click and select ‘Configure…’ to change the values of the parameters for this package. Note that the default ‘Scope:’ entry on the wizard changes, depending upon which node I clicked on.

However, you can also change the parameters from within the Agent job.

When creating a job to execute this package I need to specify ‘SSIS Catalog’ for the Package Source.

Whereupon I can also change the parameters for the next execution of this job via the ‘Configuration’ and then ‘Parameters’ tab.
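For completeness, an execution with a parameter override can also be started in T-SQL – a sketch, using the folder, project and package names from my example (adjust to suit):

    DECLARE @exec_id bigint;

    EXEC SSISDB.catalog.create_execution
         @folder_name  = N'Copy Table',
         @project_name = N'Copy Table',
         @package_name = N'Package.dtsx',
         @execution_id = @exec_id OUTPUT;

    -- Override a parameter for this execution only (30 = package scope)
    EXEC SSISDB.catalog.set_execution_parameter_value
         @execution_id    = @exec_id,
         @object_type     = 30,
         @parameter_name  = N'ServerName',
         @parameter_value = N'SQL2014_INST2';

    EXEC SSISDB.catalog.start_execution @execution_id = @exec_id;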

Conclusion

Although a bit more work than the previous method of storing SSIS packages, the SSISDB uses a wizard to store everything required within one database. Upon first inspection I quite like it, but there isn’t a massive amount of helpful documentation on its inner workings.

In following articles I’ll demonstrate how to copy the entire SSISDB to another server, how to move individual SSIS packages between servers and how to use T-SQL to extract information about the packages from the SSISDB.

SSISDB and Catalog – Part 1 – Creating the DB

Although introduced in SQL 2012 I’ve never encountered this feature before, as all sites where I have previously worked have started with much earlier versions, and the old habits of implementing SSIS packages and jobs have just continued.

Now I am working for a company that is starting from afresh – no previous installations to worry about and have started with SQL 2012.

Prior to 2012 SSIS packages were stored in the MSDB of an Integration Services server. It was also (and still is) possible of course to just run the dtsx files located in a local folder but I was never a fan of this – it could get messy.

From SQL 2012 a new feature appears within SSMS – ‘Integration Services Catalogs’. Within that is the SSISDB and within that is a folder structure (specified by yourself) and the SSIS packages. All information required to execute the SSIS package is stored within the related database – connection details, passwords, parameters etc., all encrypted for security.

Creating SSISDB

Initially the entry ‘Integration Services Catalog’ has no entries:

No SSISDB at this point

Right-click on the ‘Integration Services Catalog’ and select ‘Create Catalog…’; this will start the wizard for this process (it is possible to perform all of this with PowerShell, but that is beyond my abilities – or desires).

The Catalog Wizard

Because the IS Catalog relies upon CLR you must check the ‘Enable CLR Integration’ checkbox. It is unchecked by default, which makes no sense unless MS just want to draw your attention to the fact that it requires CLR to be enabled.
The option ‘Enable automatic execution of Integration services stored procedure at SQL startup’ is not mandatory but it is recommended. The purpose of this option is to run a stored procedure that corrects the status of any packages that were running should the server shut down at the time.

A password is also required, which protects the database master key used to encrypt the data within the catalog. Keep this password safe and, as an extra precaution, backup the database Master Key. Once the wizard has completed you can backup the Master Key with the code below (change the file location to suit your requirements):
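Something along these lines (the path and password here are placeholders):

    USE SSISDB;
    GO
    BACKUP MASTER KEY TO FILE = 'C:\Backups\SSISDB_MasterKey.key'
        ENCRYPTION BY PASSWORD = '<UseAStrongPasswordHere>';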

SSIS Error – “To run a SSIS package outside of SQL Server Data Tools you must install…”

On a Dev server I have used Visual Studio 2010 to build and test an SSIS package. Having completed all of my testing I have now created an Agent job on the SQL Server instance on the same box, which should execute the dtsx package.

However, when it attempts to execute it fails and I see the error message:

“To run a SSIS package outside of SQL Server Data Tools you must install Move file to Archive of Integration Services or higher.”

In this case ‘Move file to Archive’ is the name of a task within the SSIS package, which is of course, moving a file to an archive folder.

Not the most helpful error message.

Investigations

Running the installation of SQL Server (or executing a Service Pack installation) up to the point where it shows what is already installed reveals that the Integration Services shared components are not present:

On this machine, although SQL Server is installed, the shared components required for Integration Services are not. In this situation it is still possible to run very simple SSIS packages, because ‘dtexec.exe’ is installed. This explains why I can see an Agent job that runs a package for running backups (created by the Maintenance Plan Wizard) but cannot execute my own package.

Solution

Rerunning the installation confirms the absence of Integration Services components:

I installed the required components and then confirmed their presence afterwards:

Now the SSIS package executes successfully.

SSIS Error Code “0x80070057” in Slowly Changing Dimension

SSIS can produce an inordinate number of error messages and numbers. Some of these can even be helpful and informative.
Generally accompanied by an Error Code, one of the most popular appears to be “0x80070057”, which can have a number of causes.

Recently I came across this particularly helpful code when creating a Slowly Changing Dimension:

Error: 0xC0202009 at ‘Task Name’ , Slowly Changing Dimension [14907]: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80070057.
An OLE DB record is available. Source: “Microsoft SQL Server Native Client 10.0” Hresult: 0x80070057 Description: “The parameter is incorrect.”.
Error: 0xC0047022 at ‘Task Name’ , SSIS.Pipeline: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component “Slowly Changing Dimension” (14907) failed with error code 0xC0202009 while processing input “Slowly Changing Dimension Input” (14918). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.
Error: 0xC02020C4 at ‘Task Name’ , ‘View Name’ [1]: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
Error: 0xC0047038 at , SSIS.Pipeline: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on component “’View Name’” (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.

The behaviour was thus – run it for the first time and the process extracted data from a View and inserted all of the rows as new rows into the output. Run it a second time, when it would check for changes (because the output table now had data) and it would instantly fail with the above error message.

The first thing I noticed was “The parameter is incorrect”, which is strange because the Wizard created everything based on the details I fed into it – so it shouldn’t be anything I’ve done.
So, as I had created 5 other Slowly Changing Dimensions for other Views and Tables I decided to recreate this one from the start with a very ‘narrow’ table. The table concerned with this particular process was rather wide, certainly when compared to the others that had worked successfully.
A couple of fairly tedious hours later I had a failure when I added the last few columns back in (as ever, it’s always something at the end). These columns were varchar(8000) and there were six of them.
This is where the line in the error message “The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020” started to look more interesting.
Checking these columns showed that the 8000 size was a tad generous, based upon the data likely to be stored within them. Reducing these to 300 each removed the error and everything worked as expected.
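The fix was simply a series of column resizes; something like the following, with invented table and column names:

    ALTER TABLE dbo.WideDimension ALTER COLUMN Notes1 VARCHAR(300);
    -- ...repeated for each of the six varchar(8000) columns...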

I have no idea what size buffer SSIS can create for this processing but the message implied that it wasn’t large enough for this particular table. Reducing the width of the table corrected this.
It may not be the solution for every instance of that message but in this case it worked and can be added to the list of possible solutions to one less-than-helpful SSIS Error Code.