Re: Data byte count error

Ok, this might seem like a stupid question, but it bears asking. The data you provided, was that the textual representation of the actual data? Or is that the exact format (text data with comma field separators)?

If that is the actual format, then you need to specify "delimited" instead of "formatted" for the format attribute in the DataConnector operator, and you need to specify a delimiter of ",".
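As a sketch (the operator name, schema name, and file name are placeholders), the DataConnector producer attributes for delimited data might look like this:

```
DEFINE OPERATOR FILE_READER
TYPE DATACONNECTOR PRODUCER
SCHEMA INPUT_SCHEMA
ATTRIBUTES
(
   VARCHAR FileName      = 'mydata.txt',  /* placeholder */
   VARCHAR Format        = 'Delimited',
   VARCHAR TextDelimiter = ','
);
```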

Right now, the DC operator is defined to use "formatted", and formatted data is in binary format with a specific layout of:

- a 2-byte record length, 'n'
- 'n' bytes of data
- an end-of-record character (0x0a or 0x0d)
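The layout above can be sketched in Python. This is a hypothetical illustration, not Teradata code; in particular, the byte order of the length field is assumed little-endian here and in practice depends on the platform that produced the file:

```python
import struct

def make_formatted_record(payload: bytes) -> bytes:
    # 2-byte record length 'n' (little-endian assumed), followed by
    # 'n' bytes of data and an end-of-record byte (0x0a).
    return struct.pack("<H", len(payload)) + payload + b"\x0a"

record = make_formatted_record(b"hello")
# 2 (length prefix) + 5 (data) + 1 (end-of-record) = 8 bytes total
```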

Re: Data byte count error

Also, forgot to mention that when using "delimited" as a record format for the DataConnector operator, all columns in the schema definition must be of type VARCHAR. We will send the VARCHAR fields to Teradata and Teradata will perform the proper conversion to the data types of the table.
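For example (column names and lengths are placeholders), a schema for delimited data would look like this, regardless of the target table's actual data types:

```
DEFINE SCHEMA INPUT_SCHEMA
(
   COL1 VARCHAR(10),  /* Teradata converts to the table's column type */
   COL2 VARCHAR(3)
);
```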

I guess I now have a stupid question. The DEFINE SCHEMA statement describes the structure of the data to be processed, and that is why I had my schema defined the way I did before. So the question is, would this be the right way to define the schema? How will this schema know that I have a DECIMAL and a BYTEINT data type?

Re: Data byte count error

The CLI 207 is not related to changes you made to the script. CLI 207 means CLI could not make a connection to the DBS (network is down, or something related to that; it is a catchall error code).

As for the schema:

Just like with FastLoad or MultiLoad or any other of our load utilities, the schema defines the layout of the data. It really has very little to do with the table definition (although one would hope they would match). If they do not match, the DBS will do its best to convert from one format to another.

If your data is in text format (as "delimited" data is), then you really have no choice but to provide a schema made up of all VARCHAR fields because, after all, that is what the data is made of. The "formatted" record format is only for binary data.

If you wanted to do the same thing with FastLoad, you would have to create a DEFINE statement with all VARCHAR fields and send the data to Teradata, and Teradata would convert the VARCHAR data to the native data types according to the table definition.
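In FastLoad terms, that would be something like the following sketch (column names, lengths, and the file name are placeholders):

```
SET RECORD VARTEXT ",";
DEFINE
   col1 (VARCHAR(10)),
   col2 (VARCHAR(3))
FILE = mydata.txt;
```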

The original error you received occurred because you indicated "formatted", so the DataConnector operator read the first 2 bytes of your text data and interpreted them as the record length.
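To illustrate with a hypothetical example: if a delimited text row happens to start with the characters "27", treating those two ASCII bytes as a binary record length yields a value nowhere near the real row length, hence the byte count error:

```python
import struct

first_two_bytes = b"27"  # ASCII '2' = 0x32, '7' = 0x37
# Interpreted as a 2-byte little-endian record length:
(bogus_length,) = struct.unpack("<H", first_two_bytes)
print(bogus_length)  # 14130 -- the operator would expect 14130 bytes of data
```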

Get the connection issue resolved and then hopefully we can see progress with this job.

Re: Data byte count error

Hi Feinholz....

Thanks a lot... I used the SQL Selector operator and it's working perfectly fine... :)

I have a request now; it would be a great help if you could provide a link or doc for the following:

1. Any matrix that lays out which operator is compatible with which format type. Like in the script I showed above, I had to use the SQL Selector operator when using the Delimited format for the target flat file.

2. Any matrix that lays out which TD utility should be used in which case. For example, if the table is empty and we want to load a high volume of data, then FastLoad is a better option compared to the other load utilities.