Conversion failed when converting from a character string to uniqueidentifier

com.microsoft.sqlserver.jdbc.SQLServerException: Conversion failed when converting from a character string to uniqueidentifier

Hi Folks,

I'm attempting to use sqoop2 from Hue to import a single column from one database table in MS SQL Server. That column is defined as 'nvarchar(max)', and contains a JSON document - which starts and ends with curly braces {...}.

The sqoop2 job fails with: com.microsoft.sqlserver.jdbc.SQLServerException: Conversion failed when converting from a character string to uniqueidentifier

I suspect it's trying to convert the string value to a uniqueidentifier simply because it starts and ends with curly braces, but hey, I'm just guessing.

I see in the sqoop (version 1) docs that you can override the default conversions using:

Table 3. Parameters for overriding mapping

Argument                      Description
--map-column-java <mapping>   Override mapping from SQL to Java type for configured columns.
--map-column-hive <mapping>   Override mapping from SQL to Hive type for configured columns.

How can I use these overrides in sqoop2 via Hue? Or another way? Or am I off on the wrong path here?
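For reference, this is how the override reads in Sqoop 1 syntax (connection string, table and column names below are placeholders for your own schema - Sqoop2 configures jobs differently):

```shell
# Sqoop 1 syntax only; dbhost, MyDb, Assets and JsonDoc are placeholders.
# Forces the JsonDoc column to be read as a plain Java String.
sqoop import \
  --connect "jdbc:sqlserver://dbhost:1433;databaseName=MyDb" \
  --username sqoop -P \
  --table Assets \
  --columns "JsonDoc" \
  --map-column-java JsonDoc=String \
  --target-dir /user/hue/assets_json
```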

Re: Conversion failed when converting from a character string to uniqueidentifier

Could you try adding a column to partition by (that isn't of type uniqueidentifier)? If no partition column is provided, Sqoop will automatically split by the primary key, which is AssetId in this case. You should be able to split by text and date types worst case.
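If you're not sure which columns are candidates, something like the following lists each column's type so you can pick one that isn't uniqueidentifier (the sqlcmd invocation is a sketch - server, database and table names are guesses):

```shell
# List column names and types for the source table.
# dbhost, MyDb and dbo.Assets are placeholders for your environment.
sqlcmd -S dbhost -d MyDb -Q "
  SELECT c.name, t.name AS type_name
  FROM sys.columns c
  JOIN sys.types t ON t.user_type_id = c.user_type_id
  WHERE c.object_id = OBJECT_ID('dbo.Assets');"
```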

Re: Conversion failed when converting from a character string to uniqueidentifier

It created a bunch of files, named part-m-00001, part-m-00002, etc. What I really want is a single file, so I can easily create a table from it. Of course I can hack the thing into one file with the hadoop fs -cat command, but is there an easier way to import the data sqoop2 retrieved?
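For what it's worth, `hadoop fs -getmerge` does that concatenation in one step (the output directory below is a placeholder for your job's target directory):

```shell
# Pull all part files from HDFS and concatenate them into one local file.
# /user/hue/assets_json is a placeholder output directory.
hadoop fs -getmerge /user/hue/assets_json ./assets.json

# Equivalent by hand, as described above:
hadoop fs -cat /user/hue/assets_json/part-m-* > assets.json
```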

Re: Conversion failed when converting from a character string to uniqueidentifier

Glad that worked. I'm not really sure if Sqoop2 has a feature to do that. You might be able to reduce the number of mappers to 1 using throttling. When using Hue or the Sqoop2 client, you can set the number of extractors for the extraction phase and the number of loaders for the load phase. If a value is left blank or 0 is provided, the execution framework may ignore it. If you throttle the number of extractors to 1, then a single mapper performs the extraction. If you throttle the number of loaders to 1, at most one loader writes to HDFS.

If that doesn't work, your best bet is performing the concatenation yourself as you've described.
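As a sketch of that job configuration from the Sqoop2 shell (prompt wording varies by Sqoop2 version, and the job id is a placeholder):

```text
sqoop:000> update job --jid 1
...
Throttling resources
Extractors: 1
Loaders: 1
```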

Re: Conversion failed when converting from a character string to uniqueidentifier

Thanks Abe. I assume I'd have to do this from the sqoop2 command line? I don't see any mapper or loader options in Hue.

Another minor snag: The text files sqoop2 created contain the correct data, but strings are single-quoted, and when I import the file(s) the single quote becomes part of the data. How would I get around that?

Re: Conversion failed when converting from a character string to uniqueidentifier

Yup you're right - sorry, I was looking at the table creation pages, not the sqoop import pages.

The SQOOP-777 issue seems to reference column and record delimiters. Is my single-quote problem one of these? I would think not, since a single quote is being inserted at the beginning AND the end of the column value. It's behaving like it's enclosing the column value in single quotes, rather than acting as a field or record delimiter.
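If it really is just an enclosure, one workaround is a post-processing pass that strips a single leading and trailing quote per line (this assumes the JSON itself contains no single quotes at the line boundaries; the HDFS path is a placeholder):

```shell
# Strip one enclosing single quote from each end of every line.
# Safe only if the quotes appear purely as value enclosures.
hadoop fs -cat /user/hue/assets_json/part-m-* | \
  sed "s/^'//; s/'\$//" > assets.json
```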