JDBC Query Consumer - Mapping to Hive Metadata

I have 3 JDBC origins writing to the same SDC RPC destination. Two of them are JDBC Query Consumers and the third is a JDBC Multitable Consumer.

On the receiving side, I have an SDC RPC origin connected to Hive/HDFS and so on. I'm using `${record:attribute('jdbc.tables')}` as the table name in the Hive Metadata processor. It works fine for data coming from the Multitable Consumer origin, but it fails for data coming from the JDBC Query Consumer. What should I add to the JDBC Query Consumer so that the receiving side can resolve the table name from `${record:attribute('jdbc.tables')}`?

1 Answer

When using the query consumer, there is an option called Create JDBC Header Attributes, which can, in some cases, populate `jdbc.tables` similarly to the JDBC Multitable Consumer origin. There are various reasons why this might not work (e.g. your query doesn't read from a single table, or the driver doesn't support populating this metadata). You can run a preview with the Show Record/Field Header option checked to see whether the attribute is being populated.

If this isn't working as expected, and you know the table name that you want to appear in the attribute, you can add an Expression Evaluator processor after your origin to populate it. Specifically, use an entry in the Header Attribute Expressions section. You can reuse the `jdbc.tables` name and set it yourself, or choose a different attribute name. Either way, specify it under the Header Attribute column, and you can then use that attribute name in your destination.
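As a sketch, the Header Attribute Expressions entry in the Expression Evaluator might look like the following, where `my_source_table` is a placeholder for whatever table name your query actually reads from:

```
Header Attribute:            jdbc.tables
Header Attribute Expression: my_source_table
```

With this in place, the downstream Hive Metadata processor's existing table-name expression `${record:attribute('jdbc.tables')}` resolves for records from the query consumer pipeline as well, so nothing needs to change on the receiving side.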

Comments

I don't want to change the expression in the Hive Metadata processor. Can you tell me exactly what I have to do in the Expression Evaluator? And can I leave the JDBC header prefix as `jdbc.` and just tick the Create JDBC Header Attributes option?