This forum is now a read-only archive. All commenting, posting, and registration services have been turned off. Those needing community support and/or wanting to ask questions should refer to the Tag/Forum map, and to http://spring.io/questions for a curated list of Stack Overflow tags that Pivotal engineers, and the community, monitor.

HbaseConfigurationFactoryBean appears to be returning an instance of org.apache.hadoop.conf.Configuration instead of org.apache.hadoop.hbase.HBaseConfiguration. Looking at the source, this appears to be because both member variables are declared as the org.apache.hadoop.conf.Configuration type.

This can be reproduced quite easily by using both <hdp:configuration /> and <hdp:hbase-configuration /> and specifying a bean with an @Autowired org.apache.hadoop.conf.Configuration property: context startup will fail due to the ambiguous autowire candidacy.
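A minimal reproduction might look like the following sketch (the bean class and field are hypothetical; the default bean ids `hadoopConfiguration` and `hbaseConfiguration` are assumptions based on the namespace defaults):

```xml
<!-- Both elements publish a bean whose type is org.apache.hadoop.conf.Configuration,
     so a by-type @Autowired of that class now has two candidates. -->
<hdp:configuration />
<hdp:hbase-configuration />

<!-- Hypothetical bean containing:
         @Autowired
         private org.apache.hadoop.conf.Configuration configuration;
     Context startup fails with an ambiguous-candidate error. -->
<bean class="com.example.ConfigurationConsumer" />
```

Qualifying the injection point by bean name (e.g. `@Qualifier("hbaseConfiguration")` or `@Resource(name = "hbaseConfiguration")`) should disambiguate, assuming those are indeed the default ids.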

The namespace uses HBaseConfiguration.create() underneath (it also merges in the given Hadoop config, if any). Can you verify that the HBase configs are the same before submitting the job?
My guess is that on the reducer side, the wrong configuration is used (or something happens to it).
What does your mapper/reducer job definition look like?
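For reference, the merge the namespace performs is roughly equivalent to the following sketch (not runnable standalone; it needs the Hadoop and HBase jars on the classpath, and the property name shown is illustrative):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;

public class HBaseConfigMergeSketch {
    public static Configuration merged() {
        // Start from the existing Hadoop configuration, if one was defined...
        Configuration hadoopConfig = new Configuration();

        // ...then let HBase layer hbase-default.xml / hbase-site.xml on top.
        // create(Configuration) copies the given properties into a new
        // HBase-aware Configuration rather than mutating the original.
        return HBaseConfiguration.create(hadoopConfig);
    }
}
```

Note that the result is still typed as plain Configuration, which is consistent with the ambiguity described at the top of this thread.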

Note that both HBase and Hadoop share the same object type, Configuration. The XML wiring looks fine, but I'm curious whether context.getConfiguration() returns the proper object or not. You could double-check by looking at the number of beans that match the same type and verifying their names.
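One way to do that double-check is sketched below (assuming you have a handle on the ApplicationContext; requires spring-context and the Hadoop jars on the classpath, so it is not runnable standalone):

```java
import org.apache.hadoop.conf.Configuration;
import org.springframework.context.ApplicationContext;

public class ConfigurationDiagnostics {
    public static void dump(ApplicationContext ctx) {
        // List every bean assignable to Configuration; with both
        // <hdp:configuration/> and <hdp:hbase-configuration/> declared,
        // two names should show up here.
        String[] names = ctx.getBeanNamesForType(Configuration.class);
        for (String name : names) {
            Configuration conf = ctx.getBean(name, Configuration.class);
            System.out.println(name + " -> " + conf.getClass().getName()
                    + " (hbase.zookeeper.quorum=" + conf.get("hbase.zookeeper.quorum") + ")");
        }
    }
}
```

Comparing a known HBase-only property (such as `hbase.zookeeper.quorum` above) across the beans, and against what `context.getConfiguration()` reports inside the reducer, should show whether the right object made it to the job.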


I have an example that shipped in the release I am using; like your test, it uses the template, which is injected with the hbase-configuration and then injected into the test class. I haven't run it, but I assume it works (Spring being really good, et al.); I wonder, though, if it has something to do with how the jobs are being loaded...

I have checked that the correct Configuration is being used and that the properties I add in the hbase-configuration are available in the reducer.


Costin - I noticed that the HbaseTemplate isn't in the M1 release. If there is a fix for the scriptlet problem, I would like to move to the snapshot. Do you have any insight into why it is erroring out?

It looks like your script was declared in a context without any Hadoop configuration, which was then wired to create other Hadoop components, resulting in exceptions. I've pushed a fix that logs a warning and does not bind the variables in this scenario.
It will be available in the next nightly build.