How to inject local classpath of 3rd-party libs to Oozie Spark action?

We have a Spark app that uses Cloudera Labs (CLABS) Phoenix to access HBase tables. It works from the command line, and I am trying to set it up as an Oozie action. However, I'm having trouble getting the library class paths into Oozie via the Hue 3.9 GUI (CDH 5.7).

The previous related questions I can find (such as this), as well as this blog post, all suggest making physical copies of the library jars and putting them in HDFS, either in (1) the workflow's lib/ directory or (2) the Oozie sharelib directory. However, the Phoenix package has 70+ files (~210 MB) and is already installed across the entire cluster. It seems inefficient and wasteful to upload all of that into HDFS and ship it around the network unnecessarily.
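For reference, the copy-everything approach those posts describe would look roughly like this (all paths are illustrative; the parcel location shown is where a CLABS Phoenix parcel would typically land, and the HDFS workflow path is hypothetical):

```shell
# Copy every Phoenix jar from the local parcel into the workflow's
# lib/ directory in HDFS, so Oozie ships them with the action.
hdfs dfs -mkdir -p /user/me/workflows/phoenix-app/lib
hdfs dfs -put /opt/cloudera/parcels/CLABS_PHOENIX/lib/phoenix/lib/*.jar \
    /user/me/workflows/phoenix-app/lib/
```

This is exactly the duplication I'd like to avoid, since the same jars are already present on every node.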

With spark-submit, we can pass the paths using "spark.driver.extraClassPath" and "spark.executor.extraClassPath". However, according to OOZIE-2277, that is not possible with Oozie: setting them in <action><spark><configuration><property> just gets ignored.
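To be concrete, this is the shape of the invocation that works from the command line (the application class, jar name, and parcel path are placeholders for illustration, not our exact values):

```shell
# Works when launched directly with spark-submit: both driver and
# executors pick up the locally installed Phoenix jars.
spark-submit \
  --class com.example.PhoenixApp \
  --conf spark.driver.extraClassPath='/opt/cloudera/parcels/CLABS_PHOENIX/lib/phoenix/lib/*' \
  --conf spark.executor.extraClassPath='/opt/cloudera/parcels/CLABS_PHOENIX/lib/phoenix/lib/*' \
  our-app.jar
```

It is these same two properties that appear to be silently dropped when supplied through the Oozie action's configuration block.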

The same log file shows that "spark.driver.extraClassPath" and "spark.executor.extraClassPath" are being populated, apparently from the Oozie sharelib contents. Is there a way to append to them, through an environment variable or otherwise?

I'd really prefer not to mess with the Oozie sharelib, since it is effectively part of the CDH installation. The blog post doesn't explain how users should append third-party content to it, and Phoenix is only used by a subset of our workflows anyway.

Could the problem be with the Hue-Oozie integration? This is a very confusing area, so I'd appreciate any tips!

Update: This looks like OOZIE-2389. Using the workaround suggested there, I was able to launch the Spark task, but org.apache.spark.deploy.SparkSubmit.main() failed immediately with no further information.

I used phoenix-4.7.0-clabs-phoenix1.3.0-client.jar, not the *-thin-client.jar, which doesn't contain the org.apache.phoenix.spark classes. Does the client jar have any dependent jars that need to be copied along with it, or any version conflict with CDH 5.7.1?
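One quick way to confirm which jar actually bundles the Spark integration is to list each jar's contents (jar file names as above; run from wherever the jars are installed):

```shell
# List entries under org/apache/phoenix/spark in each client jar;
# the thin client is expected to have none.
unzip -l phoenix-4.7.0-clabs-phoenix1.3.0-client.jar | grep 'org/apache/phoenix/spark'
unzip -l phoenix-4.7.0-clabs-phoenix1.3.0-thin-client.jar | grep 'org/apache/phoenix/spark' \
    || echo "phoenix-spark classes not present in thin client"
```

The full client jar is built as a fat jar, so if the classes are listed there, it should not need additional dependency jars alongside it; a version conflict with the CDH 5.7.1 Spark/HBase jars remains a separate possibility.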