Problem with the HBase classpath within a Spark job (Cloudera Quickstart)

I have a simple Spark job (written in Java) that I'm executing on a Cloudera Quickstart platform (learning phase), and I want to create an HBase table.

However, the HBase/ZooKeeper class is not found:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/ZooKeeperConnectionException
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:270)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:319)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.ZooKeeperConnectionException
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)

I tried setting HBASE_CLASSPATH in /usr/lib/hbase/conf/hbase-env.sh to /usr/lib/hbase/, but it didn't work.

That doesn't solve the problem. My HBase configuration is in a directory other than /etc/hbase/conf, so, as you suggested, I appended the HBase configuration path to SPARK_CLASSPATH. I did the same in my environment, but it didn't work.
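For reference, this is the kind of invocation that usually resolves the NoClassDefFoundError on Spark 1.x / the Quickstart VM. It is only a sketch under assumptions: it assumes the `hbase` launcher script is on the PATH (so `hbase classpath` can print the full jar list), and `com.example.MyHBaseJob` / `myjob.jar` are placeholder names for your own main class and artifact:

```shell
# Let HBase itself report every jar it needs, and hand that list to Spark.
# (Assumes the `hbase` CLI is on the PATH, as on a stock Quickstart VM.)
export SPARK_CLASSPATH="$(hbase classpath)"

# Alternatively, pass the jars explicitly at submit time.
# com.example.MyHBaseJob and myjob.jar are hypothetical placeholders.
spark-submit \
  --driver-class-path "$(hbase classpath)" \
  --class com.example.MyHBaseJob \
  myjob.jar
```

Note that the classpath must include the HBase *conf* directory as well as the jars, otherwise the client cannot find hbase-site.xml and falls back to default ZooKeeper settings.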

What I'm trying here: I have Spark 2.3 R3 installed on the Cloudera platform and am trying to connect to HBase, which is running on a different cluster. It throws the exception "hbase package is not a member of org.apache.hadoop package" while running Spark 2.3.

Can you please suggest how to pass the HBase configuration/libraries to Spark 2.3?
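In case it helps frame the question, here is a hedged sketch of how this is commonly done on Spark 2.x, where the old SPARK_CLASSPATH variable is deprecated in favor of the `extraClassPath` properties. The path to the remote cluster's hbase-site.xml and the class/jar names are placeholders, not real values from my setup:

```shell
# Sketch for Spark 2.3: put the HBase jars on both driver and executor
# classpaths, and ship the remote cluster's hbase-site.xml with the job.
# /path/to/remote-cluster/hbase-site.xml, com.example.MyHBaseJob and
# myjob.jar are hypothetical placeholders.
spark-submit \
  --conf spark.driver.extraClassPath="$(hbase classpath)" \
  --conf spark.executor.extraClassPath="$(hbase classpath)" \
  --files /path/to/remote-cluster/hbase-site.xml \
  --class com.example.MyHBaseJob \
  myjob.jar
```

The "hbase package is not a member of org.apache.hadoop package" error happens at compile time, so it also suggests the hbase-client dependency is missing from the build (e.g. the Maven POM), not just from the runtime classpath.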