Where do I update HADOOP_USER_CLASSPATH_FIRST=true and HADOOP_CLASSPATH in Cloudera Manager so that the change has cluster-wide effect?

I manually changed the setting in /etc/hadoop/conf/hadoop-env.sh, but I don't see any effect. Basically, I want to prepend to the classpath when starting HiveServer2 and the metastore. This works fine on HDP when these parameters are set in hadoop-env.sh via Ambari.
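On a Cloudera Manager-managed cluster, /etc/hadoop/conf is regenerated from CM's configuration, so manual edits tend to be overwritten; the usual place for this kind of change is an "Advanced Configuration Snippet (Safety Valve)" for hadoop-env.sh on the relevant service (the exact field name varies by CM version). A minimal sketch of what the snippet body might contain, assuming a hypothetical jar path:

```shell
# Sketch of a hadoop-env.sh safety-valve snippet.
# /opt/myapp/lib/my-extra.jar is a placeholder path, not a real artifact.

# Prepend the custom jar to the Hadoop classpath:
export HADOOP_CLASSPATH="/opt/myapp/lib/my-extra.jar:${HADOOP_CLASSPATH}"

# Make user-supplied classpath entries win over the bundled Hadoop jars:
export HADOOP_USER_CLASSPATH_FIRST=true

# Sanity check (visible in the process environment / logs):
echo "HADOOP_USER_CLASSPATH_FIRST=${HADOOP_USER_CLASSPATH_FIRST}"
```

After saving a safety-valve change in CM, the affected services need a restart (and client configurations a redeploy) before the new environment takes effect.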

P.s. It is typically a bad idea to override system jar versions, as the software has been compiled against the version it shipped with. Any incompatible changes to methods or availability of classes within the changed version you're forcing the classloader to pick would cause runtime failures (NoSuchMethodError, NoClassDefFoundError, ClassNotFoundException, etc.).

P.p.s. If overriding some pre-included library is the prime purpose, try to use Maven shading with namespace relocation: https://maven.apache.org/plugins/maven-shade-plugin/examples/class-relocation.html
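A sketch of what the shading approach from that link might look like in the fat jar's pom.xml; the Jersey package name, the relocated prefix, and the plugin version are illustrative assumptions, not taken from the thread:

```xml
<!-- maven-shade-plugin sketch: relocate the bundled Jersey packages inside
     the fat jar so they cannot collide with the cluster's older Jersey.
     Package names and the plugin version here are placeholders. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.4.1</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>com.sun.jersey</pattern>
            <shadedPattern>shaded.com.sun.jersey</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

With relocation, the application's bytecode references the shaded package, so whatever Jersey version sits on the cluster classpath no longer matters to the fat jar.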

I tried using the setting below, but it is not working. The issue is that I want to use a version of Jersey that I bundled into my fat jar; however, the gateway node has an older version of that jar, and a class gets loaded from there, resulting in a NoSuchMethodException. My application is not a MapReduce job; I run it with `hadoop jar` on 5.14.4.
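For a `hadoop jar` launch from a gateway node, the classpath-first behavior can also be set just for that invocation rather than cluster-wide. A minimal sketch, assuming a hypothetical fat jar path and main class:

```shell
# Sketch: force classes from the fat jar to win over the gateway node's
# bundled jars for a single `hadoop jar` invocation.
# /opt/myapp/myapp-fat.jar and com.example.Main are placeholders.
export HADOOP_USER_CLASSPATH_FIRST=true
export HADOOP_CLASSPATH="/opt/myapp/myapp-fat.jar"
hadoop jar /opt/myapp/myapp-fat.jar com.example.Main
```

If the error persists after this, it usually means the conflicting class is still resolved from the cluster's jars first, which is when the shading-with-relocation approach mentioned above becomes the more robust fix.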