
Even if this doesn't exactly work, it's still helpful. So will this impact performance, at all?
– WattsInABox Jul 22 '14 at 18:15

I am using the same Hadoop 2.5.0 tar on CentOS 7 and CentOS 6.5. Both are 64-bit OSes. There is no such warning on CentOS 7, but CentOS 6.5 gives me this warning. Why?
– sandip divekar Jan 8 '15 at 5:40

Thanks. I did not realize that it is only a warning. It actually says "starting namenode", and the last sentence is "Unable to load native-hadoop ..", which caused some alarm.
– Kaushik Lele Aug 5 '15 at 12:25

Note that you actually don't have to compile the whole of Hadoop, as the instructions suggest; building hadoop-common-project/hadoop-common and hadoop-hdfs-project/hadoop-hdfs is enough.
– Greg Dubicki Sep 20 '15 at 12:43

I think the two solutions are the same. According to the docs, java.library.path is a list of paths to search when loading libraries. So you can either export LD_LIBRARY_PATH or use the -D option on the java command line; -D<property>=value lets us set a system property value.
– Hoai-Thu Vuong Aug 8 at 2:37
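To illustrate the equivalence described in that comment, here is a minimal sketch of the two approaches side by side (the /opt/hadoop path is an assumption; adjust it to your own install):

```shell
# Two equivalent ways to point the JVM at the native Hadoop libraries.

# Option 1: via the dynamic linker / JVM default search path
export LD_LIBRARY_PATH=/opt/hadoop/lib/native:$LD_LIBRARY_PATH

# Option 2: via a JVM system property passed through HADOOP_OPTS
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/opt/hadoop/lib/native"
```

Either way, the JVM ends up searching the same directory when it tries to load libhadoop.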

Unfortunately, I stupidly overlooked the answer staring me right in the face, because I was focused on "Is this library 32 or 64 bit?":

`GLIBC_2.14' not found (required by ./libhadoop.so.1.0.0)

So, lesson learned. Anyway, the rest at least led me to being able to suppress the warning. I continued and did everything recommended in the other answers to provide the library path via the HADOOP_OPTS environment variable, to no avail. So I looked at the source code. The module that generates the error (util.NativeCodeLoader) gives you a hint:

Ah, there is some debug-level logging; let's turn that on and see if we get some additional help. This is done by adding the following line to the $HADOOP_CONF_DIR/log4j.properties file:

log4j.logger.org.apache.hadoop.util.NativeCodeLoader=DEBUG

Then I ran a command that generates the original warning, like stop-dfs.sh, and got this goodie:

15/06/18 19:05:19 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: /opt/hadoop/lib/native/libhadoop.so.1.0.0: /lib64/libc.so.6: version `GLIBC_2.14' not found (required by /opt/hadoop/lib/native/libhadoop.so.1.0.0)

And the answer is revealed in this snippet of the debug message (the same thing the previous ldd command 'tried' to tell me):

`GLIBC_2.14' not found (required by /opt/hadoop/lib/native/libhadoop.so.1.0.0)

What version of GLIBC do I have? Here's a simple trick to find out:

[hadoop@VMWHADTEST01 hadoop]$ ldd --version
ldd (GNU libc) 2.12

So, I can't update my OS's glibc to 2.14. The only options are to build the native libraries from source on my OS, or to suppress the warning and just ignore it for now. I opted to suppress the annoying warning for now (but do plan to build from source in the future), by using the same logging option we used to get the debug message, except now set to ERROR level.

log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR

I hope this helps others see that a big benefit of open source software is that you can figure this stuff out if you take some simple logical steps.

In my case, after I built Hadoop on my 64-bit Linux Mint OS, I replaced the native library in hadoop/lib. The problem still persisted. Then I figured out that Hadoop was pointing to hadoop/lib, not to hadoop/lib/native. So I just moved all the content from the native library up to its parent, and the warning was gone.
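A minimal sketch of that move, using a throwaway directory in place of a real Hadoop install (the directory and file names here are stand-ins, not real paths):

```shell
# HDEMO stands in for $HADOOP_HOME; build a fake lib/native layout.
HDEMO=$(mktemp -d)
mkdir -p "$HDEMO/lib/native"
touch "$HDEMO/lib/native/libhadoop.so.1.0.0"

# The fix: copy everything from lib/native up into lib/, which is
# where Hadoop was actually looking for the library.
cp "$HDEMO"/lib/native/* "$HDEMO/lib/"
ls "$HDEMO/lib"
```

On a real install you would substitute your actual Hadoop directory for $HDEMO.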

I had tried everything on the net. I got tired and just emptied all the files in the lib folder itself, i.e. the ones compiled using the links provided in the above answer. Finally, I don't know why, but despite the downvotes you've got, I tried your suggestion, and it worked after the tremendous day-long struggle I put up behind all this. It didn't matter whether I changed the native library location in .bashrc or hadoop-env.sh. Thanks a tonne.
– Akshay Hazari Jun 25 '14 at 12:39

I got tired and just emptied all the native-folder files in the lib folder itself, i.e. the ones compiled using the links provided in the above answer (the native folder in the new hadoop-2.4.0-src.tar.gz).
– Akshay Hazari Jun 25 '14 at 12:47

@zhutoulala -- FWIW your links worked for me with Hadoop 2.4.0, with one exception: I had to tell Maven not to build the javadocs. I also used the patch in the first link for 2.4.0 and it worked fine. Here's the Maven command I had to issue:

mvn package -Dmaven.javadoc.skip=true -Pdist,native -DskipTests -Dtar

After building this and moving the libraries, don't forget to update hadoop-env.sh :)
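For reference, the hadoop-env.sh update usually amounts to a couple of lines like these (a sketch; the variable names are the conventional ones, but the exact path depends on where your install lives):

```shell
# Point Hadoop at the freshly built native libraries
# (assumes HADOOP_HOME is already set for your install):
export HADOOP_COMMON_LIB_NATIVE_DIR="$HADOOP_HOME/lib/native"
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"
```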

Thought this might help someone who ran into the same roadblocks as me.

I am using hadoop-2.6.0 on my local system. I was also facing the same issue. I downloaded the hadoop-2.7.1 source, built the binary and native libraries, and replaced hadoop-2.6.0's native libraries with the newly built ones. But I was still getting the same errors. Then I exported JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native:$JAVA_LIBRARY_PATH and it worked for me.
– ParagFlume Feb 2 '16 at 17:16

In addition to @zhutoulala's accepted answer, here is an update to make it work with the latest stable version to date (2.8) on ARMHF platforms (Raspberry Pi 3 Model B).
First, I can confirm that you must recompile the native libraries for 64-bit ARM; the other answers here based on setting environment variables won't work. As indicated in the Hadoop documentation, the pre-built native libraries are 32-bit.

The CMake file patching method must be changed, and the files to patch are not the same. Unfortunately, there is no accepted patch on JIRA specific to 2.8. At this URL (https://issues.apache.org/jira/browse/HADOOP-9320) you must copy and paste Andreas Muttscheller's proposed patch onto your namenode:
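With the patch applied, the native rebuild itself follows the same Maven invocation shown earlier in this thread (a sketch only; it assumes a JDK, Maven, CMake, and protobuf are already installed on the Pi, and the source/target paths are illustrative):

```shell
# Build the distribution with native libraries, skipping tests and
# javadocs (same flags as the 2.4.0 command above):
cd hadoop-2.8.0-src
mvn package -Pdist,native -DskipTests -Dtar -Dmaven.javadoc.skip=true

# Then overwrite the shipped 32-bit libraries with the new ones:
cp hadoop-dist/target/hadoop-2.8.0/lib/native/* "$HADOOP_HOME/lib/native/"
```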

For installing Hadoop, it is so much easier to install the free version from Cloudera. It comes with a nice GUI that makes it simple to add nodes; there is no compiling or messing around with dependencies, and it comes with tools like Hive, Pig, etc.

Steps are:
1) Download
2) Run it
3) Go to the web GUI (1.2.3.4:7180)
4) Add extra nodes in the web GUI (do NOT install the Cloudera software on the other nodes; it does it all for you)
5) Within the web GUI go to Home, click Hue and then Hue Web UI. This gives you access to Hive, Pig, Sqoop, etc.
