Hadoop "Unable to load native-hadoop library for your platform" warning

I'm currently configuring Hadoop on a server running CentOS. When I run start-dfs.sh or stop-dfs.sh, I get the following error:

WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

I'm running Hadoop 2.2.0.

Doing a search online brought up this link: http://balanceandbreath.blogspot.ca/2013/01/utilnativecodeloader-unable-to-load.html

However, the contents of the /native/ directory in Hadoop 2.x appear to be different, so I am not sure what to do.

I've also added these two environment variables in hadoop-env.sh:

export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/usr/local/hadoop/lib/"

export HADOOP_COMMON_LIB_NATIVE_DIR="/usr/local/hadoop/lib/native/"

Any ideas?

Spitter answered 13/11, 2013 at 1:53 Comment(2)
For searchability: this problem also applies at least to Hadoop 2.4.0, Hadoop 2.4.1 and probably other versions.Pyxidium
Documentation for how to use native libraries is at hadoop.apache.org/docs/current/hadoop-project-dist/…Bradeord

I assume you're running Hadoop on 64-bit CentOS. The reason you see that warning is that the native Hadoop library $HADOOP_HOME/lib/native/libhadoop.so.1.0.0 was actually compiled for 32-bit.

Anyway, it's just a warning, and won't impact Hadoop's functionality.

If you do want to eliminate this warning, download the Hadoop source code and recompile libhadoop.so.1.0.0 on a 64-bit system, then replace the 32-bit one.

Steps on how to recompile the source code are included here for Ubuntu:
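Since that link may no longer be available, here is a rough sketch of the general idea (the path below is illustrative - adjust it to your install): check the shipped library's architecture with file, and if it is 32-bit, rebuild the native code from an unpacked Hadoop source tree with Maven. This assumes a JDK, Maven, cmake and protobuf 2.5 are installed, as the build requires them:

# check whether the bundled native library is 32- or 64-bit
file $HADOOP_HOME/lib/native/libhadoop.so.1.0.0

# from the root of the unpacked Hadoop source tree, build the native bits
mvn package -Pdist,native -DskipTests -Dtar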

Maurya answered 15/11, 2013 at 4:12 Comment(8)
Doesn't work for me. Gives me the same Unable to load native-hadoop library for your platform error.Teaching
Even if this doesn't exactly work, it's still helpful. So will this impact performance, at all?Lubricate
I am using the same Hadoop 2.5.0 tar on CentOS 7 and CentOS 6.5. Both are 64-bit OSes. There is no such warning on CentOS 7, but CentOS 6.5 gives me this warning. Why?Oblate
Thanks. I did not realize that it is a warning. It actually says "starting namenode" and the last sentence is "Unable to load native-hadoop ..", which caused fear.Getz
Note that you actually don't have to compile the whole of Hadoop, as the instructions suggest - hadoop-common-project/hadoop-common and hadoop-hdfs-project/hadoop-hdfs are enough.Pyxidium
For me it was just a glitch in the terminal when it resumed the session after a Mac restart. I quit the terminal app and restarted it; the error was gone and I was able to start spark-shell.Morrow
The link is brokenCyclamate
The file might be 64bit, which is the standard now. To find out: #19445404Cyclamate

Just append the word native to your HADOOP_OPTS like this:

export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"

PS: Thanks to Searene
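To check whether the native library is actually being picked up after this change, recent Hadoop 2.x releases ship a checknative utility (the output shape below is illustrative, not verbatim):

hadoop checknative -a

# it reports each native component, roughly like:
# hadoop:  true /usr/local/hadoop/lib/native/libhadoop.so.1.0.0
# zlib:    true /lib64/libz.so.1
# snappy:  false
# openssl: false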

Crustacean answered 24/7, 2014 at 7:10 Comment(4)
This did it for me also. On Ubuntu with Hadoop 2.6, the path was /home/user/hadoop-2.6.0/lib/nativeRectifier
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/native"Voracious
I think the two solutions are the same. According to the docs, java.library.path is a list of paths to search when loading the libraries, so you can either export LD_LIBRARY_PATH or use the -D option on the java command line; -D<property>=value lets us set a system property value.Crustacean
This is the correct solution for me. It fixed the warning.Tomikotomkiel

The answer depends... I just installed Hadoop 2.6 from the tarball on 64-bit CentOS 6.6. The Hadoop install did indeed come with a prebuilt 64-bit native library. For my install, it is here:

/opt/hadoop/lib/native/libhadoop.so.1.0.0

And I know it is 64-bit:

[hadoop@VMWHADTEST01 native]$ ldd libhadoop.so.1.0.0
./libhadoop.so.1.0.0: /lib64/libc.so.6: version `GLIBC_2.14' not found (required by ./libhadoop.so.1.0.0)
linux-vdso.so.1 =>  (0x00007fff43510000)
libdl.so.2 => /lib64/libdl.so.2 (0x00007f9be553a000)
libc.so.6 => /lib64/libc.so.6 (0x00007f9be51a5000)
/lib64/ld-linux-x86-64.so.2 (0x00007f9be5966000)

Unfortunately, I stupidly overlooked the answer right there staring me in the face as I was focused on, "Is this library 32 or 64 bit?":

`GLIBC_2.14' not found (required by ./libhadoop.so.1.0.0)

So, lesson learned. Anyway, the rest at least led me to being able to suppress the warning. So I continued and did everything recommended in the other answers to provide the library path using the HADOOP_OPTS environment variable, to no avail. So I looked at the source code. The module that generates the error gives you the hint (util.NativeCodeLoader):

15/06/18 18:59:23 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

So, off to here to see what it does:

http://grepcode.com/file/repo1.maven.org/maven2/com.ning/metrics.action/0.2.6/org/apache/hadoop/util/NativeCodeLoader.java/

Ah, there is some debug-level logging - let's turn that on and see if we get some additional help. This is done by adding the following line to the $HADOOP_CONF_DIR/log4j.properties file:

log4j.logger.org.apache.hadoop.util.NativeCodeLoader=DEBUG

Then I ran a command that generates the original warning, like stop-dfs.sh, and got this goodie:

15/06/18 19:05:19 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: /opt/hadoop/lib/native/libhadoop.so.1.0.0: /lib64/libc.so.6: version `GLIBC_2.14' not found (required by /opt/hadoop/lib/native/libhadoop.so.1.0.0)

And the answer is revealed in this snippet of the debug message (the same thing that the previous ldd command 'tried' to tell me):

`GLIBC_2.14' not found (required by /opt/hadoop/lib/native/libhadoop.so.1.0.0)

What version of GLIBC do I have? Here's a simple trick to find out:

[hadoop@VMWHADTEST01 hadoop]$ ldd --version
ldd (GNU libc) 2.12
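As an aside, a quick way to list every GLIBC symbol version the library actually requires (assuming binutils is installed; the library path is from my install) is:

objdump -T /opt/hadoop/lib/native/libhadoop.so.1.0.0 | grep -o 'GLIBC_[0-9.]*' | sort -u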

So, I can't update my OS's glibc to 2.14. The only solution is to build the native libraries from source on my OS, or suppress the warning and just ignore it for now. I opted to just suppress the annoying warning for now (but do plan to build from source in the future) by using the same logging option we used to get the debug message, except now at ERROR level.

log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR

I hope this helps others see that a big benefit of open source software is that you can figure this stuff out if you take some simple logical steps.

Hepatica answered 18/6, 2015 at 23:45 Comment(1)
Thank you sir for this beautifully detailed answer. I got my answer and learned something valuable (a few somethings) in the process.Inculpate

I had the same issue. I solved it by adding the following lines to .bashrc:

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
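If the warning persists with just $HADOOP_HOME/lib on java.library.path, note the first comment below: some installs need the /native subdirectory appended, i.e.:

export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/native"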
Cromlech answered 20/7, 2014 at 19:57 Comment(2)
I had to add "/native" to HADOOP_OPTS valueHomocercal
Doesn't work for me. I added /native to HADOOP_OPTS in .zshrc and sourced it; no dice.Pliocene

In my case, after I built Hadoop on my 64-bit Linux Mint OS, I replaced the native library in hadoop/lib. The problem still persisted. Then I figured out that Hadoop was looking at hadoop/lib, not hadoop/lib/native. So I just moved all the content from the native library to its parent, and the warning was gone.

Beltz answered 11/6, 2014 at 6:13 Comment(2)
I just happened to have tried everything on the net. I got tired and just emptied all the files in the lib folder itself, i.e. the ones compiled using the links provided in the above answer. Finally, despite the downvotes you've got, I tried your suggestion and it worked, after a tremendous struggle I put up for a day behind all this. It didn't matter whether I changed the native library location in .bashrc or hadoop-env.sh. Thanks a tonne.Teaching
I got tired and just emptied all the native folder files in the lib folder itself, i.e. the ones compiled using the links provided in the above answer (the native folder in the new hadoop-2.4.0-src.tar.gz).Teaching

This would also work:

export LD_LIBRARY_PATH=/usr/lib/hadoop/lib/native
Gland answered 28/11, 2014 at 14:54 Comment(2)
Thanks. If you override LD_LIBRARY_PATH in order to use tomcat apr, just append the hadoop native path as `export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/lib/hadoop/lib/native`.Bulbiferous
This is the only solution that works for me (tried all the other answers).Lissettelissi

After continued research, as suggested by Koti, I resolved the issue.

hduser@ubuntu:~$ cd /usr/local/hadoop
hduser@ubuntu:/usr/local/hadoop$ ls
bin  include  libexec      logs        README.txt  share
etc  lib      LICENSE.txt  NOTICE.txt  sbin
hduser@ubuntu:/usr/local/hadoop$ cd lib
hduser@ubuntu:/usr/local/hadoop/lib$ ls
native
hduser@ubuntu:/usr/local/hadoop/lib$ cd native/
hduser@ubuntu:/usr/local/hadoop/lib/native$ ls
libhadoop.a       libhadoop.so        libhadooputils.a  libhdfs.so
libhadooppipes.a  libhadoop.so.1.0.0  libhdfs.a         libhdfs.so.0.0.0
hduser@ubuntu:/usr/local/hadoop/lib/native$ sudo mv * ../

Cheers

Detonate answered 28/12, 2015 at 19:52 Comment(0)
export JAVA_HOME=/home/hadoop/software/java/jdk1.7.0_80
export HADOOP_HOME=/usr/local/hadoop
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_COMMON_LIB_NATIVE_DIR"
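A hedged note: if these exports go into ~/.bashrc rather than hadoop-env.sh, reload them into the current shell before retrying:

source ~/.bashrc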
Polytypic answered 28/9, 2016 at 5:30 Comment(0)

For those on OS X with Hadoop installed via Homebrew, follow these steps, replacing the path and Hadoop version where appropriate:

wget http://www.eu.apache.org/dist/hadoop/common/hadoop-2.7.1/hadoop-2.7.1-src.tar.gz
tar xvf hadoop-2.7.1-src.tar.gz
cd hadoop-2.7.1-src
mvn package -Pdist,native -DskipTests -Dtar
mv lib /usr/local/Cellar/hadoop/2.7.1/

Then update hadoop-env.sh with:

export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true -Djava.security.krb5.realm= -Djava.security.krb5.kdc= -Djava.library.path=/usr/local/Cellar/hadoop/2.7.1/lib/native"
Henslowe answered 9/12, 2015 at 11:54 Comment(2)
Thanks Philip. This solution worked perfectly. In my case, all I needed was the option -Djava.library.path. That was exactly what I was looking for. Thanks!!!Tacy
Thanks a lot. I have bzip2: false, openssl: false - the build does not support openssl. The others have paths showing up. Any suggestions?Spearing

@zhutoulala - FWIW your links worked for me with Hadoop 2.4.0, with one exception: I had to tell Maven not to build the javadocs. I also used the patch in the first link for 2.4.0 and it worked fine. Here's the Maven command I had to issue:

mvn package -Dmaven.javadoc.skip=true -Pdist,native -DskipTests -Dtar

After building this and moving the libraries, don't forget to update hadoop-env.sh :)

Thought this might help someone who ran into the same roadblocks as me.

Mcglynn answered 26/4, 2014 at 5:47 Comment(0)

Move your compiled native library files to the $HADOOP_HOME/lib folder.

Then set your environment variables by editing the .bashrc file:

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib  
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib"

Make sure your compiled native library files are in the $HADOOP_HOME/lib folder. It should work.

Tasteful answered 12/10, 2014 at 18:56 Comment(0)
export HADOOP_HOME=/home/hadoop/hadoop-2.4.1  
export PATH=$HADOOP_HOME/bin:$PATH  
export HADOOP_PREFIX=$HADOOP_HOME  
export HADOOP_COMMON_HOME=$HADOOP_PREFIX  
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_PREFIX/lib/native  
export HADOOP_CONF_DIR=$HADOOP_PREFIX/etc/hadoop  
export HADOOP_HDFS_HOME=$HADOOP_PREFIX  
export HADOOP_MAPRED_HOME=$HADOOP_PREFIX  
export HADOOP_YARN_HOME=$HADOOP_PREFIX  
export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native:$JAVA_LIBRARY_PATH
Impudent answered 15/8, 2014 at 1:10 Comment(1)
Yes, you should recompile the 64-bit lib/native from the Hadoop source.Impudent

This line right here:

export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native:$JAVA_LIBRARY_PATH

From KunBetter's answer, worked for me. Just append it to the .bashrc file and reload the .bashrc contents:

$ source ~/.bashrc
Urbanity answered 27/9, 2015 at 13:21 Comment(1)
I am using hadoop-2.6.0 in my local system. I was also facing the same issue. Then I downloaded the hadoop-2.7.1 source, built the binary and native libraries, and replaced the hadoop-2.6.0 native libraries with the newly built ones. But I was still getting the same errors. Then I exported JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native:$JAVA_LIBRARY_PATH and it worked for me.Triboelectricity

This line right here:

export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native:$JAVA_LIBRARY_PATH

From KunBetter's answer - that's where the money is.

Jem answered 7/4, 2015 at 23:31 Comment(1)
In my case I needed both: export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native:$JAVA_LIBRARY_PATH and export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native:$LD_LIBRARY_PATHDiestock

In addition to @zhutoulala's accepted answer, here is an update to make it work with the latest stable version to date (2.8) on ARMHF platforms (Raspberry Pi 3 Model B). First, I can confirm that you must recompile the native libraries for ARM; the other answers here, based on setting environment variables, won't work. As indicated in the Hadoop documentation, the pre-built native libraries are 32-bit.

The high-level steps given in the first link (http://www.ercoppa.org/posts/how-to-compile-apache-hadoop-on-ubuntu-linux.html) are correct. At http://www.instructables.com/id/Native-Hadoop-260-Build-on-Pi/ you will find more details specific to the Raspberry Pi, but not for Hadoop version 2.8.

Here are my instructions for Hadoop 2.8:

  • there is still no protobuf package on the latest Raspbian, so you must compile it yourself, and the version must be exactly protobuf 2.5 (https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz)
  • the CMake file patching method must be changed. Moreover, the files to patch are not the same. Unfortunately, there is no accepted patch on JIRA specific to 2.8. At this URL (https://issues.apache.org/jira/browse/HADOOP-9320) you must copy and paste the patch proposed by Andreas Muttscheller on your namenode:

    :hadoop-2.8.0-src/hadoop-common-project/hadoop-common $ touch HADOOP-9320-v2.8.patch
    :hadoop-2.8.0-src/hadoop-common-project/hadoop-common $ vim HADOOP-9320-v2.8.patch
    #copy and paste proposed patch given here : https://issues.apache.org/jira/browse/HADOOP-9320?focusedCommentId=16018862&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-16018862
    :hadoop-2.8.0-src/hadoop-common-project/hadoop-common $ patch < HADOOP-9320-v2.8.patch
    patching file HadoopCommon.cmake
    patching file HadoopJNI.cmake
    :hadoop-2.8.0-src/hadoop-common-project/hadoop-common $ cd ../..
    :hadoop-2.8.0-src $ sudo mvn package -Pdist,native -DskipTests -Dtar
    

Once the build is successful:

    :hadoop-2.8.0-src/hadoop-dist/target/hadoop-2.8.0/lib/native $ tar -cvf nativelibs.tar *

Then replace the content of the lib/native directory of your Hadoop install with the content of this archive. The warning message when running Hadoop should disappear.
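For that last step, a hedged sketch of unpacking the archive over the install's native directory (paths are illustrative, /path/to is a placeholder):

cd $HADOOP_HOME/lib/native
tar -xvf /path/to/nativelibs.tar    # overwrite with the freshly built ARM libraries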

Mitre answered 10/7, 2017 at 9:18 Comment(0)

This answer is a mix of @chromeeagle's analysis and this link (Nan-Xiao).

For those for whom the other solutions simply won't work, please follow these steps:

  1. Edit the file $HADOOP_HOME/etc/hadoop/log4j.properties (credits to @chromeeagle). Add the line at the end:

    log4j.logger.org.apache.hadoop.util.NativeCodeLoader=DEBUG

  2. Launch your spark/pyspark shell. You will see additional log information regarding the native library not loading. In my case I had the following error:

    Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path

  3. To fix this specific problem, add the Hadoop native library path to the LD_LIBRARY_PATH environment variable in your user's profile:

    export LD_LIBRARY_PATH="$HADOOP_HOME/lib/native:$LD_LIBRARY_PATH"

Hope this helps. I had this issue in a couple of Hadoop installations; it worked on both.

Snowstorm answered 7/7, 2021 at 17:2 Comment(0)

I had the same problem with JDK 6. I changed the JDK to JDK 8 and the problem was solved. Try using JDK 8!

Expiratory answered 17/6, 2016 at 3:18 Comment(0)

I'm not using CentOS. Here is what I have in Ubuntu 16.04.2, hadoop-2.7.3, jdk1.8.0_121. start-dfs.sh and stop-dfs.sh run successfully without the error:

# JAVA env
#
export JAVA_HOME=/j01/sys/jdk
export JRE_HOME=/j01/sys/jdk/jre

export PATH=${JAVA_HOME}/bin:${JRE_HOME}/bin:${PATH}:.

# HADOOP env
#
export HADOOP_HOME=/j01/srv/hadoop
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME

export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin

Replace /j01/sys/jdk and /j01/srv/hadoop with your installation paths.

I also did the following one-time setup on Ubuntu, which eliminates the need to enter passwords multiple times when running start-dfs.sh:

sudo apt install openssh-server openssh-client
ssh-keygen -t rsa
ssh-copy-id user@localhost

Replace user with your username.
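Before re-running start-dfs.sh, a quick sanity check that the passwordless login works:

ssh localhost    # should log in without prompting for a password
exit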

Resistive answered 24/3, 2017 at 18:1 Comment(0)

Basically, it is not an error, it's a warning in the Hadoop cluster. Here we just update the environment variables:

export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/usr/local/hadoop/lib"
export HADOOP_COMMON_LIB_NATIVE_DIR="/usr/local/hadoop/lib/native"
Ellswerth answered 7/6, 2020 at 17:52 Comment(0)

A verified remedy from earlier postings:

1) Checked that the libhadoop.so.1.0.0 shipped with the Hadoop distribution was compiled for my machine architecture, which is x86_64:

[nova]:file /opt/hadoop-2.6.0/lib/native/libhadoop.so.1.0.0
/opt/hadoop-2.6.0/lib/native/libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, BuildID[sha1]=3a80422c78d708c9a1666c1a8edd23676ed77dbb, not stripped

2) Added -Djava.library.path=<path> to HADOOP_OPTS in hadoop-env.sh:

export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true -Djava.library.path=/opt/hadoop-2.6.0/lib/native"

This indeed made the annoying warning disappear.

Conney answered 24/7, 2015 at 16:19 Comment(0)

Firstly: you can check the glibc version. CentOS traditionally ships conservative ("safe") software, which also means the versions of packages such as glibc and protobuf are old.

ldd --version
ldd /opt/hadoop/lib/native/libhadoop.so.1.0.0

You can compare the version of your current glibc with the glibc the library needs.

Secondly: if the version of your current glibc is old, you can update the glibc (download glibc).

If the version of your current glibc is right, you can append the word native to your HADOOP_OPTS:

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
Gentilis answered 27/12, 2015 at 4:36 Comment(0)

Add the Hadoop native library to LD_LIBRARY_PATH in the .bashrc file and reload the library into the current session using source ~/.bashrc:

export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native

(OR) If you have the hadoop library installed at /usr/lib/:

export LD_LIBRARY_PATH=/usr/lib/hadoop/lib/native
Evapotranspiration answered 10/8, 2022 at 16:19 Comment(0)

The native Hadoop library is supported on *nix platforms only. The library does not work with Cygwin or the Mac OS X platform.

Refs: https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/NativeLibraries.html

If you are using Windows or Mac OS X, you need to change your platform to *nix.

Lympho answered 27/3, 2021 at 7:30 Comment(1)
This is not true. It can work fine on Mac or Windows, if compiled manually, or downloaded from someone who hasCertainty

For installing Hadoop, it is so much easier to install the free version from Cloudera. It comes with a nice GUI that makes it simple to add nodes; there is no compiling or stuffing around with dependencies, and it comes with things like Hive, Pig, etc.

http://www.cloudera.com/content/support/en/downloads.html

The steps are:

1) Download
2) Run it
3) Go to the web GUI (1.2.3.4:7180)
4) Add extra nodes in the web GUI (do NOT install the Cloudera software on other nodes, it does it all for you)
5) Within the web GUI go to Home, click Hue and Hue Web UI. This gives you access to Hive, Pig, Sqoop etc.

Ezana answered 20/3, 2014 at 23:46 Comment(1)
Cloudera distributions are often behind the current versions available for many of the packages. If you want the "latest and greatest", Apache Hadoop is the way to goDivulgate