Hadoop “Unable to load native-hadoop library for your platform” error on docker-spark?
I am using docker-spark. After starting spark-shell, it outputs:

15/05/21 04:28:22 DEBUG NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
15/05/21 04:28:22 DEBUG NativeCodeLoader: java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib

The environment variables of this spark container are:

bash-4.1# export
declare -x BOOTSTRAP="/etc/bootstrap.sh"
declare -x HADOOP_COMMON_HOME="/usr/local/hadoop"
declare -x HADOOP_CONF_DIR="/usr/local/hadoop/etc/hadoop"
declare -x HADOOP_HDFS_HOME="/usr/local/hadoop"
declare -x HADOOP_MAPRED_HOME="/usr/local/hadoop"
declare -x HADOOP_PREFIX="/usr/local/hadoop"
declare -x HADOOP_YARN_HOME="/usr/local/hadoop"
declare -x HOME="/"
declare -x HOSTNAME="sandbox"
declare -x JAVA_HOME="/usr/java/default"
declare -x OLDPWD
declare -x PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/java/default/bin:/usr/local/spark/bin:/usr/local/hadoop/bin"
declare -x PWD="/"
declare -x SHLVL="3"
declare -x SPARK_HOME="/usr/local/spark"
declare -x SPARK_JAR="hdfs:///spark/spark-assembly-1.3.0-hadoop2.4.0.jar"
declare -x TERM="xterm"
declare -x YARN_CONF_DIR="/usr/local/hadoop/etc/hadoop"

After referring to Hadoop “Unable to load native-hadoop library for your platform” error on CentOS, I did the following:

(1) Checked the Hadoop native library:

bash-4.1# file /usr/local/hadoop/lib/native/libhadoop.so.1.0.0
/usr/local/hadoop/lib/native/libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped

Yes, it is a 64-bit library.
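As an extra check (a sketch only; the path assumes the same container layout as above), `ldd` shows whether the library itself has unresolved runtime dependencies, such as a glibc version mismatch, which produces the same warning even when the architecture is correct:

```shell
# Inspect the native library for missing runtime dependencies.
# LIB assumes the container layout shown above; adjust if yours differs.
LIB=/usr/local/hadoop/lib/native/libhadoop.so.1.0.0
if [ -e "$LIB" ]; then
  file "$LIB"   # architecture: should report "ELF 64-bit" on x86-64
  ldd "$LIB"    # any line ending in "not found" marks a broken dependency
else
  echo "library not found at $LIB"
fi
```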

(2) Tried adding the HADOOP_OPTS environment variable:

export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/usr/local/hadoop/lib/native"

It doesn't work; the same error is reported.

(3) Tried adding both the HADOOP_OPTS and HADOOP_COMMON_LIB_NATIVE_DIR environment variables:

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"

It still doesn't work; the same error is reported.

Could anyone give some clues about the issue?

Myosin answered 21/5, 2015 at 9:12 Comment(0)

Adding the Hadoop native library directory to LD_LIBRARY_PATH fixed this problem:

export LD_LIBRARY_PATH="$HADOOP_HOME/lib/native/:$LD_LIBRARY_PATH"
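To make this survive new shells, the same export can go in Spark's startup configuration (a sketch assuming the standard $SPARK_HOME/conf/spark-env.sh location, which spark-shell sources on startup):

```shell
# Put the Hadoop native directory first on the dynamic loader path.
# HADOOP_HOME matches the container layout in the question; adjust if needed.
export HADOOP_HOME=/usr/local/hadoop
export LD_LIBRARY_PATH="$HADOOP_HOME/lib/native:$LD_LIBRARY_PATH"
# To persist the fix, append the export line above to
# $SPARK_HOME/conf/spark-env.sh so every spark-shell session picks it up.
```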
Myosin answered 22/5, 2015 at 9:21 Comment(4)
thanks, that solved my problem. I wonder how this got to happen, since I wasn't getting the error after a clean Spark installation...Unbated
Strange, it's not mentioned in the Spark doc (spark.apache.org/docs/latest/hadoop-provided.html) but it works! Thanks.Sasin
I had to add log4j.logger.org.apache.hadoop.util.NativeCodeLoader=DEBUG to my $HADOOP_HOME/etc/hadoop/log4j.properties. After that I could see the following error: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path. This is the working solution for that error, just add the line to your user's .profile/.bashrc. Thank you!Ecumenicist
I had to use export LD_LIBRARY_PATH="$HADOOP_HOME/lib/native/Linux-amd64-64:$LD_LIBRARY_PATH" to make it work.Marius

The path in step (3) is wrong. It should be:

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_COMMON_LIB_NATIVE_DIR"

Then it works.


Debugging this issue is easy: edit the Log4j configuration $HADOOP_HOME/etc/hadoop/log4j.properties and add a new line:

log4j.logger.org.apache.hadoop.util.NativeCodeLoader=DEBUG

Then run the command:

hdfs dfsadmin -report

The log shows the incorrect path:

2022-10-10 16:32:30,599 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
2022-10-10 16:32:30,601 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
2022-10-10 16:32:30,601 DEBUG util.NativeCodeLoader: java.library.path=/opt/hadoop/lib
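Once the path is corrected, Hadoop's built-in self-check confirms the native library actually loads (guarded here so it degrades gracefully when hadoop is not on PATH):

```shell
# Verify that libhadoop (and its optional codecs) now load.
if command -v hadoop >/dev/null 2>&1; then
  hadoop checknative -a   # each library row should report "true" once loading works
else
  echo "hadoop not on PATH"
fi
```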
Flatter answered 27/10, 2022 at 8:46 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.