Hadoop 2.2.0 64-bit installs but cannot start
I am trying to install a Hadoop 2.2.0 cluster on my servers. All of the servers are 64-bit. I have downloaded Hadoop 2.2.0 and set up all the configuration files. When I run ./start-dfs.sh, I get the following error:

13/11/15 14:29:26 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hchen/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.namenode]
sed: -e expression #1, char 6: unknown option to `s' have: ssh: Could not resolve hostname have: Name or service not known
HotSpot(TM): ssh: Could not resolve hostname HotSpot(TM): Name or service not known
-c: Unknown cipher type 'cd'
Java: ssh: Could not resolve hostname Java: Name or service not known
The authenticity of host 'namenode (192.168.1.62)' can't be established.
RSA key fingerprint is 65:f9:aa:7c:8f:fc:74:e4:c7:a2:f5:7f:d2:cd:55:d4.
Are you sure you want to continue connecting (yes/no)? VM: ssh: Could not resolve        hostname VM: Name or service not known
You: ssh: Could not resolve hostname You: Name or service not known
warning:: ssh: Could not resolve hostname warning:: Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
64-Bit: ssh: Could not resolve hostname 64-Bit: Name or service not known
...

Besides the 64-bit issue, are there any other errors here? I have already set up passwordless SSH login between the namenode and the datanodes, so what do the other errors mean?

Plica answered 15/11, 2013 at 21:53 Comment(1)
For searchability: this problem also applies to Hadoop 2.4.0 and Hadoop 2.4.1. – Charmaincharmaine
Add the following entries to .bashrc, where HADOOP_HOME is your Hadoop installation folder:

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"

In addition, execute the following commands:

ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
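
To confirm that both changes took effect before re-running the start scripts, a quick check like the following sketch should work (the host name namenode is taken from the question; substitute your own node names):

source ~/.bashrc                 # reload the new environment variables
ssh localhost 'echo ok'          # should print "ok" without a password prompt
ssh namenode 'echo ok'           # likewise for each cluster node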
Mose answered 27/11, 2013 at 12:38 Comment(0)
The root cause is that the default native library shipped with Hadoop is built for 32-bit. There are two solutions:

1) Set up the environment variables in .bash_profile (see https://gist.github.com/ruo91/7154697), or

2) Rebuild the Hadoop native library for 64-bit; see http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/NativeLibraries.html
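
If you go with option 2, the 64-bit native library is typically rebuilt from the Hadoop source release with Maven, roughly as sketched below (this assumes a JDK, Maven, cmake, zlib/openssl headers and protobuf 2.5.0 are already installed; paths are examples):

tar xzf hadoop-2.2.0-src.tar.gz
cd hadoop-2.2.0-src
mvn package -Pdist,native -DskipTests -Dtar
# copy the freshly built 64-bit libraries over the bundled 32-bit ones
cp hadoop-dist/target/hadoop-2.2.0/lib/native/* $HADOOP_HOME/lib/native/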

Toxoid answered 4/12, 2013 at 9:42 Comment(1)
Welcome to SO. Please provide more details from the mentioned links; it will help troubleshoot the problem in case they become unavailable. – Pitchman
You can also export these variables in hadoop-env.sh:

vim /usr/local/hadoop/etc/hadoop/hadoop-env.sh

(/usr/local/hadoop is my Hadoop installation folder)

#Hadoop variables
export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-amd64 # your jdk install path
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
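
After editing hadoop-env.sh, one way to check whether the 64-bit native library is now being picked up is sketched below (hadoop checknative exists in Hadoop 2.3.0 and later; on 2.2.0, just re-run start-dfs.sh and watch whether the NativeCodeLoader warning disappears):

source /usr/local/hadoop/etc/hadoop/hadoop-env.sh
hadoop checknative -a    # reports whether libhadoop and the native codecs were loaded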
Soutane answered 5/6, 2014 at 7:27 Comment(1)
Change export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib" to export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib/native". – Benzocaine
I think that the only problem here is the same as in this question, so the solution is also the same:


Stop the JVM from printing the stack guard warning to stdout/stderr, because this is what breaks the HDFS start script.


Do this by replacing the following line in your etc/hadoop/hadoop-env.sh:

export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true"

with:

export HADOOP_OPTS="$HADOOP_OPTS -XX:-PrintWarnings -Djava.net.preferIPv4Stack=true"


(This solution was found on Sumit Chawla's blog.)
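
The reason this works: the JVM prints the stack guard warning to stdout, and the start scripts then treat the words of that warning as host names, which is what produces the "ssh: Could not resolve hostname HotSpot(TM)" noise in the question. To verify the flag actually suppressed the warning, something like this sketch can be used (it assumes HADOOP_HOME points at your installation):

$HADOOP_HOME/bin/hadoop fs -ls / 2>&1 | grep -i "stack guard" \
  && echo "warning still present" || echo "warning suppressed"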

Charmaincharmaine answered 20/9, 2015 at 11:38 Comment(2)
The stack guard problem mostly happens on the x64 architecture. – Benzocaine
Any idea about this error? WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable – Benzocaine
The issue is not with the native library; note that it is just a warning. Export the Hadoop variables mentioned above and it will work.

Thiol answered 24/12, 2013 at 10:37 Comment(0)
You have three issues:

  1. " Unable to load native-hadoop library" just as @Nogard said. His answer solves this problem.
  2. "The authenticity of host 'namenode (192.168.1.62)' can't be established." is because you have no ssh authentication. Do this:

    ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
    cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
    chmod 600 ~/.ssh/authorized_keys
    scp ~/.ssh/authorized_keys [email protected]:/home/your_install_user/.ssh/

  3. "sed: -e expression #1, char 6: unknown option to `s' have: ssh: Could not resolve hostname have: Name or service not known HotSpot(TM): ssh: Could not resolve hostname HotSpot(TM): Name or service not known -c: "

    Try this: edit your .bash_profile or .bashrc and put this into it:

    export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
    export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
    

    and run source .bash_profile or source .bashrc to make the changes take effect immediately.

Marabou answered 22/5, 2015 at 6:15 Comment(0)
I had a similar problem and could not solve it after following all the above suggestions.

Finally I understood that the configured hostname had no IP address assigned to it.

My hostname was vagrant and it was configured in /etc/hostname, but no IP address was assigned to vagrant in /etc/hosts; /etc/hosts only had an entry for localhost.

Once I added /etc/hosts entries for both localhost and vagrant, all the above problems were resolved.
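
For illustration, a minimal /etc/hosts with both entries might look like this (the IP address below is hypothetical; use the address actually assigned to your machine):

127.0.0.1       localhost
192.168.56.10   vagrant    # hypothetical IP, replace with your host's real address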

Worlock answered 30/5, 2016 at 6:28 Comment(0)
Make sure your HADOOP_HOME and HADOOP_PREFIX are properly set; I had this issue. Passwordless SSH also needs to be set up properly.
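
For example, something along these lines in your shell profile (the install path is an assumption; adjust it to wherever Hadoop is unpacked):

export HADOOP_HOME=/usr/local/hadoop     # assumed install location
export HADOOP_PREFIX=$HADOOP_HOME
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin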

Coactive answered 13/1, 2016 at 2:26 Comment(0)
