Error starting namenode in Hadoop 2.4.1

When I try to start dfs using:

start-dfs.sh

I get an error saying:

14/07/03 11:03:21 WARN util.NativeCodeLoader: Unable to load
native-hadoop library for your platform... using builtin-java classes
where applicable Starting namenodes on [OpenJDK 64-Bit Server VM
warning: You have loaded library
/usr/local/hadoop/lib/native/libhadoop.so.1.0.0 which might have
disabled stack guard. The VM will try to fix the stack guard now. It's
highly recommended that you fix the library with 'execstack -c
<libfile>', or link it with '-z noexecstack'. localhost] sed: -e
expression #1, char 6: unknown option to `s' Server: ssh: Could not
resolve hostname Server: Name or service not known
-c: Unknown cipher type 'cd' stack: ssh: Could not resolve hostname stack: Name or service not known 64-Bit: ssh: Could not resolve
hostname 64-Bit: Name or service not known guard.: ssh: Could not
resolve hostname guard.: Name or service not known The: ssh: Could not
resolve hostname The: Name or service not known guard: ssh: Could not
resolve hostname guard: Name or service not known might: ssh: Could
not resolve hostname might: Name or service not known stack: ssh:
Could not resolve hostname stack: Name or service not known will: ssh:
Could not resolve hostname will: Name or service not known the: ssh:
Could not resolve hostname the: Name or service not known fix: ssh:
Could not resolve hostname fix: Name or service not known VM: ssh:
Could not resolve hostname VM: Name or service not known You: ssh:
Could not resolve hostname You: Name or service not known which: ssh:
Could not resolve hostname which: Name or service not known It's: ssh:
Could not resolve hostname It's: Name or service not known disabled:
ssh: Could not resolve hostname disabled: Name or service not known
try: ssh: Could not resolve hostname try: Name or service not known
localhost: namenode running as process 4463. Stop it first. library:
ssh: Could not resolve hostname library: Name or service not known
with: ssh: Could not resolve hostname with: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
warning:: ssh: Could not resolve hostname warning:: Name or service
not known VM: ssh: Could not resolve hostname VM: Name or service not
known now.: ssh: Could not resolve hostname now.: Name or service not
known loaded: ssh: Could not resolve hostname loaded: Name or service
not known library: ssh: Could not resolve hostname library: Name or
service not known <libfile>',: ssh: Could not resolve hostname
<libfile>',: Name or service not known to: ssh: connect to host to
port 22: Connection refused OpenJDK: ssh: Could not resolve hostname
OpenJDK: Name or service not known have: ssh: Could not resolve
hostname have: Name or service not known have: ssh: Could not resolve
hostname have: Name or service not known with: ssh: Could not resolve
hostname with: Name or service not known fix: ssh: Could not resolve
hostname fix: Name or service not known noexecstack'.: ssh: Could not
resolve hostname noexecstack'.: Name or service not known that: ssh:
Could not resolve hostname that: Name or service not known you: ssh:
Could not resolve hostname you: Name or service not known or: ssh:
Could not resolve hostname or: Name or service not known highly: ssh:
Could not resolve hostname highly: Name or service not known
recommended: ssh: Could not resolve hostname recommended: Name or
service not known 'execstack: ssh: Could not resolve hostname
'execstack: Name or service not known link: ssh: Could not resolve
hostname link: Name or service not known it: ssh: Could not resolve
hostname it: Name or service not known '-z: ssh: Could not resolve
hostname '-z: Name or service not known localhost: datanode running as
process 4561. Stop it first. Starting secondary namenodes [OpenJDK
64-Bit Server VM warning: You have loaded library
/usr/local/hadoop/lib/native/libhadoop.so.1.0.0 which might have
disabled stack guard. The VM will try to fix the stack guard now. It's
highly recommended that you fix the library with 'execstack -c
<libfile>', or link it with '-z noexecstack'.
0.0.0.0] sed: -e expression #1, char 6: unknown option to `s' OpenJDK: ssh: Could not resolve hostname OpenJDK: Name or service not known
-c: Unknown cipher type 'cd' VM: ssh: Could not resolve hostname VM: Name or service not known The authenticity of host '0.0.0.0 (0.0.0.0)'
can't be established. ECDSA key fingerprint is
dd:64:53:7e:c0:62:40:c0:63:2b:5c:6d:1e:b6:cd:23. Are you sure you want
to continue connecting (yes/no)? might: ssh: Could not resolve
hostname might: Name or service not known Server: ssh: Could not
resolve hostname Server: Name or service not known guard.: ssh: Could
not resolve hostname guard.: Name or service not known have: ssh:
Could not resolve hostname have: Name or service not known You: ssh:
Could not resolve hostname You: Name or service not known The: ssh:
Could not resolve hostname The: Name or service not known which: ssh:
Could not resolve hostname which: Name or service not known have: ssh:
Could not resolve hostname have: Name or service not known disabled:
ssh: Could not resolve hostname disabled: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
It's: ssh: Could not resolve hostname It's: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
the: ssh: Could not resolve hostname the: Name or service not known
warning:: ssh: Could not resolve hostname warning:: Name or service
not known will: ssh: Could not resolve hostname will: Name or service
not known the: ssh: Could not resolve hostname the: Name or service
not known library: ssh: Could not resolve hostname library: Name or
service not known that: ssh: Could not resolve hostname that: Name or
service not known highly: ssh: Could not resolve hostname highly: Name
or service not known 'execstack: ssh: Could not resolve hostname
'execstack: Name or service not known try: ssh: Could not resolve
hostname try: Name or service not known guard: ssh: Could not resolve
hostname guard: Name or service not known 64-Bit: ssh: Could not
resolve hostname 64-Bit: Name or service not known loaded: ssh: Could
not resolve hostname loaded: Name or service not known library: ssh:
Could not resolve hostname library: Name or service not known fix:
ssh: Could not resolve hostname fix: Name or service not known to:
ssh: connect to host to port 22: Connection refused link: ssh: Could
not resolve hostname link: Name or service not known stack: ssh: Could
not resolve hostname stack: Name or service not known '-z: ssh: Could
not resolve hostname '-z: Name or service not known you: ssh: Could
not resolve hostname you: Name or service not known with: ssh: Could
not resolve hostname with: Name or service not known with: ssh: Could
not resolve hostname with: Name or service not known recommended: ssh:
Could not resolve hostname recommended: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not
known now.: ssh: Could not resolve hostname now.: Name or service not
known <libfile>',: ssh: Could not resolve hostname <libfile>',: Name
or service not known or: ssh: Could not resolve hostname or: Name or
service not known noexecstack'.: ssh: Could not resolve hostname
noexecstack'.: Name or service not known it: ssh: Could not resolve
hostname it: Name or service not known ^C0.0.0.0: Host key
verification failed. ^C

My core-site.xml file contains this:

<configuration>
    <property>
       <name>fs.default.name</name>
       <value>hdfs://localhost:9000</value>
    </property>
</configuration>

My .profile (which I use in place of .bashrc) contains these lines:

export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"

And I can ssh to localhost without any problem:

ssh localhost

Welcome to Linux Mint 16 Petra (GNU/Linux 3.11.0-12-generic x86_64)

Welcome to Linux Mint
 * Documentation:  http://www.linuxmint.com

Last login: Wed Jul  2 16:51:15 2014 from localhost
Honestly answered 2/7, 2014 at 7:7 Comment(7)
Please include in your post the content of your core-site.xml file. – Polik
@Polik: I have added the contents of the core-site.xml file. – Honestly
Check the hadoop-env.sh file, and see if HADOOP_CONF_DIR is being set, and where it points to. – Polik
Show the line export HADOOP_HOME= which you inserted in $HOME/.bashrc. – Lunge
@Polik: Sorry for the delay... I have edited my question to include the current error messages. – Honestly
For searchability: this problem also applies to Hadoop 2.4.0. – Outfield
Possible duplicate of hadoop 2.2.0 64-bit installing but cannot start. – Outfield

Stop the JVM from printing the stack guard warning to stdout/stderr, because this is what breaks the HDFS start script.
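
To see why stdout matters: start-dfs.sh figures out which hosts to start daemons on by capturing the output of an hdfs command, then ssh-es to every whitespace-separated word of that output. A rough sketch of the failure mode (simplified for illustration; the real scripts are more involved):

# Simplified illustration, not part of the fix.
# start-dfs.sh captures stdout to learn the namenode host(s):
NAMENODES=$(hdfs getconf -namenodes)

# If the JVM prints its stack-guard warning on stdout, the warning text is
# mixed into $NAMENODES, and the helper scripts then try to ssh to every
# word of it ("OpenJDK", "64-Bit", "Server", ...), producing the wall of
# "Could not resolve hostname" errors shown in the question.
for host in $NAMENODES; do
  ssh "$host" "echo connected"
done

Suppressing the warning keeps that stdout clean, so the scripts see only the real hostname.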


Do it by replacing this line in your etc/hadoop/hadoop-env.sh:

export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true"

with:

export HADOOP_OPTS="$HADOOP_OPTS -XX:-PrintWarnings -Djava.net.preferIPv4Stack=true"
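
Note that the log also shows "namenode running as process 4463. Stop it first.", so after editing hadoop-env.sh, stop the daemons that are already running before starting DFS again:

stop-dfs.sh
start-dfs.sh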


(This solution was found on Sumit Chawla's blog.)

Outfield answered 20/9, 2015 at 11:32 Comment(0)

Edit your .bashrc file and add the following lines:

export HADOOP_HOME=path_to_your_hadoop_folder
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"

And although your ssh should already be working based on what you have said, set it up again just in case:

ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
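
If ssh still prompts for a password afterwards, check the permissions on ~/.ssh, since OpenSSH (with its default StrictModes setting) ignores an authorized_keys file that is group- or world-writable; this is general OpenSSH behavior, not anything Hadoop-specific:

chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys
ssh localhost true    # should return without asking for a password
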
Polik answered 3/7, 2014 at 8:8 Comment(1)
I'm using Linux Mint, so I have .profile instead of .bashrc. As I mentioned above, all three of those lines are already present in my .profile file (check the second-to-last block quote). – Honestly

It seems like you haven't added the $HADOOP_INSTALL line in your .profile file that points to your main Hadoop folder. As Polik suggests, using HADOOP_HOME will work in place of the $HADOOP_INSTALL variable. I would go with that suggestion, but you can also fix it by adding:

export HADOOP_INSTALL=/path/to/hadoop/
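
After adding it, reload the profile and confirm the variable is visible; a quick sanity check (the path printed should be whatever you set above):

source ~/.profile          # or open a new shell
echo "$HADOOP_INSTALL"     # should print your Hadoop folder
which hadoop               # should resolve inside $HADOOP_INSTALL/bin
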
Fotina answered 6/8, 2014 at 15:34 Comment(0)

Please check your HADOOP_CONF_DIR (most likely set in your .bashrc). It should point to $HADOOP_HOME/etc/hadoop; see the sketch below.
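
A minimal sketch of what that looks like, assuming HADOOP_HOME is already exported in the same file:

# In ~/.bashrc:
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop

# Quick check that it points at the directory holding your config files:
ls "$HADOOP_CONF_DIR/core-site.xml"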

Kaylyn answered 16/2, 2019 at 3:3 Comment(0)
