hadoop java.io.IOException: while running namenode -format
I ran namenode -format. This is my output. I tried changing the file permissions with chmod 777 hadoop.

I believe this line is the error: ERROR namenode.NameNode: java.io.IOException: Cannot create directory /your/path/to/hadoop/tmp/dir/hadoop-hadoop/dfs/name/current

hadoop@alexander-desktop:/usr/local/hadoop/bin$ ./hadoop namenode -format
12/07/03 17:03:56 INFO namenode.NameNode: STARTUP_MSG: 
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = alexander-desktop/127.0.1.1
STARTUP_MSG:   args = [-format]
STARTUP_MSG:   version = 0.20.2
STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20 -r 911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34 UTC 2010
************************************************************/
12/07/03 17:03:56 INFO namenode.FSNamesystem: fsOwner=hadoop,hadoop
12/07/03 17:03:56 INFO namenode.FSNamesystem: supergroup=supergroup
12/07/03 17:03:56 INFO namenode.FSNamesystem: isPermissionEnabled=true
12/07/03 17:03:56 ERROR namenode.NameNode: java.io.IOException: Cannot create directory /your/path/to/hadoop/tmp/dir/hadoop-hadoop/dfs/name/current
    at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:295)
    at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:1086)
    at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:1110)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:856)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:948)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:965)

12/07/03 17:03:56 INFO namenode.NameNode: SHUTDOWN_MSG: 
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at alexander-desktop/127.0.1.1


hadoop@alexander-desktop:/usr/local$ sudo hadoop/bin/hadoop namenode -format
sudo: /etc/sudoers is mode 0777, should be 0440
sudo: no valid sudoers sources found, quitting
Distrust answered 3/7, 2012 at 16:24 Comment(2)
You may need to stop all your daemons first before formatting the namenode.Mockery
There is an error in your hdfs-site.xml: there is no such path as /your/path/to/hadoop/tmp/dir/... It is likely that you followed a template for your hdfs-site.xml but forgot to edit the parameters. You can refer to this post for a similar problem I faced and its solution.Hollyanne
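As the comment suggests, the placeholder path has to be replaced with a real, writable directory and the same path set in hdfs-site.xml. A minimal sketch, assuming the 0.20-era property name dfs.name.dir and example paths under /tmp (substitute your actual Hadoop tmp directory, and edit your real conf/hdfs-site.xml rather than the scratch file used here):

```shell
# Example path for illustration only; use your real Hadoop tmp directory.
NAME_DIR=${NAME_DIR:-/tmp/hadoop-demo/dfs/name}
mkdir -p "$NAME_DIR"

# The matching hdfs-site.xml entry, written to a scratch file here just
# to show the shape of the property:
cat > /tmp/hdfs-site-example.xml <<EOF
<?xml version="1.0"?>
<configuration>
  <property>
    <name>dfs.name.dir</name>
    <value>${NAME_DIR}</value>
  </property>
</configuration>
EOF

test -d "$NAME_DIR" && echo "name dir exists and is usable"
```

With the directory created and the property pointing at it, the "Cannot create directory" IOException from the question no longer applies.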
Try with sudo. I realize you changed permissions, but I would still try sudo and check whether that resolves the problem.

Gidgetgie answered 3/7, 2012 at 16:47 Comment(6)
I fixed that error by changing the permission level on sudoers, but the original error has not been resolved. I now have a problem with $JAVA_HOME; can I use export $JAVA_HOME and the path to fix the error?Distrust
@alex: Did you set $JAVA_HOME in conf/hadoop-env.sh? Check whether your environment has JAVA_HOME set. Note that the sudo user's environment and your environment are different. Use the commands printenv and sudo printenv.Leanoraleant
Thanks, that fixed the problem. I also experimented with setting various environment variables, so it was a worthwhile learning experience. What confused me is that the tutorial said you could set either hadoop-env.sh or the bash.bashrc file.Distrust
Which is the better method: environment variables, or setting a profile.d file?Distrust
@Distrust What permission did you change on the sudoers file? I am facing the same problem.Overlie
@Leanoraleant I have tried using sudo, but it still goes back to /dfs/nn, whereas I have given a different location in core-site.xml. Kindly help.Overlie
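The JAVA_HOME fix discussed in the comments above usually comes down to a one-line export in conf/hadoop-env.sh. A sketch, assuming a Debian-style JDK path (adjust to your system; the line is appended to a scratch file here rather than your real config):

```shell
# Assumed JDK location; find yours with: readlink -f "$(which java)"
JAVA_HOME_GUESS=${JAVA_HOME:-/usr/lib/jvm/default-java}

# In a real setup this line goes into conf/hadoop-env.sh so the Hadoop
# start scripts see it even when your login shell does not export it:
echo "export JAVA_HOME=${JAVA_HOME_GUESS}" > /tmp/hadoop-env-example.sh

# Your environment and sudo's environment can differ, which is why
# "printenv" and "sudo printenv" may show different values:
printenv JAVA_HOME || echo "JAVA_HOME not set in this shell"
```

Setting it in hadoop-env.sh (rather than only in bash.bashrc or profile.d) is the safer choice here, because the Hadoop daemons read hadoop-env.sh regardless of which user's shell profile is in effect.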
The following steps resolved my problem:

1- sudo su

Enter your password.

2- /usr/local/hadoop/bin/hdfs namenode -format

This was done on Hadoop 2.5, in which "hadoop namenode -format" has been deprecated, hence the use of "hdfs namenode -format".
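The steps above can be sketched as a short script. This is a sketch only: /usr/local/hadoop is an assumed install location, and the command is merely recorded (not executed) when the hdfs binary is absent, so nothing destructive runs by accident:

```shell
# Assumed install location for Hadoop 2.x; adjust to your system.
HADOOP_PREFIX=${HADOOP_PREFIX:-/usr/local/hadoop}

# In 2.x "hadoop namenode -format" is deprecated in favour of:
FORMAT_CMD="$HADOOP_PREFIX/bin/hdfs namenode -format"
echo "planned command: $FORMAT_CMD" > /tmp/format-cmd-demo.txt

if [ -x "$HADOOP_PREFIX/bin/hdfs" ]; then
  # Run as root (the "sudo su" step) only if the dfs directories are
  # root-owned; otherwise running as the hadoop user is safer.
  $FORMAT_CMD
else
  echo "hdfs binary not found; see /tmp/format-cmd-demo.txt for the command"
fi
```

Note that formatting wipes the namenode's metadata, so this should only ever be run on a fresh or disposable cluster.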

Essence answered 1/9, 2014 at 5:23 Comment(2)
Worked for me, thank you! The accepted answer didn't work in my case.Penelopa
Same for me; the accepted answer didn't work in my case, but this did.Preordain
Check your hdfs-site.xml configuration; it may have a wrong path for the properties dfs.namenode.name.dir and dfs.datanode.data.dir. In my case this was the cause of the problem (the directory was located outside the current user's home folder).
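A quick way to sanity-check those two properties is to pull the configured paths out of hdfs-site.xml and verify their parent directories are writable. A sketch, using a scratch config and /tmp paths as stand-ins for your real conf/hdfs-site.xml:

```shell
# Scratch copy of an hdfs-site.xml so the snippet is self-contained;
# point CONF at your real file instead.
CONF=/tmp/hdfs-site-check.xml
cat > "$CONF" <<'EOF'
<configuration>
  <property><name>dfs.namenode.name.dir</name><value>/tmp/dfs/name</value></property>
  <property><name>dfs.datanode.data.dir</name><value>/tmp/dfs/data</value></property>
</configuration>
EOF

# Extract each configured path and check that its parent directory
# exists and is writable by the current user -- the usual cause of
# the "Cannot create directory" IOException.
grep -o '<value>[^<]*</value>' "$CONF" | sed 's/<[^>]*>//g' | while read -r p; do
  parent=$(dirname "$p")
  if [ -d "$parent" ] && [ -w "$parent" ]; then
    echo "ok: $p"
  else
    echo "problem: $parent missing or not writable"
  fi
done
```

If the check reports a problem, either create the directory with the right ownership or change the property to a path the Hadoop user can write.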

Razzia answered 12/4, 2014 at 14:43 Comment(1)
I think everyone is going nuclear with sudo. The error message says: java.io.IOException: Cannot create directory /your/path/to/hadoop/tmp/dir/hadoop-hadoop/dfs/name/current -- wouldn't this mean that whatever placeholder value is set for the data directory doesn't exist? Check your default values in core-site.xml and hdfs-site.xml first.Hoofbeat
This is a permissions issue. You can either use sudo or log in as root, but the best solution is to take ownership of the installation directory:

sudo chown -R $USER $HADOOP_HOME

hadoop namenode -format

where HADOOP_HOME is your Hadoop installation directory.
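A sketch of that fix, demonstrated on a scratch directory so it runs anywhere. Note that chown requires an owner argument; in a real install you would target your actual $HADOOP_HOME and the user that runs the Hadoop daemons:

```shell
# Stand-in for the real Hadoop install directory.
DEMO_HOME=/tmp/hadoop-home-demo
mkdir -p "$DEMO_HOME"

# Give the current user ownership of the whole tree. Without an owner
# argument, "chown $HADOOP_HOME" alone would fail with a usage error.
chown -R "$(id -un)" "$DEMO_HOME"

ls -ld "$DEMO_HOME"
```

After the real directory is owned by the right user, "hadoop namenode -format" can run without sudo at all.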

Grainfield answered 26/11, 2014 at 11:48 Comment(1)
sudo chown $HADOOP_HOME to what permission?Overlie
sudo is broken in this situation, but pkexec (the command-line frontend to PolicyKit) still works, so you can fix it with a single command. No rebooting is necessary.

pkexec chmod 0440 /etc/sudoers

This assumes PolicyKit is installed. If this is a desktop system (rather than a server with no GUI), it is.

Seedtime answered 9/8, 2013 at 17:57 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.