I opened up localhost:9870 and tried to upload a .txt file to HDFS.
I see the error message below:
Failed to retrieve data from /webhdfs/v1/?op=LISTSTATUS: Server Error
I had the same issue with JDK 9. The fix for me was to add this line in hadoop-env.sh
export HADOOP_OPTS="--add-modules java.activation"
That's because the java.activation module was deprecated in Java 9 (and removed entirely in Java 11).
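A minimal sketch of applying that fix from a shell (assuming a standard $HADOOP_HOME layout; adjust the path for your install):

# Append the JVM flag to hadoop-env.sh (path assumed for a standard layout)
echo 'export HADOOP_OPTS="--add-modules java.activation"' >> $HADOOP_HOME/etc/hadoop/hadoop-env.sh
# Restart HDFS so the NameNode picks up the new option
$HADOOP_HOME/sbin/stop-dfs.sh
$HADOOP_HOME/sbin/start-dfs.sh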
I got this to work with OpenJDK 13 by downloading Hadoop 2.9.2 and copying the activation-1.1.jar file from that download into the $HADOOP_HOME/share/hadoop/yarn folder you're using for Hadoop 3. Then run stop-dfs.sh and stop-yarn.sh, and start them both again. No need to edit any config files with this method, since the jar is picked up on the classpath automatically.
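A rough sketch of those steps, assuming $HADOOP_HOME points at your Hadoop 3 install and the Apache archive URL below is still valid:

# Fetch the Hadoop 2.9.2 tarball just to extract activation-1.1.jar from it
wget https://archive.apache.org/dist/hadoop/common/hadoop-2.9.2/hadoop-2.9.2.tar.gz
tar -xzf hadoop-2.9.2.tar.gz
# Locate the jar inside the 2.9.2 tree and copy it into the Hadoop 3 yarn folder
cp $(find hadoop-2.9.2 -name 'activation-*.jar' | head -1) $HADOOP_HOME/share/hadoop/yarn/
# Restart both daemon sets so the jar lands on the classpath
$HADOOP_HOME/sbin/stop-yarn.sh && $HADOOP_HOME/sbin/stop-dfs.sh
$HADOOP_HOME/sbin/start-dfs.sh && $HADOOP_HOME/sbin/start-yarn.sh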
Just solved this problem. I have multiple Java versions installed alongside Hadoop 3.1.0.
You need to set the JAVA_HOME variable in etc/hadoop/hadoop-env.sh, and the Java version should be 1.8.
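For example (the JDK path below is an assumption for a typical Linux install; substitute wherever your 1.8 JDK actually lives):

# In etc/hadoop/hadoop-env.sh: pin Hadoop to a Java 1.8 JDK
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64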
This occurs due to a conflict between the Java you installed yourself and the OpenJDK installed as a dependency of Homebrew. So it is best to uninstall the standalone Java using the commands below:
#!/bin/bash
sudo rm -rvf /Library/Java/JavaVirtualMachines/jdk<version>.jdk
sudo rm -rvf /Library/PreferencePanes/JavaControlPanel.prefPane
sudo rm -rvf /Library/Internet\ Plug-Ins/JavaAppletPlugin.plugin
sudo rm -rvf /Library/LaunchAgents/com.oracle.java.Java-Updater.plist
sudo rm -rvf /Library/PrivilegedHelperTools/com.oracle.java.JavaUpdateHelper
sudo rm -rvf /Library/LaunchDaemons/com.oracle.java.JavaUpdateHelper.plist
sudo rm -rvf /Library/Preferences/com.oracle.java.Helper-Tool.plist
You can check with [this link][1].
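After the removal you can list the JVMs macOS still knows about:

# -V prints every installed JVM and its path
/usr/libexec/java_home -V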
Create a symlink pointing to the Homebrew OpenJDK dependency:
sudo ln -sfn $(brew --prefix)/opt/openjdk@11/libexec/openjdk.jdk /Library/Java/JavaVirtualMachines/openjdk-11.jdk
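To confirm the symlink was created:

ls -l /Library/Java/JavaVirtualMachines/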
Check the Java path using:
$ /usr/libexec/java_home
It generates a path like this
/opt/homebrew/Cellar/openjdk@11/11.0.18/libexec/openjdk.jdk/Contents/Home
Update the Hadoop environment file with the OpenJDK path by running these commands in the terminal:
cd /opt/homebrew/Cellar/hadoop/3.3.1/libexec/etc/hadoop
code hadoop-env.sh
Update the JAVA_HOME path with the following:
export JAVA_HOME=/opt/homebrew/Cellar/openjdk@11/11.0.18/libexec/openjdk.jdk/Contents/Home
Bonus: Check your java path with echo $JAVA_HOME
Try installing Java version 11 (or lower), such as 1.8. I changed to Java 1.8, and it solved my problem. Hadoop is not compatible with Java versions higher than 11.
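A quick way to check which Java is actually active before and after the switch (assuming java is on your PATH):

# Hadoop 3.3+ supports Java 8 and 11; anything newer is unsupported
java -version
echo $JAVA_HOME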
This error can also appear if you have disturbed the Java that was used during installation. In my case, I changed the JDK on my system and lost web access to the files. But no worries, it can be fixed easily: I put back the same JDK version and the web access worked again.
Check that you are using a JDK version compatible with the Hadoop version you are running. If the configuration was fine before and you change nothing else, it should work.
Suggested: JDK 1.8.
I was getting the same problem, but then I updated hadoop/etc/hadoop/hdfs-site.xml with the configuration below and WebHDFS started to work.

hdfs-site.xml:

<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:///home/hadoop/hadoopdata/hdfs/namenode</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:///home/hadoop/hadoopdata/hdfs/datanode</value>
  </property>
</configuration>
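After editing the file, restart HDFS; you can then verify WebHDFS from the command line (port 9870 as in the question; this curl call mirrors the request the web UI makes):

$HADOOP_HOME/sbin/stop-dfs.sh && $HADOOP_HOME/sbin/start-dfs.sh
# A working WebHDFS endpoint returns a JSON FileStatuses listing instead of "Server Error"
curl -i "http://localhost:9870/webhdfs/v1/?op=LISTSTATUS"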
… (which is true, by default). See the link; it mentions the relevant properties. In any case, I've never actually used port 9870 to upload files (my version of Hadoop doesn't even have an upload feature there). Ambari and Hue are the popular web interfaces to do so. If you want to use WebHDFS, it happens on port 50070: community.hortonworks.com/questions/139351/… – Gt