Hadoop Error starting ResourceManager and NodeManager

I'm trying to set up Hadoop 3.0.0-alpha3 as a single-node cluster (pseudo-distributed mode), following the Apache guide. I've tried running the example MapReduce job, but the connection is refused every time. After running sbin/start-all.sh I see the following exceptions in the ResourceManager log (and similar ones in the NodeManager log):

xxxx-xx-xx xx:xx:xx,xxx INFO org.apache.commons.beanutils.FluentPropertyBeanIntrospector: Error when creating PropertyDescriptor for public final void org.apache.commons.configuration2.AbstractConfiguration.setProperty(java.lang.String,java.lang.Object)! Ignoring this property.
xxxx-xx-xx xx:xx:xx,xxx DEBUG org.apache.commons.beanutils.FluentPropertyBeanIntrospector: Exception is:
java.beans.IntrospectionException: bad write method arg count: public final void org.apache.commons.configuration2.AbstractConfiguration.setProperty(java.lang.String,java.lang.Object)
    at java.desktop/java.beans.PropertyDescriptor.findPropertyType(PropertyDescriptor.java:696)
    at java.desktop/java.beans.PropertyDescriptor.setWriteMethod(PropertyDescriptor.java:356)
    at java.desktop/java.beans.PropertyDescriptor.<init>(PropertyDescriptor.java:142)
    at org.apache.commons.beanutils.FluentPropertyBeanIntrospector.createFluentPropertyDescritor(FluentPropertyBeanIntrospector.java:178)
    at org.apache.commons.beanutils.FluentPropertyBeanIntrospector.introspect(FluentPropertyBeanIntrospector.java:141)
    at org.apache.commons.beanutils.PropertyUtilsBean.fetchIntrospectionData(PropertyUtilsBean.java:2245)
    at org.apache.commons.beanutils.PropertyUtilsBean.getIntrospectionData(PropertyUtilsBean.java:2226)
    at org.apache.commons.beanutils.PropertyUtilsBean.getPropertyDescriptor(PropertyUtilsBean.java:954)
    at org.apache.commons.beanutils.PropertyUtilsBean.isWriteable(PropertyUtilsBean.java:1478)
    at org.apache.commons.configuration2.beanutils.BeanHelper.isPropertyWriteable(BeanHelper.java:521)
    at org.apache.commons.configuration2.beanutils.BeanHelper.initProperty(BeanHelper.java:357)
    at org.apache.commons.configuration2.beanutils.BeanHelper.initBeanProperties(BeanHelper.java:273)
    at org.apache.commons.configuration2.beanutils.BeanHelper.initBean(BeanHelper.java:192)
    at org.apache.commons.configuration2.beanutils.BeanHelper$BeanCreationContextImpl.initBean(BeanHelper.java:669)
    at org.apache.commons.configuration2.beanutils.DefaultBeanFactory.initBeanInstance(DefaultBeanFactory.java:162)
    at org.apache.commons.configuration2.beanutils.DefaultBeanFactory.createBean(DefaultBeanFactory.java:116)
    at org.apache.commons.configuration2.beanutils.BeanHelper.createBean(BeanHelper.java:459)
    at org.apache.commons.configuration2.beanutils.BeanHelper.createBean(BeanHelper.java:479)
    at org.apache.commons.configuration2.beanutils.BeanHelper.createBean(BeanHelper.java:492)
    at org.apache.commons.configuration2.builder.BasicConfigurationBuilder.createResultInstance(BasicConfigurationBuilder.java:447)
    at org.apache.commons.configuration2.builder.BasicConfigurationBuilder.createResult(BasicConfigurationBuilder.java:417)
    at org.apache.commons.configuration2.builder.BasicConfigurationBuilder.getConfiguration(BasicConfigurationBuilder.java:285)
    at org.apache.hadoop.metrics2.impl.MetricsConfig.loadFirst(MetricsConfig.java:119)
    at org.apache.hadoop.metrics2.impl.MetricsConfig.create(MetricsConfig.java:98)
    at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.configure(MetricsSystemImpl.java:478)
    at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.start(MetricsSystemImpl.java:188)
    at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.init(MetricsSystemImpl.java:163)
    at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.init(DefaultMetricsSystem.java:62)
    at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.initialize(DefaultMetricsSystem.java:58)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$RMActiveServices.serviceInit(ResourceManager.java:678)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.createAndInitActiveServices(ResourceManager.java:1129)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceInit(ResourceManager.java:315)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.main(ResourceManager.java:1407)

And then later in the file:

xxxx-xx-xx xx:xx:xx,xxx FATAL org.apache.hadoop.yarn.server.resourcemanager.ResourceManager: Error starting ResourceManager
java.lang.ExceptionInInitializerError
    at com.google.inject.internal.cglib.reflect.$FastClassEmitter.<init>(FastClassEmitter.java:67)
    at com.google.inject.internal.cglib.reflect.$FastClass$Generator.generateClass(FastClass.java:72)
    at com.google.inject.internal.cglib.core.$DefaultGeneratorStrategy.generate(DefaultGeneratorStrategy.java:25)
    at com.google.inject.internal.cglib.core.$AbstractClassGenerator.create(AbstractClassGenerator.java:216)
    at com.google.inject.internal.cglib.reflect.$FastClass$Generator.create(FastClass.java:64)
    at com.google.inject.internal.BytecodeGen.newFastClass(BytecodeGen.java:204)
    at com.google.inject.internal.ProviderMethod$FastClassProviderMethod.<init>(ProviderMethod.java:256)
    at com.google.inject.internal.ProviderMethod.create(ProviderMethod.java:71)
    at com.google.inject.internal.ProviderMethodsModule.createProviderMethod(ProviderMethodsModule.java:275)
    at com.google.inject.internal.ProviderMethodsModule.getProviderMethods(ProviderMethodsModule.java:144)
    at com.google.inject.internal.ProviderMethodsModule.configure(ProviderMethodsModule.java:123)
    at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:340)
    at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:349)
    at com.google.inject.AbstractModule.install(AbstractModule.java:122)
    at com.google.inject.servlet.ServletModule.configure(ServletModule.java:52)
    at com.google.inject.AbstractModule.configure(AbstractModule.java:62)
    at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:340)
    at com.google.inject.spi.Elements.getElements(Elements.java:110)
    at com.google.inject.internal.InjectorShell$Builder.build(InjectorShell.java:138)
    at com.google.inject.internal.InternalInjectorCreator.build(InternalInjectorCreator.java:104)
    at com.google.inject.Guice.createInjector(Guice.java:96)
    at com.google.inject.Guice.createInjector(Guice.java:73)
    at com.google.inject.Guice.createInjector(Guice.java:62)
    at org.apache.hadoop.yarn.webapp.WebApps$Builder.build(WebApps.java:332)
    at org.apache.hadoop.yarn.webapp.WebApps$Builder.start(WebApps.java:377)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.startWepApp(ResourceManager.java:1116)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1218)
    at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
    at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.main(ResourceManager.java:1408)
Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make protected final java.lang.Class java.lang.ClassLoader.defineClass(java.lang.String,byte[],int,int,java.security.ProtectionDomain) throws java.lang.ClassFormatError accessible: module java.base does not "opens java.lang" to unnamed module @173f73e7
    at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:337)
    at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:281)
    at java.base/java.lang.reflect.Method.checkCanSetAccessible(Method.java:197)
    at java.base/java.lang.reflect.Method.setAccessible(Method.java:191)
    at com.google.inject.internal.cglib.core.$ReflectUtils$2.run(ReflectUtils.java:56)
    at java.base/java.security.AccessController.doPrivileged(Native Method)
    at com.google.inject.internal.cglib.core.$ReflectUtils.<clinit>(ReflectUtils.java:46)
    ... 29 more

For reference, my core-site.xml:

<configuration>
    <property>
        <name>fs.default.name</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>

hdfs-site.xml:

<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
</configuration>

mapred-site.xml:

<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
</configuration>

and yarn-site.xml:

<configuration>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
    <property>
        <name>yarn.nodemanager.env-whitelist</name>
        <value>JAVA_HOME,HADOOP_COMMON_HOME,HADOOP_HDFS_HOME,HADOOP_CONF_DIR,CLASSPATH_PREPEND_DISTCACHE,HADOOP_YARN_HOME,HADOOP_MAPRED_HOME</value>
    </property>
</configuration>

I have no idea what is causing these exceptions; any help would be appreciated.

Edit: Added hadoop-env.sh:

export JAVA_HOME=/usr/local/jdk-9
export HADOOP_HOME=/usr/local/hadoop
export HADOOP_OS_TYPE=${HADOOP_OS_TYPE:-$(uname -s)}
case ${HADOOP_OS_TYPE} in
  Darwin*)
    export HADOOP_OPTS="${HADOOP_OPTS} -Djava.security.krb5.realm= "
    export HADOOP_OPTS="${HADOOP_OPTS} -Djava.security.krb5.kdc= "
    export HADOOP_OPTS="${HADOOP_OPTS} -Djava.security.krb5.conf= "
  ;;
esac
export HADOOP_ROOT_LOGGER=DEBUG,console
export HADOOP_DAEMON_ROOT_LOGGER=DEBUG,RFA
Eris answered 8/6, 2017 at 5:55 Comment(7)
What is the Java version and operating system that you are using? – Convergent
Lubuntu 17.04, Java build 9-ea+171 – Eris
Please provide the content of hadoop-env.sh as well. – Convergent
@SouravGulati I have added hadoop-env.sh as well; can you explain what I'm doing wrong? – Eris
Could you please try using Java 8 or Java 7? Don't use Java 9. – Convergent
+1. Java 9 support is not there yet: https://issues.apache.org/jira/browse/HADOOP-11123 (Hadoop on Java 9) – Stonge
Ah, thank you for that. – Eris

As mentioned by @tk421 in the comments, Java 9 is not yet compatible with Hadoop 3 (and possibly not with any Hadoop version).

https://issues.apache.org/jira/browse/HADOOP-11123

I've changed to Java 8.181 and both are starting up now:

hadoop@hadoop:/usr/local/hadoop$ sbin/start-all.sh
WARNING: Attempting to start all Apache Hadoop daemons as hadoop in 10 seconds.
WARNING: This is not a recommended production deployment configuration.
WARNING: Use CTRL-C to abort.
Starting namenodes on [localhost]
Starting datanodes
Starting secondary namenodes [hadoop]
Starting resourcemanager
Starting nodemanagers
hadoop@hadoop:/usr/local/hadoop$ jps
8756 SecondaryNameNode
8389 NameNode
9173 NodeManager
9030 ResourceManager
8535 DataNode
9515 Jps
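
For anyone making the same switch: the change itself is just installing a JDK 8 and repointing JAVA_HOME in hadoop-env.sh before restarting the daemons (the path below is only an example; use wherever your JDK 8 actually lives):

    # $HADOOP_HOME/etc/hadoop/hadoop-env.sh
    export JAVA_HOME=/usr/local/jdk1.8.0_181   # example path; previously /usr/local/jdk-9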
Eris answered 10/6, 2017 at 1:19 Comment(0)

I want to share my observations. I was using OpenJDK 17 with Hadoop.

I thought the newer the Java version, the better. I was very wrong: I had to switch to OpenJDK 8. So, how do we fix the problem?

1. Uninstall the previous version of Java. There are several removal options:

   • Remove only OpenJDK: $ sudo apt-get remove openjdk*

   • Remove OpenJDK along with its dependencies: $ sudo apt-get remove --auto-remove openjdk*

   • Remove OpenJDK and its configuration files: $ sudo apt-get purge openjdk*

   • Remove OpenJDK along with its dependencies and configuration files: $ sudo apt-get purge --auto-remove openjdk*

   As for me, I used the last option.

2. Install OpenJDK 8: $ sudo apt install openjdk-8-jdk -y

   After the installation is complete, you can check the Java version: $ java -version; javac -version

3. Update the JAVA_HOME variable. To do this, open hadoop-env.sh:

       sudo nano $HADOOP_HOME/etc/hadoop/hadoop-env.sh

   where $HADOOP_HOME is the location of your Hadoop installation (for example, /home/hdoop/hadoop-3.2.4).

   The JAVA_HOME entry should look like this: export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64. Of course, it all depends on your Java location.

4. Go to the directory hadoop-3.2.4/sbin and stop the daemons on all nodes of the cluster: ./stop-all.sh

5. Start the NameNode and DataNode daemons: ./start-dfs.sh

6. Start the YARN ResourceManager and NodeManagers: ./start-yarn.sh

7. Check that all daemons are active and running as Java processes: jps. The resulting list should look (approximately) as follows:

       33706 SecondaryNameNode
       33330 NameNode
       34049 NodeManager
       33900 ResourceManager
       33482 DataNode
       34410 Jps

8. Hadoop setup is DONE! (The whole sequence is condensed into a single shell sketch below.)
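
For convenience, a minimal shell sketch of the same sequence (the package name, JAVA_HOME path, and the /home/hdoop/hadoop-3.2.4 location are the examples used above; adjust them to your system):

    # replace the newer OpenJDK with OpenJDK 8 (Debian/Ubuntu)
    sudo apt-get purge --auto-remove 'openjdk*'   # quoted so the shell does not expand the pattern
    sudo apt install openjdk-8-jdk -y
    java -version; javac -version                 # both should now report 1.8.x

    # point Hadoop at the JDK 8 install by editing the existing JAVA_HOME line
    sudo nano $HADOOP_HOME/etc/hadoop/hadoop-env.sh
    #   export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64

    # restart the daemons and verify
    cd $HADOOP_HOME/sbin          # e.g. /home/hdoop/hadoop-3.2.4/sbin
    ./stop-all.sh
    ./start-dfs.sh
    ./start-yarn.sh
    jps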

P.S. I hope my answer is useful; I tried to cover all the details. I wish you all success.

Congruous answered 9/8, 2022 at 12:6 Comment(0)

My problem was that I was using Java 11 with Hadoop.

So what I did was:

1. rm /Library/Java/*

2. Download Java 8 from https://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html

3. Install the Java 8 JDK.

4. Fix JAVA_HOME in hadoop-env.sh (see the sketch below).

5. stop-all.sh

6. start-dfs.sh

7. start-yarn.sh
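
Since /Library/Java suggests macOS, a minimal sketch of step 4 (the /usr/libexec/java_home helper is a standard macOS tool; the exact path it prints depends on your install):

    # list installed JDKs and print the Java 8 home
    /usr/libexec/java_home -V
    /usr/libexec/java_home -v 1.8

    # then, in $HADOOP_HOME/etc/hadoop/hadoop-env.sh:
    # export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)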

Dynatron answered 6/11, 2018 at 7:52 Comment(0)

Yup, as others have pointed out, the Java version might be the issue.
I was following this page, where they clearly link here for the compatible Java versions.
I was trying with Java 17 and start-yarn.sh was failing, but with Java 11 start-yarn.sh worked.
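
A quick sanity check before rerunning the scripts (assuming JAVA_HOME is set in hadoop-env.sh as in the other answers):

    java -version                                           # should report 11, not 17
    grep JAVA_HOME $HADOOP_HOME/etc/hadoop/hadoop-env.sh    # confirm it points at the JDK 11 install
    $HADOOP_HOME/sbin/start-yarn.sh
    jps                                                     # ResourceManager and NodeManager should be listed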

Gaul answered 20/8, 2023 at 7:51 Comment(0)
