Maven artifactId for hadoop-core in Hadoop 2.2.0

I am migrating my application from Hadoop 1.0.3 to Hadoop 2.2.0, and my Maven build had hadoop-core marked as a dependency. Since hadoop-core is not present for Hadoop 2.2.0, I tried replacing it with hadoop-client and hadoop-common, but I am still getting this error for ant.filter. Can anybody please suggest which artifact to use?

Previous config:
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>1.0.3</version>
</dependency>

New config:

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.2.0</version>
</dependency>

Error:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project event: Compilation failure: Compilation failure:

[ERROR] /opt/teamcity/buildAgent/work/c670ebea1992ec2f/event/src/main/java/com/intel/event/EventContext.java:[27,36] package org.apache.tools.ant.filters does not exist

[ERROR] /opt/teamcity/buildAgent/work/c670ebea1992ec2f/event/src/main/java/com/intel/event/EventContext.java:[27,36] package org.apache.tools.ant.filters does not exist

[ERROR] /opt/teamcity/buildAgent/work/c670ebea1992ec2f/event/src/main/java/com/intel/event/EventContext.java:[180,59] cannot find symbol

[ERROR] symbol: class StringInputStream

[ERROR] location: class com.intel.event.EventContext
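
Side note on the error itself: StringInputStream lives in Ant's org.apache.tools.ant.filters package, which presumably only reached the compile classpath transitively through hadoop-core 1.0.3. If the import is intentional, a minimal sketch of declaring Ant explicitly would be the following (the version here is an assumption; use whatever your build needs):

<dependency>
    <groupId>org.apache.ant</groupId>
    <artifactId>ant</artifactId>
    <version>1.9.2</version> <!-- illustrative version only -->
</dependency>
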
Meshwork asked 12/3, 2014 at 23:55 Comment(0)

We mainly depend on the HDFS API for our application. When we migrated to Hadoop 2.x, we were surprised by the changes in dependencies. We started adding dependencies one at a time. Today we depend on the following core libraries:

hadoop-annotations-2.2.0
hadoop-auth-2.2.0
hadoop-common-2.2.0
hadoop-hdfs-2.2.0
hadoop-mapreduce-client-core-2.2.0

In addition to these, we depend on test libraries too. Based on your needs, you may want to include hadoop-hdfs and the hadoop-mapreduce-client artifacts in your dependencies along with hadoop-common; a POM sketch of the list above is shown below.
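
As a sketch, the artifacts listed above map to POM entries like the following (all under the org.apache.hadoop group ID on Maven Central):

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-annotations</artifactId>
    <version>2.2.0</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-auth</artifactId>
    <version>2.2.0</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.2.0</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.2.0</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-core</artifactId>
    <version>2.2.0</version>
</dependency>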

Reinold answered 13/3, 2014 at 4:22 Comment(0)

Try these artifacts; they work fine on my sample wordcount project:

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.2.0</version>
</dependency>

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>1.2.1</version>
</dependency>
Hallee answered 13/3, 2014 at 0:45 Comment(2)
Thanks, but will version 1.2.1 of hadoop-core work? I am trying to migrate to 2.2.0. Also, there was an old artifact in the new config section of my question; I have corrected it now. – Meshwork
hadoop-core and hadoop-common do not follow the same version numbers. – Hallee

Maven dependencies can be found at this link. As far as the hadoop-core dependency goes, hadoop-core was the artifact name for Hadoop 1.x, and simply changing the version to 2.x won't help. Also, using the Hadoop 1.x dependency in a project that runs against Hadoop 2.x gives an error like:

Caused by: org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot communicate with client version 4

Thus it is suggested not to use it. I have been using the following dependencies in my Hadoop projects:

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.7.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.7.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-core</artifactId>
    <version>2.7.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-jobclient</artifactId>
    <version>2.7.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-common</artifactId>
    <version>2.7.1</version>
</dependency>

You can try these.
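
To double-check which Hadoop artifacts and versions Maven actually resolves with these entries, the dependency tree from the standard maven-dependency-plugin can help, for example:

mvn dependency:tree -Dincludes=org.apache.hadoop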

Chacma answered 12/9, 2015 at 10:22 Comment(0)
