Problem: When I submit a job to my Hadoop 2.2.0 cluster it doesn't show up in the job tracker, but the job completes successfully: I can see the output, and it prints output correctly while it is running.
I have tried multiple options, but the job tracker never sees the job. If I run a streaming job using the 2.2.0 Hadoop tools it shows up in the task tracker, but when I submit a job via the hadoop-client API it does not show up in the job tracker. I am looking at the web UI on port 8088 to verify the job.
Environment: OS X Mavericks, Java 1.6, Hadoop 2.2.0 single-node cluster, Tomcat 7.0.47
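As a cross-check on the 8088 UI, the sketch below lists whatever applications the ResourceManager knows about, using the YarnClient API that ships with Hadoop 2.2.0. This is only a diagnostic sketch; the ResourceManager address (localhost:8032, the stock default RPC port) is an assumption based on the single-node setup above.
import java.util.List;

import org.apache.hadoop.yarn.api.records.ApplicationReport;
import org.apache.hadoop.yarn.client.api.YarnClient;
import org.apache.hadoop.yarn.conf.YarnConfiguration;

public class ListYarnApplications {
    public static void main(String[] args) throws Exception {
        YarnConfiguration conf = new YarnConfiguration();
        // Assumed default ResourceManager RPC address for a single-node install.
        conf.set("yarn.resourcemanager.address", "localhost:8032");

        YarnClient yarnClient = YarnClient.createYarnClient();
        yarnClient.init(conf);
        yarnClient.start();
        try {
            // Any job that actually reached YARN is listed here and on the 8088 UI.
            List<ApplicationReport> apps = yarnClient.getApplications();
            for (ApplicationReport app : apps) {
                System.out.println(app.getApplicationId() + "\t"
                        + app.getName() + "\t"
                        + app.getYarnApplicationState());
            }
        } finally {
            yarnClient.stop();
        }
    }
}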
Code
try {
    // Point the client at the local single-node cluster
    configuration.set("fs.defaultFS", "hdfs://127.0.0.1:9000");
    configuration.set("mapred.jobtracker.address", "localhost:9001");

    // Build and submit the job, then block until it finishes
    Job job = createJob(configuration);
    job.waitForCompletion(true);
} catch (Exception e) {
    logger.log(Level.SEVERE, "Unable to execute job", e);
}
return null;
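For comparison, here is a minimal sketch of the client-side properties a Hadoop 2.x submission generally needs in order to go to YARN rather than the in-process local runner (the default when mapreduce.framework.name is not set on the client). The method name runOnYarn is hypothetical, the ResourceManager ports (8032/8030) are the stock defaults and only assumptions for this setup, and createJob(...) and logger are the helpers from the snippet above.
import java.util.logging.Level;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

// ... inside the same class as the snippet above ...
private Void runOnYarn() {
    try {
        Configuration configuration = new Configuration();
        configuration.set("fs.defaultFS", "hdfs://127.0.0.1:9000");
        // Without this the client falls back to mapreduce.framework.name=local and
        // the job runs inside the submitting JVM, so nothing reaches the 8088 UI.
        configuration.set("mapreduce.framework.name", "yarn");
        // Assumed stock default ports for a single-node install; adjust to your yarn-site.xml.
        configuration.set("yarn.resourcemanager.address", "localhost:8032");
        configuration.set("yarn.resourcemanager.scheduler.address", "localhost:8030");

        Job job = createJob(configuration);   // createJob(...) is the helper from the snippet above
        job.waitForCompletion(true);
    } catch (Exception e) {
        logger.log(Level.SEVERE, "Unable to execute job", e);
    }
    return null;
}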
etc/hadoop/mapred-site.xml
<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
etc/hadoop/core-site.xml
<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/tmp/hadoop-${user.name}</value>
    <description>A base for other temporary directories.</description>
  </property>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
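Note that these files configure the cluster itself; a client running inside Tomcat does not read etc/hadoop/*.xml unless that directory is on its classpath or the files are added explicitly. Below is a hedged sketch of loading them into the client Configuration with addResource; the install path is a placeholder, not taken from this setup.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;

// Hypothetical install location; replace with the real path to the Hadoop conf dir.
String confDir = "/usr/local/hadoop-2.2.0/etc/hadoop/";

Configuration configuration = new Configuration();
configuration.addResource(new Path(confDir + "core-site.xml"));
configuration.addResource(new Path(confDir + "mapred-site.xml"));
configuration.addResource(new Path(confDir + "yarn-site.xml"));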
mapreduce.job.tracker is a real Hadoop property. – Wyoming