I am trying to configure spark-2.0.0-bin-hadoop2.7 on Ubuntu 16.04.1 LTS. I have set
export JAVA_HOME=/home/marc/jdk1.8.0_101
export SCALA_HOME=/home/marc/scala-2.11.8
export SPARK_HOME=/home/marc/spark-2.0.0-bin-hadoop2.7
export PATH=$PATH:$SCALA_HOME/bin:$JAVA_HOME/bin
at the end of .bashrc, and I have also added the same lines to the start-all.sh file in the spark/sbin folder.
When I type echo $JAVA_HOME it gives me the correct path, /home/marc/jdk1.8.0_101.
But when I call sbin/start-all.sh, it gives me the following error:

localhost: failed to launch org.apache.spark.deploy.worker.Worker: localhost: JAVA_HOME is not set
I tried to follow similar topics, but I couldn't find a solution to the problem. Any help would be much appreciated.
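One commonly suggested fix (a sketch, not verified against this exact setup) is to put the variables in conf/spark-env.sh rather than .bashrc: Spark's sbin launch scripts source that file for both the master and the workers, whereas .bashrc may not be read when the worker is launched over a non-interactive shell. The paths below are the ones from the question; adjust them to your install.

```shell
# Create conf/spark-env.sh from the shipped template (assumed to exist
# in a stock spark-2.0.0-bin-hadoop2.7 download).
cp conf/spark-env.sh.template conf/spark-env.sh

# Append the environment variables so the Worker JVM sees them even
# when .bashrc is never sourced. Paths are from this machine.
cat >> conf/spark-env.sh <<'EOF'
export JAVA_HOME=/home/marc/jdk1.8.0_101
export SCALA_HOME=/home/marc/scala-2.11.8
export SPARK_HOME=/home/marc/spark-2.0.0-bin-hadoop2.7
EOF
```

After that, stop any running daemons with sbin/stop-all.sh and rerun sbin/start-all.sh.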
root user, probably. Similar question here: #33956135 – Creatural

apt-get, and I believe that sets JAVA_HOME without you needing to mess with it. – Creatural

dpkg -i jdk.deb, or something like that – Creatural