Running Spark on Linux : $JAVA_HOME not set error

I am trying to configure spark-2.0.0-bin-hadoop2.7 on Ubuntu 16.04.1 LTS. I have set

export JAVA_HOME=/home/marc/jdk1.8.0_101
export SCALA_HOME=/home/marc/scala-2.11.8
export SPARK_HOME=/home/marc/spark-2.0.0-bin-hadoop2.7
export PATH=$PATH:$SCALA_HOME/bin:$JAVA_HOME/bin

at the end of .bashrc, and I also included these lines in the start-all.sh file in Spark's sbin folder.

When I type echo $JAVA_HOME, it prints the correct path: /home/marc/jdk1.8.0_101.

But when I call sbin/start-all.sh, it gives me the following error:

localhost: failed to launch org.apache.spark.deploy.worker.Worker: localhost: JAVA_HOME is not set

I tried to follow similar topics, but I couldn't find a solution to the problem. Any help would be much appreciated.

Appease answered 3/8, 2016 at 15:23 Comment(11)
Are you running just a single node worker? – Creatural
And what user account did you add those variables for? – Creatural
There is only one user account on the system, which belongs to me, and yes, I am running a single node worker. – Appease
You are probably forgetting the root user. Similar question here: #33956135 – Creatural
@cricket_007 I added the code as instructed in the link you showed me, but I am still getting the same error. – Appease
Just curious: why did you install Java to your home folder? – Creatural
Well, I was just following a tutorial on setting up Apache Zeppelin with Spark, and Scala and Java were required for the Spark setup. – Appease
Right, I understand that, but you are on Ubuntu, so you should install Java via apt-get, and I believe that sets JAVA_HOME without you needing to mess with it. – Creatural
Hmm, yes, but my internet speed was quite low, I am running Linux in a VirtualBox, and I had the JDK previously downloaded on my Windows instance, hence I preferred to get it from there. – Appease
The Windows Java binaries can't be used on a Linux installation... unless you mean you had the Linux binary package downloaded? If that is the case, then you can still install it with dpkg -i jdk.deb, or something like that. – Creatural
Yes, I had the Linux binary package downloaded; I will try to install it properly to see if it fixes the issue. – Appease

You need to modify the file named 'spark-config.sh' in the 'sbin' directory. Add your JAVA_HOME to this file and everything will work, for example:
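
A minimal sketch; the JDK path is the one from the question, so adjust it to your own installation:

# add near the top of sbin/spark-config.sh
export JAVA_HOME=/home/marc/jdk1.8.0_101
export PATH=$PATH:$JAVA_HOME/bin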

Manley answered 16/12, 2018 at 5:32 Comment(0)

Try installing Java on your machine:

First, check if it is there:

java -version

If not installed:

sudo apt-get update
sudo apt-get install openjdk-8-jdk

This should fix the problem.
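
If the worker still reports that JAVA_HOME is not set, you can point Spark at the packaged JDK explicitly. A sketch, assuming the default install location of the openjdk-8-jdk package on Ubuntu:

# resolve the real path of the java binary,
# e.g. /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java
readlink -f "$(which java)"

# export the matching JDK root (assumed default path; adjust to the output above)
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64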

Scandium answered 11/11, 2021 at 23:15 Comment(2)
This does not fix the issue. – Forgetmenot
It did fix it for me, as well as for loads of other people, it appears. – Scandium

Please try configuring JAVA_HOME in the conf/spark-env.sh file, for example:
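
A sketch, assuming the Spark install path from the question; spark-env.sh does not exist by default and is created from the template shipped with Spark:

cd /home/marc/spark-2.0.0-bin-hadoop2.7
# create spark-env.sh from the shipped template
cp conf/spark-env.sh.template conf/spark-env.sh
# append the JDK path (taken from the question; adjust to your installation)
echo 'export JAVA_HOME=/home/marc/jdk1.8.0_101' >> conf/spark-env.sh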

Rollicking answered 26/11, 2020 at 1:19 Comment(0)
