Cannot start spark-shell
I am using Spark 1.4.1. I can use spark-submit without any problem, but when I run ~/spark/bin/spark-shell I get the error below. I have configured SPARK_HOME and JAVA_HOME. However, it worked fine with Spark 1.2.

15/10/08 02:40:30 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Failed to initialize compiler: object scala.runtime in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programatically, settings.usejavacp.value = true.

Failed to initialize compiler: object scala.runtime in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programatically, settings.usejavacp.value = true.
Exception in thread "main" java.lang.AssertionError: assertion failed: null
        at scala.Predef$.assert(Predef.scala:179)
        at org.apache.spark.repl.SparkIMain.initializeSynchronous(SparkIMain.scala:247)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:990)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Malposition answered 8/10, 2015 at 2:45 Comment(3)
Where have you set SPARK_HOME? In your .bashrc? The error you got occurs because SPARK_HOME is not set, so spark-shell tries to derive it from dirname. Unpen
What should I set my SPARK_HOME to? Should it be set to export SPARK_HOME=/usr/local/Cellar/apache-spark/2.2.0/bin? Karlsruhe
I don't think the issue is SPARK_HOME. An incorrect SPARK_HOME will cause the spark-shell script to fail to find spark-submit. However, I'm seeing the same error on my machine both when I ensure that SPARK_HOME is set correctly and when I call "spark-submit --class org.apache.spark.repl.Main" directly. Aggrieved
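On the question in the comments about what SPARK_HOME should point to: it is conventionally the root of the Spark installation, not its bin directory; bin is then added to PATH separately. A minimal sketch (the Homebrew libexec path below is an assumption about that particular layout; point it at wherever your Spark distribution actually lives):

# SPARK_HOME is the Spark installation root, not the bin directory (path below is a guess for a Homebrew install)
export SPARK_HOME=/usr/local/Cellar/apache-spark/2.2.0/libexec
# put the launcher scripts (spark-shell, spark-submit) on the PATH
export PATH=$SPARK_HOME/bin:$PATH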

I was having the same problem running Spark, but I found it was my fault for not configuring Scala properly. Make sure you have Java, Scala, and sbt installed and that Spark is built:

Edit your .bashrc file: vim .bashrc

Set your env variables:

export JAVA_HOME=/usr/lib/jvm/java-7-oracle
export PATH=$JAVA_HOME/bin:$PATH

export SCALA_HOME=/usr/local/src/scala/scala-2.11.5
export PATH=$SCALA_HOME/bin:$PATH

export SPARK_HOME=/usr/local/src/apache/spark.2.0.0/spark
export PATH=$SPARK_HOME/bin:$PATH

Source your settings: . .bashrc

Check Scala: scala -version

Make sure the REPL starts: scala

If your REPL starts, try starting your Spark shell again: ./path/to/spark/bin/spark-shell

You should get the Spark REPL.
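If the shell still fails after this, a quick end-to-end check can confirm each piece is the one you configured. This is only a sketch using the variables exported above; adjust the paths to your own installation:

# confirm the tools resolve to the versions you set up
java -version
scala -version
echo $SPARK_HOME
# launch the shell directly from SPARK_HOME to rule out PATH problems
$SPARK_HOME/bin/spark-shell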

Stenograph answered 13/9, 2016 at 7:48 Comment(0)

You could try running

spark-shell -usejavacp

It didn't work for me, but it did work for some commenters on Spark issue 18778.
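If the bare flag is ignored, the same hint can be passed to the driver JVM as a system property instead. This is only a sketch of that variant, not something the issue confirms for every setup: -Dscala.usejavacp=true is the system-property form of the Scala REPL's -usejavacp switch, and --driver-java-options is spark-shell's standard way to hand options to the driver JVM.

# tell the Scala REPL running inside the driver to use the Java classpath
spark-shell --driver-java-options "-Dscala.usejavacp=true"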

Aggrieved answered 5/10, 2017 at 22:14 Comment(0)

Have you installed Scala and sbt?
The log says the Scala runtime classes could not be found.
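If it helps, a quick check that both are installed and on the PATH (plain shell, nothing Spark-specific assumed):

# each name should resolve to a path; scala -version prints the compiler version
which scala sbt
scala -version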

Tilt answered 5/11, 2015 at 2:41 Comment(1)
Do you think it is caused by sbt and scala not being on the PATH? Malposition
