I was trying to execute a basic Spark Streaming example in the Scala IDE, but I am getting the error below:
Error: Could not find or load main class org.test.spark.streamExample.
Could anyone help me sort this out, please?
Right-click on your project and go to Properties, where you will find Scala Compiler. Change the target to jvm-1.7 (or whatever matches your installation), and also change the Scala Installation drop-down to the version you have installed.
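If you would rather keep this setting in the build than in the IDE, a minimal sketch in sbt (assuming an sbt project on Scala 2.11 or earlier, where the jvm-1.7 target still exists):
scalacOptions += "-target:jvm-1.7"  // emit bytecode for Java 7; match this to your installed JVM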
This error may occur for two reasons:
1. You did not write a main method inside an object in the Scala program, e.g. (the object name here is illustrative):
object StreamExample {
  def main(args: Array[String]): Unit = {
    println("Test")
  }
}
2. You added an unsupported jar to the build path of the Scala project.
Please check the above two scenarios.
If your Scala file was imported from an external source, check the very top of the code in the main file and confirm that the package declaration matches your project's package structure.
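For example (the file path, package, and object name here are hypothetical), a file saved at src/main/scala/org/test/spark/StreamExample.scala should start with a matching package declaration:
// hypothetical names: the package must mirror the directory path under src/main/scala
package org.test.spark

object StreamExample {
  def main(args: Array[String]): Unit =
    println("streaming example")
}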
The problem occurs mostly due to an incompatible version. Try to build Spark with the version of Scala you are using.
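A minimal sketch of that, assuming an sbt build (the versions are illustrative): keep scalaVersion in line with the Scala suffix of the Spark artifacts you depend on, e.g. Scala 2.12 with Spark artifacts published for 2.12:
// illustrative versions: the Scala binary version (2.12) must match the
// _2.12 suffix of the published Spark artifacts
scalaVersion := "2.12.8"
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "2.4.0"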
Another solution to this problem:
Right-click on the project => Properties => Scala Compiler => Scala Installation => from the drop-down, select the correct version of Scala (preferably a lower version; if none fits, install a lower version of Scala and repeat these steps).
Error: Could not find or load main class: I faced this same problem for more than a day. I reinstalled the IDE (IntelliJ) and changed JDK 11 to JDK 8, but nothing worked. I finally resolved it by adding the two dependencies below.
Solution:
Add both the spark-core and spark-sql dependencies to build.sbt:
1).
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.0"
2).
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.0"
I copied both of these dependencies from these links:
https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.12/2.4.0
https://mvnrepository.com/artifact/org.apache.spark/spark-sql_2.12/2.4.0
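For reference, a minimal complete build.sbt along these lines might look like the following (the project name and versions are illustrative):
// illustrative project metadata
name := "streamExample"
version := "0.1"
scalaVersion := "2.12.8"

// %% appends the Scala binary version (_2.12) to the artifact names
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.4.0",
  "org.apache.spark" %% "spark-sql"  % "2.4.0"
)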
The main reason for such an error is that compilation errors exist, whether from a build-path issue or a code issue. Check for the issue in the Problems tab.
Other than this, check that the Scala library version in your POM matches the Scala Installation version in the Scala compiler settings.
If you are using the IntelliJ IDE, go to the drop-down at the top right of the IDE window and select "Edit Configurations".
On the left side of the pop-up, select "Scala Console", and on the right side, select your module name under "Use classpath and SDK of module".
In my scenario:
Step 1: The application was built with sbt.
Step 2: The application was imported into Eclipse.
Step 3: Running the application produced the error above.
Solution:
Check the Problems tab: in my case, sbt was using Scala 2.11 while Eclipse had the Scala compiler set to 2.12. Setting the Eclipse Scala compiler version to 2.11 made it work.
Please check this possibility.
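A minimal sketch, assuming an sbt build (the patch version is illustrative): pin scalaVersion in build.sbt so the build and the Eclipse compiler setting agree:
// pin the Scala version the project is actually built with
scalaVersion := "2.11.12"
You can confirm what sbt is using by running show scalaVersion in the sbt shell.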
If you are on Linux and using only a text editor and a terminal with scalac and scala, make sure that you have set $SCALA_HOME and $PATH:
export SCALA_HOME=/usr/local/share/scala
export PATH=$PATH:$SCALA_HOME/bin
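With those variables exported, compiling and running from the terminal looks like this (the file and object names are hypothetical, for a file with no package declaration):
scalac StreamExample.scala   # compiles to StreamExample*.class in the current directory
scala StreamExample          # runs the object's main method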