I recently started working with Spark, Scala, HDFS, sbt, and Livy. I'm currently trying to create a Livy batch.
Warning: Skip remote jar hdfs://localhost:9001/jar/project.jar.
java.lang.ClassNotFoundException: SimpleApp
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.util.Utils$.classForName(Utils.scala:225)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:686)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
This is the error shown in the Livy batch log.
My spark-submit command works perfectly fine with the local .jar file:
spark-submit --class "SimpleApp" --master local target/scala-2.11/simple-project_2.11-1.0.jar
But the same submission through Livy (via cURL) throws an error:
"requirement failed: Local path /target/scala-2.11/simple-project_2.11-1.0.jar cannot be added to user sessions."
So I moved the .jar file into HDFS. My new Livy request is:
curl -X POST --data '{
  "file": "/jar/project.jar",
  "className": "SimpleApp",
  "args": ["ddd"]
}' \
  -H "Content-Type: application/json" \
  http://server:8998/batches
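For reference, here is the same request rewritten as a small script that sanity-checks the JSON body before POSTing. The payload values (file path, class name, args) and the server URL are copied from the question, not a verified fix; the actual POST is left commented out so nothing is sent accidentally.

```shell
#!/bin/sh
# Livy batch payload, exactly as in the question above.
BODY='{
  "file": "/jar/project.jar",
  "className": "SimpleApp",
  "args": ["ddd"]
}'

# Validate that the body parses as JSON before sending it to Livy;
# a malformed body would fail here instead of producing a confusing
# server-side error.
echo "$BODY" | python3 -m json.tool > /dev/null && echo "payload OK"

# The actual submission (commented out; server URL is the placeholder
# from the question):
# curl -X POST -H "Content-Type: application/json" \
#   --data "$BODY" http://server:8998/batches
```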
This throws the error mentioned above.
Please let me know where I am going wrong.
Thanks in advance!