How to set Spark application exit status?
I'm writing a Spark application and running it with the spark-submit shell script (in yarn-cluster/yarn-client mode).

As far as I can tell, the exit code of spark-submit is derived from the final status of the associated YARN application: 0 if the status is SUCCEEDED, 1 otherwise.

I'd like the option to return a different exit code, for the case where my application succeeded but with some errors.

Is that possible? Can a different exit code be returned from the application?

I tried using System.exit(), but it didn't work.

Thanks.
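For reference, the exit-code mapping described in the question can be observed from the launching shell. A minimal sketch (`sh -c 'exit 1'` stands in for a spark-submit run whose YARN final status was not SUCCEEDED, so the snippet runs anywhere):

```shell
# Stand-in for: spark-submit --master yarn --deploy-mode cluster app.jar
# (simulates a run whose YARN final status was not SUCCEEDED)
sh -c 'exit 1'
code=$?
case $code in
  0) echo "YARN final status SUCCEEDED -> spark-submit exits 0" ;;
  *) echo "YARN reported failure -> spark-submit exits $code" ;;
esac
```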

Merrymerryandrew answered 31/1, 2017 at 15:24 Comment(2)
The answer posted in this question might help you: #45428645 – Marola
Are you still getting this error? What number are you passing in? – Sadye

It is possible in client mode but not in cluster mode; there is a workaround for cluster mode.

My answer to this question should help you.
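As this answer notes, in yarn-client mode the driver runs inside the spark-submit process, so a System.exit(code) in the driver becomes spark-submit's own exit code. A launcher sketch (a shell function stands in for a real yarn-client spark-submit so the snippet runs anywhere; the custom code 3 is an arbitrary example for "succeeded with some errors"):

```shell
run_job() {
  # Stand-in for: spark-submit --master yarn --deploy-mode client app.jar
  # Pretend the driver ended with System.exit(3): succeeded with some errors.
  return 3
}
run_job
status=$?
if [ "$status" -ne 0 ] && [ "$status" -ne 1 ]; then
  echo "application finished with custom status $status"
fi
# a real wrapper would then: exit "$status"
```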

Mutism answered 3/2, 2017 at 7:46 Comment(1)
Read that, and I still don't understand how using yarn-client helps in this case: we'll still get the YARN status. – Merrymerryandrew

If you run in cluster mode, spark-submit returns immediately, printing the submission ID as part of a JSON response, and does not wait for the application status. You can then query the status with

 spark-submit --status [submission ID] 

If you run in local or standalone mode, you should be able to get the exit code from the spark-submit process itself.
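A sketch of capturing that submission ID for a later --status query. The exact field name ("submissionId") is an assumption based on the standalone cluster-mode REST response, and a captured sample reply stands in for a live submission so the snippet runs anywhere:

```shell
# Sample of the JSON reply that spark-submit prints in cluster mode
# (field name "submissionId" assumed; a real script would capture
# the output of the actual spark-submit invocation instead).
reply='{ "action" : "CreateSubmissionResponse", "submissionId" : "driver-20190315232200-0001", "success" : true }'
id=$(printf '%s\n' "$reply" | grep -o '"submissionId" *: *"[^"]*"' | cut -d'"' -f4)
echo "$id"
# then poll it with: spark-submit --status "$id"
```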

Archiplasm answered 15/3, 2019 at 23:22 Comment(0)
