Our Spark job runs on a Spark cluster, but while a Spark executor is running the job I see a ClassNotFoundException. I am sure the JAR that contains the class has been loaded, so I don't understand why this exception occurs. Is the class loader trying to load the class from a different JAR? If so, is there a way to find out which JAR it is looking in?
How to find from which JAR a class loader is trying to load a specific class?
Can you share your spark-submit command? –
Scopas
And the error logs too. –
Moriarty
Is it running on the classpath, in a module, or both? Any package on the classpath that collides with a package in a module is disregarded. –
Trainband
You can start your application with the -verbose:class JVM option and watch the class-loading log: it prints each class as it is loaded, together with the JAR it was loaded from.
How to Use Verbose Options in Java
[Opened C:\Program Files\Java\jdk1.7.0_04\jre\lib\rt.jar]
[Loaded java.lang.Object from C:\Program Files\Java\jdk1.7.0_04\jre\lib\rt.jar]
[Loaded java.io.Serializable from C:\Program Files\Java\jdk1.7.0_04\jre\lib\rt.jar]
[Loaded java.lang.Comparable from C:\Program Files\Java\jdk1.7.0_04\jre\lib\rt.jar]
[Loaded java.lang.CharSequence from C:\Program Files\Java\jdk1.7.0_04\jre\lib\rt.jar]
..............................................................................
..............................................................................
..............................................................................
[Loaded java.lang.Void from C:\Program Files\Java\jdk1.7.0_04\jre\lib\rt.jar]
[Loaded java.lang.Shutdown from C:\Program Files\Java\jdk1.7.0_04\jre\lib\rt.jar]
[Loaded java.lang.Shutdown$Lock from C:\Program Files\Java\jdk1.7.0_04\jre\lib\rt.jar]
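On a Spark cluster, the flag has to reach the executor JVMs, not just the driver; one way is spark-submit's --conf "spark.executor.extraJavaOptions=-verbose:class", after which the executor stdout logs contain lines like the ones above. As a complementary check, you can also ask the JVM directly where a given class came from through its ProtectionDomain. A minimal sketch (the class and method names here are illustrative, not part of any Spark API):

```java
public class WhichJar {
    // Returns the location (JAR file or class directory) a class was loaded
    // from, or a placeholder for bootstrap classes, which have no code source.
    static String locationOf(Class<?> cls) {
        java.security.CodeSource cs = cls.getProtectionDomain().getCodeSource();
        return cs == null ? "bootstrap class loader" : cs.getLocation().toString();
    }

    public static void main(String[] args) {
        // java.lang.String is loaded by the bootstrap loader: no code source.
        System.out.println("String       -> " + locationOf(String.class));
        // Our own class reports the JAR or directory it was loaded from.
        System.out.println("WhichJar     -> " + locationOf(WhichJar.class));
    }
}
```

Calling locationOf on the class that triggers the ClassNotFoundException (or on a neighbouring class from the same JAR) inside the executor tells you which artifact the executor's class loader actually resolved, which you can then compare against the JAR you expected.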