I am trying to run one of my Scala files from spark-shell. This file calls some other jar files which have already been loaded into the Spark context.
The problem is that if something fails, only part of the stack trace is printed. Is there any way I can enable the whole stack trace?
myclassn: ClassifyFields : queryDb -> Problems extracting from DB
at myclass.queryDb(ClassifyFields.java:231)
at myclass.getColumnsWithKeys(ClassifyFields.java:258)
... 78 elided
Use a try/catch block, and inside the catch block write the stack trace to a file. – Ingesta