I have a Scala Spark application and want to invoke a PySpark/Python script (pyspark_script.py) from it for further processing.
There are plenty of resources on calling Java/Scala code from Python, but I am looking for the other direction: Scala -> PySpark.
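For context, the only working handoff I have so far is shelling out to spark-submit from the Scala job (rough sketch below, using the same script path as in my project). This launches the Python script as a completely separate application with its own SparkContext, which is exactly what I want to avoid:

import scala.sys.process._

// Launch the PySpark script as a separate spark-submit application.
// The Python side builds its own SparkContext, so nothing is shared with this JVM.
val exitCode = Seq("spark-submit", "path/to/pyscript/mypysparkscript.py").!
println(s"pyspark script exited with code $exitCode")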
I explored Jython, which lets Scala/Java embed Python code, as follows:
import org.python.util.PythonInterpreter

PythonInterpreter.initialize(System.getProperties, new java.util.Properties(), Array.empty[String])
val pi = new PythonInterpreter()
pi.execfile("path/to/pyscript/mypysparkscript.py")
When I run this, I get an error that says: "ImportError: No module named pyspark". (I suspect this is because Jython runs inside the JVM with its own sys.path and does not see the pyspark package installed in my CPython environment.)
Is there any way for Scala Spark to talk to PySpark using the same SparkContext/SparkSession?
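For reference, the closest built-in mechanism I have found is RDD.pipe, which streams the records of an RDD through an external process over stdin/stdout, but the Python side only ever sees plain text lines, never the session itself. A minimal sketch (the stdin-reading script name is just a placeholder):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("scala-to-python").getOrCreate()

// pipe() forks the given command for each partition, feeds the partition's records
// to its stdin, and turns whatever the command prints to stdout into a new RDD.
val piped = spark.sparkContext
  .parallelize(Seq("a", "b", "c"))
  .pipe("python path/to/pyscript/stdin_script.py") // placeholder stdin/stdout script
piped.collect().foreach(println)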