I just installed PySpark 2.2.0 using conda (Python v3.6 on Windows 7 64-bit, Java v1.8):
$conda install pyspark
It downloaded and seemed to install correctly with no errors. But now when I run pyspark
from the command line, it just tells me "The system cannot find the path specified."
$pyspark
The system cannot find the path specified.
The system cannot find the path specified.
I tried adding the pyspark directory to my PATH environment variable, but that still didn't seem to work; maybe I'm giving it the wrong path? (Roughly what I set is shown below.) Can anyone please advise? Does the Java path also need to be specified in the PATH environment variable, or something like that? Thanks.
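
For reference, this is roughly what I tried setting from the command prompt. The pyspark location is just my guess at where conda put it inside the Anaconda site-packages folder, so that part may well be wrong:

$rem the SPARK_HOME path below is only a guess at the conda install location
$set SPARK_HOME=C:\Users\myuser\Anaconda3\Lib\site-packages\pyspark
$set PATH=%PATH%;%SPARK_HOME%\bin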