I want to use Spark 2.2 together with the latest Hadoop release, 3.1. Can I integrate Spark with Hadoop manually?
I originally installed the Spark 2.2 package pre-built for "Hadoop 2.6 and later", but now I want to upgrade Hadoop itself. Is there a separate Hadoop directory inside that pre-built Spark folder that I could swap out?
Since then I have downloaded Spark 2.2 "without Hadoop" and Hadoop 3.1.0. Both run on their own, but when I set HADOOP_HOME in the spark-env.sh.template file, Spark still doesn't pick up Hadoop. Can anyone share the proper configuration? My attempt is sketched below.
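This is roughly what I have tried, as a sketch rather than a known-good setup: the install path is just an example from my machine, and the SPARK_DIST_CLASSPATH line is what I understood from the "Hadoop free" build notes in the Spark documentation. I copied the template to conf/spark-env.sh first, since I assume Spark only reads the copy, not the .template file itself:

    # conf/spark-env.sh -- copied from conf/spark-env.sh.template
    # (the install path below is an example from my machine)
    export HADOOP_HOME=/usr/local/hadoop-3.1.0
    export PATH=$PATH:$HADOOP_HOME/bin

    # Point the "without Hadoop" Spark build at the Hadoop 3.1.0 jars;
    # `hadoop classpath` prints the classpath of the local Hadoop install.
    export SPARK_DIST_CLASSPATH=$(hadoop classpath)

Is this the right mechanism, or is something else needed to make Spark 2.2 use the Hadoop 3.1.0 jars?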
Thanks