How do I integrate Spark 2.2 with Hadoop 3.1 manually?
I want to use Spark version 2.2 with the latest Hadoop version, 3.1. Can I integrate Spark and Hadoop manually?

I have already installed Spark 2.2 prebuilt for "Hadoop 2.6 or later", but I want to upgrade Hadoop. Is it possible to point the Spark-with-Hadoop build at a separate Hadoop installation directory?

I have downloaded Spark 2.2 "without Hadoop" and Hadoop 3.1.0. Both are running, but when I configure the spark-env.sh.template file with HADOOP_HOME, it's not working. Can anyone share a proper configuration?
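For context, the "without Hadoop" Spark builds don't pick up Hadoop classes from HADOOP_HOME alone; they read the SPARK_DIST_CLASSPATH variable, which Spark's "Hadoop Free Build" documentation says can be populated from `hadoop classpath`. A minimal sketch of what conf/spark-env.sh could contain (the install path here is hypothetical, and the file must be renamed from spark-env.sh.template first):

```shell
# conf/spark-env.sh (copied from conf/spark-env.sh.template)

# Hypothetical Hadoop 3.1.0 install location -- adjust to your setup.
export HADOOP_HOME=/opt/hadoop-3.1.0
export PATH="$HADOOP_HOME/bin:$PATH"

# Tell the "without Hadoop" Spark build where the Hadoop jars are.
# `hadoop classpath` prints the classpath of the Hadoop installation.
export SPARK_DIST_CLASSPATH="$("$HADOOP_HOME/bin/hadoop" classpath)"
```

After sourcing this, spark-shell and spark-submit should load Hadoop 3.1 classes from that classpath; whether Spark 2.2 is actually compatible with Hadoop 3.x at runtime is a separate question (see the SPARK-23534 comment below the question).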

Thanks

Conjuration asked 19/4, 2018 at 11:29. Comments (4):
Did you rename spark-env.sh.template to spark-env.sh? - Ditty
No, I will change it now, thanks @Ditty. - Conjuration
What was the outcome? - Zeidman
I believe you need to wait until issues.apache.org/jira/browse/SPARK-23534 is implemented. - Phycomycete
