Installation Info :-
Hadoop version :- 2.6.5, Spark version :- 2.1.0, with Kerberos authentication enabled.
I am trying to create a Spark context in YARN mode with Kerberos authentication, but I get the exception below.
Code :-
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;
import org.apache.spark.SparkConf;
import org.apache.spark.sql.SparkSession;

public static void main(String[] args) {
    // Point Hadoop at the local HDFS and enable Kerberos security
    Configuration conf = new Configuration();
    conf.set("fs.defaultFS", "hdfs://localhost:9000");
    conf.set("hadoop.security.authentication", "kerberos");
    conf.set("hadoop.security.authorization", "true");
    UserGroupInformation.setConfiguration(conf);
    System.out.println("Security enabled " + UserGroupInformation.isSecurityEnabled());

    // Build a SparkSession against YARN
    SparkConf sparkConf = new SparkConf().setAppName("Spark shell").setMaster("yarn");
    SparkSession sparkSession = SparkSession.builder().config(sparkConf).getOrCreate();
    System.out.println(sparkSession.version() + " : " + sparkSession.sparkContext());
}
It prints that security is enabled :-
Security enabled true
Exception :-
org.apache.hadoop.security.AccessControlException: SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.yarn.ipc.RPCUtil.instantiateException(RPCUtil.java:53)
Please advise how to resolve this error.
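From what I have read, setting hadoop.security.authentication in the Configuration only tells the client that the cluster is secured; it does not actually obtain Kerberos credentials, so an explicit login may be the missing step. A minimal sketch of what I believe is needed before building the SparkSession (the principal user@EXAMPLE.COM and the keytab path are placeholders for my setup):

    // Assumption: an explicit keytab login is required before the first RPC to the cluster.
    // "user@EXAMPLE.COM" and the keytab path are placeholders.
    UserGroupInformation.setConfiguration(conf);
    UserGroupInformation.loginUserFromKeytab("user@EXAMPLE.COM", "/etc/security/keytabs/user.keytab");
    System.out.println("Logged in as " + UserGroupInformation.getCurrentUser());

(loginUserFromKeytab throws IOException, so main would need to declare or handle it.) Alternatively, if I understand the docs correctly, Spark on YARN can be given spark.yarn.principal and spark.yarn.keytab in the SparkConf and will perform the login itself.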
Set HADOOP_CONF_DIR so that the Hadoop libs find their XML config files. – Theodora
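In case it helps others, my understanding is that the same effect as exporting HADOOP_CONF_DIR can be approximated in code by loading the cluster's XML files explicitly; the /etc/hadoop/conf path below is a placeholder for wherever the cluster configs actually live:

    import org.apache.hadoop.fs.Path;

    // Assumption: the cluster config files live under /etc/hadoop/conf (placeholder path).
    // Loading them explicitly is an in-code alternative to exporting HADOOP_CONF_DIR.
    Configuration conf = new Configuration();
    conf.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
    conf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));
    conf.addResource(new Path("/etc/hadoop/conf/yarn-site.xml"));
    UserGroupInformation.setConfiguration(conf);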