I'm using Spark 1.6.0 on my Cloudera VM. I'm trying to insert some data into a Hive table from the Spark shell, and for that I want to use SparkSession. But the import below fails:
scala> import org.apache.spark.sql.SparkSession
<console>:33: error: object SparkSession is not a member of package org.apache.spark.sql
import org.apache.spark.sql.SparkSession
Without that import, this statement cannot be executed either:
val spark = SparkSession.builder
  .master("local[2]")
  .enableHiveSupport()
  .config("hive.exec.dynamic.partition", "true")
  .config("hive.exec.dynamic.partition.mode", "nonstrict")
  .config("spark.sql.warehouse.dir", warehouseLocation)
  .config("hive.metastore.warehouse.dir", "/user/hive/warehouse")
  .getOrCreate()
<console>:33: error: not found: value SparkSession
val spark = SparkSession.builder.master("local[2]").enableHiveSupport().config("hive.exec.dynamic.partition","true").config("hive.exec.dynamic.partition.mode", "nonstrict").config("spark.sql.warehouse.dir", warehouseLocation).config("hive.metastore.warehouse.dir","/user/hive/warehouse").getOrCreate()
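For reference, I know Spark 1.x exposes Hive support through HiveContext instead. A minimal fallback sketch, assuming the shell's predefined SparkContext sc, would look like this, though I'd still prefer the SparkSession builder API:

// Spark 1.x API: HiveContext provides Hive support without SparkSession
import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)
// The same dynamic-partition settings, set through the 1.x config API
hiveContext.setConf("hive.exec.dynamic.partition", "true")
hiveContext.setConf("hive.exec.dynamic.partition.mode", "nonstrict")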
Can anyone tell me what I'm doing wrong here?
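In case it's relevant, the kind of statement I eventually want to run through that session is roughly this (the database, table, and column names are just placeholders):

// Hypothetical dynamic-partition insert; mydb.emp and staging_emp are placeholders
spark.sql("""
  INSERT INTO TABLE mydb.emp PARTITION (dt)
  SELECT id, name, dt FROM staging_emp
""")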