I have the following piece of code in Spark:
rdd
  .map(processFunction(_))
  .saveToCassandra("keyspace", "tableName")
where processFunction is defined as:
def processFunction(src: String): Seq[Any] =
  src match {
    case "a" => List(A("a", 123112, "b"), A("b", 142342, "c"))
    case "b" => List(B("d", 12312, "e", "f"), B("g", 12312, "h", "i"))
  }
and the case classes are defined as:
case class A(entity: String, time: Long, value: String)
case class B(entity: String, time: Long, value1: String, value2: String)
saveToCassandra expects a collection of objects, but using Seq[Any] as the return type (so that it can hold both Seq[A] and Seq[B]) breaks saveToCassandra with the exception:

scala.ScalaReflectionException: <none> is not a term

What could be the reason for this behaviour?
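For what it's worth, the same exception can be reproduced outside Spark with plain Scala runtime reflection. The field lookup below is only a hypothetical stand-in for what I assume the connector's reflection-based row writer does when it maps column names onto the element type; with Any there is no such member, the lookup yields NoSymbol, and forcing NoSymbol to a term throws exactly this exception:

```scala
import scala.reflect.runtime.universe._

case class A(entity: String, time: Long, value: String)

// On the concrete case class, the column-named field resolves to a real term symbol:
val entityOnA = typeOf[A].decl(TermName("entity"))
println(entityOnA.isTerm)  // true

// On Any the lookup finds nothing: decl returns NoSymbol
val entityOnAny = typeOf[Any].decl(TermName("entity"))
println(entityOnAny == NoSymbol)  // true

// Forcing NoSymbol into a term throws:
// scala.ScalaReflectionException: <none> is not a term
// entityOnAny.asTerm
```

So my suspicion is that erasing the element type to Any hides the case-class structure from whatever reflection the connector performs, but I'd like confirmation of the actual mechanism.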