JSON4s can't find constructor with Spark
I've run into an issue when attempting to parse JSON in my Spark job. I'm using Spark 1.1.0, json4s, and the Cassandra Spark Connector with DSE 4.6. The exception thrown is:

org.json4s.package$MappingException: Can't find constructor for BrowserData
    org.json4s.reflect.ScalaSigReader$.readConstructor(ScalaSigReader.scala:27)
    org.json4s.reflect.Reflector$ClassDescriptorBuilder.ctorParamType(Reflector.scala:108)
    org.json4s.reflect.Reflector$ClassDescriptorBuilder$$anonfun$6.apply(Reflector.scala:98)
    org.json4s.reflect.Reflector$ClassDescriptorBuilder$$anonfun$6.apply(Reflector.scala:95)
    scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)

My code looks like this:

case class BrowserData(navigatorObjectData: Option[NavigatorObjectData],
                       flash_version: Option[FlashVersion],
                       viewport: Option[Viewport],
                       performanceData: Option[PerformanceData])

// ... other case classes

def parseJson(b: Option[String]): Option[String] = {
  implicit val formats = DefaultFormats
  for {
    browserDataStr <- b
    browserData = parse(browserDataStr).extract[BrowserData]
    navObject <- browserData.navigatorObjectData
    userAgent <- navObject.userAgent
  } yield userAgent
}
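
Before chasing classpath issues, it can help to confirm that extraction itself is the step that throws. A minimal sketch (assuming the jackson parser backend and the same BrowserData class as above) that uses json4s's extractOpt, so a mapping failure surfaces as None instead of a MappingException:

```scala
import org.json4s._
import org.json4s.jackson.JsonMethods.parse

// Sketch only: extractOpt returns None when mapping fails rather than
// throwing, which narrows down whether parseJson dies on extraction
// or somewhere else in the for-comprehension.
def parseJsonSafe(b: Option[String]): Option[String] = {
  implicit val formats: Formats = DefaultFormats
  for {
    browserDataStr <- b
    browserData    <- parse(browserDataStr).extractOpt[BrowserData]
    navObject      <- browserData.navigatorObjectData
    userAgent      <- navObject.userAgent
  } yield userAgent
}
```

If parseJsonSafe returns None where parseJson throws, the problem is in the reflection step (typically the case class not being a real top-level class), not in the surrounding Spark plumbing.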

def getJavascriptUa(rows: Iterable[com.datastax.spark.connector.CassandraRow]): Option[String] = {
  implicit val formats = DefaultFormats
  rows.collectFirst { case r if r.getStringOption("browser_data").isDefined  =>
    parseJson(r.getStringOption("browser_data"))
  }.flatten
}

def getRequestUa(rows: Iterable[com.datastax.spark.connector.CassandraRow]): Option[String] = {
  rows.collectFirst { case r if r.getStringOption("ua").isDefined  =>
    r.getStringOption("ua")
  }.flatten
}
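
As an aside, the collectFirst-with-isDefined-then-flatten pattern in the two helpers above collapses to flatMap plus headOption. A minimal sketch, using Map[String, String] as a hypothetical stand-in for CassandraRow, with get playing the role of getStringOption:

```scala
// Sketch: "first row where this column is defined" without the
// collectFirst/isDefined/flatten dance. Map[String, String] is a
// stand-in for CassandraRow here (an assumption for illustration).
def firstDefined(rows: Iterable[Map[String, String]], column: String): Option[String] =
  rows.flatMap(_.get(column)).headOption

val rows = Seq(Map("x" -> "1"), Map("ua" -> "Mozilla/5.0"), Map("ua" -> "curl/7.1"))
firstDefined(rows, "ua") // Some(Mozilla/5.0)
```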

def checkUa(rows: Iterable[com.datastax.spark.connector.CassandraRow], sessionId: String): Option[Boolean] = {
  for {
    jsUa <- getJavascriptUa(rows)
    reqUa <- getRequestUa(rows)
  } yield (jsUa == reqUa)
}

def run(name: String) = {
  val rdd = sc.cassandraTable("beehive", name).groupBy(r => r.getString("session_id"))
  val counts = rdd.map(r => checkUa(r._2, r._1))
  counts
}

I use :load to load the file into the REPL and then call the run function. The failure happens in the parseJson function, as far as I can tell. I've tried a variety of things to get this to work. From similar posts, I've made sure my case classes are at the top level of the file. I've also tried compiling just the case class definitions into a jar and including that jar like this: /usr/bin/dse spark --jars case_classes.jar

I've tried adding them to the conf like this: sc.getConf.setJars(Seq("/home/ubuntu/case_classes.jar"))

And I still get the same error. Should I compile all of my code into a jar? Is this a Spark issue or a json4s issue? Any help is appreciated.
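
For reference, the precompiled-jar route looks roughly like this (the file name CaseClasses.scala and the paths are illustrative, not from the original setup): keep the case classes alone in a plain top-level file so they aren't wrapped in REPL-generated objects, compile them, and hand the jar to the shell at startup.

```shell
# Illustrative sketch: compile only the case class definitions into a jar,
# so json4s's reflection sees real top-level classes rather than classes
# nested inside the REPL's wrapper objects ...
scalac -d case_classes.jar CaseClasses.scala
# ... then put that jar on the classpath of both the driver and executors.
/usr/bin/dse spark --jars case_classes.jar
```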

Colombia asked 16/4, 2015 at 6:02 Comment(3)
Does your data match the case class structure? Does it work in isolation? Do you have a unit test for it? (Mathildamathilde)
Is your case class top level? github.com/json4s/json4s/issues/143 (Meteoric)
I ran into this today as well; it turned out to be a difference between my case class properties and the JSON properties. (Prochronism)
