Is it possible to use json4s 3.2.11 with Spark 1.3.0?
Spark has a dependency on json4s 3.2.10, but that version has several bugs and I need to use 3.2.11. I added the json4s-native 3.2.11 dependency to build.sbt and everything compiled fine, but when I spark-submit my JAR, 3.2.10 is used at runtime.

build.sbt

import sbt.Keys._

name := "sparkapp"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core"  % "1.3.0" % "provided"

libraryDependencies += "org.json4s" %% "json4s-native" % "3.2.11"`

plugins.sbt

logLevel := Level.Warn

resolvers += Resolver.url("artifactory", url("http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")

App1.scala

import org.apache.spark.{Logging, SparkConf, SparkContext}

object App1 extends Logging {
  def main(args: Array[String]) = {
    val conf = new SparkConf().setAppName("App1")
    val sc = new SparkContext(conf)
    println(s"json4s version: ${org.json4s.BuildInfo.version.toString}")
  }
}

sbt 0.13.7 + sbt-assembly 0.13.0, Scala 2.10.4

Is there a way to force 3.2.11 version usage?

Crimpy answered 23/3, 2015 at 17:35 Comment(0)

We ran into a problem similar to the one Necro describes, but downgrading from 3.2.11 to 3.2.10 when building the assembly jar did not resolve it. We ended up solving it (on Spark 1.3.1) by shading the 3.2.11 classes in the job's assembly jar, so our code uses the renamed copy while Spark keeps its own 3.2.10 on the classpath:

assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("org.json4s.**" -> "shaded.json4s.@1").inAll
)
Clemens answered 25/9, 2015 at 16:21 Comment(0)

I asked the same question on the Spark user mailing list and got two answers on how to make it work:

  1. Use the spark.driver.userClassPathFirst=true and spark.executor.userClassPathFirst=true options, but this works only in Spark 1.3 and will probably require some other modifications, such as excluding Scala classes from your build (see the spark-submit sketch below).

  2. Rebuild Spark with json4s 3.2.11 (you can change the version in core/pom.xml)

Both work fine; I preferred the second one.
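For the first option, a minimal spark-submit sketch could look like this (the assembly jar path is an assumption based on the build.sbt above, so adjust it to your actual build output):

# tell Spark to prefer the application's classes over its own bundled ones
spark-submit --class App1 \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  target/scala-2.10/sparkapp-assembly-1.0.jar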

Crimpy answered 29/3, 2015 at 9:56 Comment(2)
How do you rebuild Spark with json4s 3.2.11? – Seamaid
Just replace the json4s version in ./core/pom.xml (3.2.10 -> 3.2.11) and then follow the instructions at spark.apache.org/docs/latest/building-spark.html to build it with Maven. – Crimpy
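As a rough sketch of that rebuild (the Maven profiles here are assumptions taken from the linked build instructions, so pick the ones that match your Hadoop setup):

# in the Spark source tree, after changing json4s to 3.2.11 in core/pom.xml
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package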

This is not an answer to your question, but it came up when searching for my problem. I was getting a NoSuchMethod error in formats.emptyValueStrategy.replaceEmpty(value) in json4s's 'render'. The reason was that I was building with 3.2.11 while Spark was linking against 3.2.10. I downgraded to 3.2.10 and my problem went away. Your question helped me understand what was going on (that Spark was linking a conflicting version of json4s), and I was able to resolve the problem, so thanks.
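In build.sbt terms, that downgrade is just the dependency from the question pinned to the version Spark ships with (a sketch, assuming the same setup as above):

libraryDependencies += "org.json4s" %% "json4s-native" % "3.2.10"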

Veta answered 24/3, 2015 at 21:2 Comment(0)
