Spark 1.5.1 + Scala 2.10 + Kafka + Cassandra = Java.lang.NoSuchMethodError:
I want to connect Kafka and Cassandra to Spark 1.5.1.

The versions of the libraries:

scalaVersion := "2.10.6"

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-streaming_2.10" % "1.5.1",
  "org.apache.spark" % "spark-streaming-kafka_2.10" % "1.5.1",
  "com.datastax.spark" % "spark-cassandra-connector_2.10" % "1.5.0-M2"
)
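Before fighting the merge strategies, it can help to see which Guava version sbt actually resolves versus what Spark bundles. A sketch, assuming sbt 0.13.6+ (which provides the built-in `evicted` task) and that the commands are run from the project root:

```shell
# List dependency evictions, including which Guava version sbt selected
sbt evicted

# Show the resolved runtime classpath and filter for Guava jars
sbt "show compile:dependencyClasspath" | grep -i guava
```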

The initialization in the app:

   val sparkConf = new SparkConf(true)
      .setMaster("local[2]")
      .setAppName("KafkaStreamToCassandraApp")
      .set("spark.executor.memory", "1g")
      .set("spark.cores.max", "1")
      .set("spark.cassandra.connection.host", "127.0.0.1")

The schema is created in Cassandra like this:

  CassandraConnector(sparkConf).withSessionDo { session =>
      session.execute(s"DROP KEYSPACE IF EXISTS kafka_streaming")
      session.execute(s"CREATE KEYSPACE IF NOT EXISTS kafka_streaming WITH REPLICATION = {'class': 'SimpleStrategy', 'replication_factor': 1 }")
      session.execute(s"CREATE TABLE IF NOT EXISTS kafka_streaming.wordcount (word TEXT PRIMARY KEY, count COUNTER)")
      session.execute(s"TRUNCATE kafka_streaming.wordcount")
    }
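For context, the streaming part of the app looks roughly like the sketch below. The topic name "words", the ZooKeeper address, and the consumer group are illustrative assumptions, not from the original code; `sparkConf` is the one defined above.

```scala
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils
import com.datastax.spark.connector.SomeColumns
import com.datastax.spark.connector.streaming._

val ssc = new StreamingContext(sparkConf, Seconds(5))

// Receiver-based Kafka stream; values are the message payloads
val lines = KafkaUtils
  .createStream(ssc, "localhost:2181", "wordcount-group", Map("words" -> 1))
  .map(_._2)

// Per-batch word counts; the COUNTER column accumulates across batches on write
val counts = lines.flatMap(_.split("\\s+")).map(w => (w, 1L)).reduceByKey(_ + _)
counts.saveToCassandra("kafka_streaming", "wordcount", SomeColumns("word", "count"))

ssc.start()
ssc.awaitTermination()
```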

When building the jar, I also define a few merge strategies:

assemblyMergeStrategy in assembly := {
  case PathList("com", "esotericsoftware", xs@_*) => MergeStrategy.last
  case PathList("com", "google", xs@_*) => MergeStrategy.first
  case PathList("org", "apache", xs@_*) => MergeStrategy.last
  case PathList("io", "netty", xs@_*) => MergeStrategy.last
  case PathList("com", "codahale", xs@_*) => MergeStrategy.last
  case PathList("META-INF", "io.netty.versions.properties") => MergeStrategy.first
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}

I think the issue is connected with

  case PathList("com", "google", xs@_*) => MergeStrategy.first

I tried using MergeStrategy.last instead.

Any ideas?

I get this exception:

Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.reflect.TypeToken.isPrimitive()Z
        at com.datastax.driver.core.TypeCodec.<init>(TypeCodec.java:142)
        at com.datastax.driver.core.TypeCodec.<init>(TypeCodec.java:136)
        at com.datastax.driver.core.TypeCodec$BlobCodec.<init>(TypeCodec.java:609)
        at com.datastax.driver.core.TypeCodec$BlobCodec.<clinit>(TypeCodec.java:606)
        at com.datastax.driver.core.CodecRegistry.<clinit>(CodecRegistry.java:147)
        at com.datastax.driver.core.Configuration$Builder.build(Configuration.java:259)
        at com.datastax.driver.core.Cluster$Builder.getConfiguration(Cluster.java:1135)
        at com.datastax.driver.core.Cluster.<init>(Cluster.java:111)
        at com.datastax.driver.core.Cluster.buildFrom(Cluster.java:178)
        at com.datastax.driver.core.Cluster$Builder.build(Cluster.java:1152)
        at com.datastax.spark.connector.cql.DefaultConnectionFactory$.createCluster(CassandraConnectionFactory.scala:85)
        at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:155)
Hamulus answered 7/11, 2015 at 13:16 Comment(7)
This is your whole code? – Taoism
Sorry, I have extended it; could you please check it? – Hamulus
How are you building your app? Are you using the assembly plugin? – Taoism
Yes, sbt clean assembly – Hamulus
It seems like you are using an incompatible version while merging on your com.google path-list. – Taoism
Can you provide the imports of your code (or running code) so we can try to reproduce it? – Politi
Added "com.google.guava" % "guava" % "19.0-rc2" manually; I have the same issue. – Hamulus
Based on the error

 [error] /home/user/.ivy2/cache/org.apache.spark/spark-network-common_2.10/jars/spark-network-common_2.10-1.5.0.jar:com/google/common/base/Optional.class
 [error] /home/user/.ivy2/cache/com.google.guava/guava/bundles/guava-16.0.1.jar:com/google/common/base/Optional.class

The latter is the newer one, so maybe you can put:

case PathList("com", "google", "common", "base", xs@_*) => MergeStrategy.last
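If picking one copy with a merge strategy is not enough, another common workaround is to shade Guava inside the assembly so the Cassandra driver gets its own copy regardless of what Spark bundles. A sketch, assuming sbt-assembly 0.14+; the `shadeguava` prefix is an illustrative name:

```scala
// Rename the bundled Guava classes so they cannot clash with Spark's older copy
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.google.common.**" -> "shadeguava.@1").inAll
)
```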
Politi answered 7/11, 2015 at 20:44 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.