Spark 1.5.1, Cassandra Connector 1.5.0-M2, Cassandra 2.1, Scala 2.10, NoSuchMethodError guava dependency

I'm new to the Spark environment (and fairly new to Maven), so I'm struggling with how to correctly ship the dependencies my job needs.

It looks like Spark 1.5.1 has a guava-14.0.1 dependency which it tries to use, while the isPrimitive method was only added in Guava 15+. What's the correct way to ensure my uber-jar wins? I've tried spark.executor.extraClassPath in my spark-defaults.conf, to no avail.

This is essentially a duplicate of this question: Spark 1.5.1 + Scala 2.10 + Kafka + Cassandra = java.lang.NoSuchMethodError, but for Maven (I don't have the rep to comment there yet).

Stripped down my dependencies to this:

    <dependency>
        <groupId>com.google.guava</groupId>
        <artifactId>guava</artifactId>
        <version>18.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.commons</groupId>
        <artifactId>commons-compress</artifactId>
        <version>1.10</version>
    </dependency>
    <dependency>
        <groupId>com.esotericsoftware.kryo</groupId>
        <artifactId>kryo</artifactId>
        <version>2.21</version>
    </dependency>
    <dependency>
        <groupId>org.objenesis</groupId>
        <artifactId>objenesis</artifactId>
        <version>2.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.5.0</version>
        <exclusions>
            <exclusion>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-log4j12</artifactId>
            </exclusion>
            <exclusion>
                <groupId>log4j</groupId>
                <artifactId>log4j</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.10</artifactId>
        <version>1.5.0</version>
    </dependency>
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector_2.10</artifactId>
        <version>1.5.0-M2</version>
    </dependency>

Shaded my JAR with all the dependencies using this:

       <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>2.3</version>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                    <configuration>
                        <artifactSet>
                            <excludes>
                                <exclude>org.apache.hadoop:*</exclude>
                                <exclude>org.apache.hbase:*</exclude>
                            </excludes>
                        </artifactSet>
                        <filters>
                            <filter>
                                <artifact>*:*</artifact>
                                <excludes>
                                    <exclude>META-INF/*.SF</exclude>
                                    <exclude>META-INF/*.DSA</exclude>
                                    <exclude>META-INF/*.RSA</exclude>
                                </excludes>
                            </filter>
                            <filter>
                                <artifact>org.apache.spark:spark-network-common_2.10</artifact>
                                <excludes>
                                    <exclude>com.google.common.base.*</exclude>
                                </excludes>
                            </filter>
                        </filters>
                        <transformers>
                            <transformer
                                    implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                                <!-- merge multiple reference.conf files into one -->
                                <resource>reference.conf</resource>
                            </transformer>
                        </transformers>
                    </configuration>
                </execution>
            </executions>
        </plugin>
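
One approach that should force the bundled Guava 18 to win is to relocate it in the shaded jar, so the Cassandra driver's bytecode is rewritten to call the relocated classes instead of whatever Guava version Spark puts on the classpath. Roughly, something like this extra block inside the `<configuration>` above (I haven't added this; the `shaded.` prefix is just an illustrative choice, not something from my build):

    <relocations>
        <relocation>
            <!-- rename the bundled Guava classes and rewrite every reference
                 to them in the other shaded jars, so the Cassandra driver
                 never resolves Spark's Guava 14 -->
            <pattern>com.google.common</pattern>
            <shadedPattern>shaded.com.google.common</shadedPattern>
        </relocation>
    </relocations>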

Here's my awesome explosion when I run:

    ./spark-submit --master local --class <my main class> <my shaded jar>

    Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.reflect.TypeToken.isPrimitive()Z
        at com.datastax.driver.core.TypeCodec.<init>(TypeCodec.java:142)
        at com.datastax.driver.core.TypeCodec.<init>(TypeCodec.java:136)
        at com.datastax.driver.core.TypeCodec$BlobCodec.<init>(TypeCodec.java:609)
        at com.datastax.driver.core.TypeCodec$BlobCodec.<clinit>(TypeCodec.java:606)
        at com.datastax.driver.core.CodecRegistry.<clinit>(CodecRegistry.java:147)
        at com.datastax.driver.core.Configuration$Builder.build(Configuration.java:259)
        at com.datastax.driver.core.Cluster$Builder.getConfiguration(Cluster.java:1135)
        at com.datastax.driver.core.Cluster.<init>(Cluster.java:111)
        at com.datastax.driver.core.Cluster.buildFrom(Cluster.java:178)
        at com.datastax.driver.core.Cluster$Builder.build(Cluster.java:1152)
        at com.datastax.spark.connector.cql.DefaultConnectionFactory$.createCluster(CassandraConnectionFactory.scala:85)
        at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:155)
Grisby answered 17/11, 2015 at 18:26

Fixed my dependency issue by explicitly including the guava jar I needed via /conf/spark-defaults.conf:

    spark.driver.extraClassPath /home/osboxes/Packages/guava-18.0.jar
    spark.executor.extraClassPath /home/osboxes/Packages/guava-18.0.jar
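
For a one-off run, the same settings can presumably be passed on the spark-submit command line instead of editing spark-defaults.conf, roughly:

    ./spark-submit --master local \
        --driver-class-path /home/osboxes/Packages/guava-18.0.jar \
        --conf spark.executor.extraClassPath=/home/osboxes/Packages/guava-18.0.jar \
        --class <my main class> <my shaded jar>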
Grisby answered 18/11, 2015 at 14:16
+1 Thanks, but after the fix I ran into a new error: Exception in thread "main" java.lang.IllegalAccessError: tried to access method com.google.common.collect.MapMaker.softValues()Lcom/google/common/collect/MapMaker; from class org.apache.spark.SparkEnv. Did you face this issue? – Veta
Much easier than shading the .jar. – Menedez
