Datastax Cassandra Driver throwing CodecNotFoundException

The exact exception is as follows:

com.datastax.driver.core.exceptions.CodecNotFoundException: Codec not found for requested operation: [varchar <-> java.math.BigDecimal]

These are the software versions I am using: Spark 1.5, DataStax Cassandra 3.2.1, CDH 5.5.1.

The code I am trying to execute is a Spark program using the Java API; it reads data (CSVs) from HDFS and loads it into Cassandra tables. I am using the spark-cassandra-connector. Initially I had a lot of issues with a Google Guava library conflict, which I was able to resolve by shading the Guava library and building a snapshot jar with all the dependencies.

However, I was able to load data for some files, while for other files I get the codec exception. When I researched this issue I found the following thread on the same problem:

https://groups.google.com/a/lists.datastax.com/forum/#!topic/java-driver-user/yZyaOQ-wazk

After going through this discussion, my understanding is that either I am using the wrong version of the cassandra-driver, or there is still a classpath issue related to the Guava library: Cassandra 3.0 and later versions use Guava 16.0.1, and the discussion suggests that a lower version of Guava might be present on the classpath.
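One way to check which Guava jar actually wins on the classpath is to ask the classloader where a well-known Guava class is loaded from; the jar path in the output usually reveals the version. This is a minimal, JDK-only diagnostic sketch (the class and resource names are just examples; note that after shading, the relocated package would be com.pointcross.shaded.google instead of com.google):

```java
public class GuavaLocator {
    /** Returns the classpath location of a resource, or null if it is not visible. */
    static String locate(String resourcePath) {
        java.net.URL url = GuavaLocator.class.getClassLoader().getResource(resourcePath);
        return url == null ? null : url.toString();
    }

    public static void main(String[] args) {
        // The printed jar path usually contains the Guava version, e.g. guava-16.0.1.jar.
        System.out.println(locate("com/google/common/base/Preconditions.class"));
    }
}
```

Running this inside the Spark job (rather than locally) shows which Guava the executors actually see.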

Here is my pom.xml file:

<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.5.0</version>
  </dependency>
  <dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>3.8.1</version>
    <scope>test</scope>
  </dependency>
  <dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector-java_2.10</artifactId>
    <version>1.5.0-M3</version>
  </dependency>
  <dependency>
    <groupId>org.apache.cassandra</groupId>
    <artifactId>cassandra-clientutil</artifactId>
    <version>3.2.1</version>
  </dependency>
</dependencies>
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>2.3</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
          <configuration>
            <filters>
              <filter>
                <artifact>*:*</artifact>
                <excludes>
                  <exclude>META-INF/*.SF</exclude>
                  <exclude>META-INF/*.DSA</exclude>
                  <exclude>META-INF/*.RSA</exclude>
                </excludes>
              </filter>
            </filters>
            <relocations>
              <relocation>
                <pattern>com.google</pattern>
                <shadedPattern>com.pointcross.shaded.google</shadedPattern>
              </relocation>
            </relocations>
            <minimizeJar>false</minimizeJar>
            <shadedArtifactAttached>true</shadedArtifactAttached>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
</project>

and these are the dependencies that were downloaded using the above pom:

spark-core_2.10-1.5.0.jar
spark-cassandra-connector-java_2.10-1.5.0-M3.jar
spark-cassandra-connector_2.10-1.5.0-M3.jar
spark-repl_2.10-1.5.1.jar
spark-bagel_2.10-1.5.1.jar
spark-mllib_2.10-1.5.1.jar
spark-streaming_2.10-1.5.1.jar
spark-graphx_2.10-1.5.1.jar
guava-16.0.1.jar
cassandra-clientutil-3.2.1.jar
cassandra-driver-core-3.0.0-alpha4.jar

Above are some of the main dependencies in my snapshot jar.

Why is the CodecNotFoundException thrown? Is it because of the classpath (Guava), the cassandra-driver (cassandra-driver-core-3.0.0-alpha4.jar for DataStax Cassandra 3.2.1), or the code itself?

Another point is that all the dates I am inserting go into columns whose data type is timestamp.

Also, when I do a spark-submit I see the classpath in the logs, and there are other Guava versions under the Hadoop libs. Are these causing the problem?

How do we specify a user-specific classpath when doing a spark-submit? Will that help?
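For reference, Spark 1.5 has (experimental) settings that make classes from the user's jar take precedence over the distribution's own jars, including Hadoop's bundled Guava. A sketch of how they might be passed on submission (the jar name is a placeholder; the main class is taken from the stack trace below):

```shell
spark-submit \
  --class com.cassandra.test.LoadDataToCassandra \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  my-snapshot-jar-with-dependencies.jar
```

Since the Guava conflict here was already resolved by shading, these flags are mainly useful as an alternative or for diagnosing whether the Hadoop-provided Guava is the culprit.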

I would be glad to get some pointers on these. Thanks.

Following is the stack trace:

com.datastax.driver.core.exceptions.CodecNotFoundException: Codec not found for requested operation: [timestamp <-> java.lang.String]
at com.datastax.driver.core.CodecRegistry.notFound(CodecRegistry.java:689)
at com.datastax.driver.core.CodecRegistry.createCodec(CodecRegistry.java:550)
at com.datastax.driver.core.CodecRegistry.findCodec(CodecRegistry.java:530)
at com.datastax.driver.core.CodecRegistry.codecFor(CodecRegistry.java:485)
at com.datastax.driver.core.AbstractGettableByIndexData.codecFor(AbstractGettableByIndexData.java:85)
at com.datastax.driver.core.BoundStatement.bind(BoundStatement.java:198)
at com.datastax.driver.core.DefaultPreparedStatement.bind(DefaultPreparedStatement.java:126)
at com.cassandra.test.LoadDataToCassandra$1.call(LoadDataToCassandra.java:223)
at com.cassandra.test.LoadDataToCassandra$1.call(LoadDataToCassandra.java:1)
at org.apache.spark.api.java.JavaPairRDD$$anonfun$toScalaFunction$1.apply(JavaPairRDD.scala:1027)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1555)
at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1121)
at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1121)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1850)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1850)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
at org.apache.spark.scheduler.Task.run(Task.scala:88)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

I also got

com.datastax.driver.core.exceptions.CodecNotFoundException: Codec not found for requested operation: [Math.BigDecimal <-> java.lang.String]
Schear answered 2/6, 2016 at 10:5 Comment(2)
Can you share the stack trace? – Poussin
I have shared the stack trace. – Schear

When you call bind(params...) on a PreparedStatement, the driver expects you to provide values with Java types that map to the CQL types.

This error ([timestamp <-> java.lang.String]) is telling you that there is no codec registered that maps the Java String to a CQL timestamp. In the Java driver, the timestamp type maps to java.util.Date. So you have two options here:

  1. Where the column being bound is for a timestamp, provide a Date-typed value instead of a String.
  2. Create a codec that maps timestamp <-> String. To do so you could create a subclass of MappingCodec, as described on the documentation site, that maps String to timestamp:
// Requires driver-extras (MappingCodec) on the classpath.
public class TimestampAsStringCodec extends MappingCodec<String, Date> {
    // The pattern is an example; match it to your CSV date format.
    private static final String PATTERN = "yyyy-MM-dd HH:mm:ss";

    public TimestampAsStringCodec() { super(TypeCodec.timestamp(), String.class); }

    @Override
    protected Date serialize(String value) {
        try { return new SimpleDateFormat(PATTERN).parse(value); }
        catch (ParseException e) { throw new InvalidTypeException("Cannot parse timestamp: " + value, e); }
    }

    @Override
    protected String deserialize(Date value) { return new SimpleDateFormat(PATTERN).format(value); }
}

You then would need to register the Codec:

cluster.getConfiguration().getCodecRegistry()
    .register(new TimestampAsStringCodec());
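Option 1 boils down to converting the CSV string to a java.util.Date before calling bind(). A minimal, JDK-only sketch of that conversion (the date pattern is an assumption about the CSV format; the helper name is hypothetical):

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class DateConversion {
    /** Parses a CSV timestamp field into the java.util.Date the driver expects. */
    static Date toDate(String csvValue) {
        try {
            return new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse(csvValue);
        } catch (ParseException e) {
            throw new IllegalArgumentException("Cannot parse timestamp: " + csvValue, e);
        }
    }

    public static void main(String[] args) {
        Date d = toDate("2016-06-02 10:05:00");
        // boundStatement.bind(d, ...) would now resolve the built-in timestamp codec.
        System.out.println(d);
    }
}
```

With this approach no custom codec registration is needed at all.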
Poussin answered 6/6, 2016 at 13:41 Comment(4)
Thanks, I provide a Date-typed value now. One more thing: is cassandra-driver-core-3.0.0-alpha4.jar stable enough? When I try other drivers like cassandra-driver-core-3.0.0 or 3.0.1, I get a DEFAULT_SSL_CIPHER_SUITES error. – Schear
As far as I recall, there weren't any major bugs in alpha4, but that being said, it is an alpha, and using a GA released version (3.0.2 preferably) would be a better choice. Can you create a new question with your DEFAULT_SSL_CIPHER_SUITES error? One change made in 3.0 was to not require cipher suites to be explicitly stated; I'm guessing it is something to do with that. – Poussin
I have my Cassandra table column type as timeuuid. Can you suggest what the data type should be in the DataStax Mapper class? – Dulcle
@VinodJayachandran You would extend MappingCodec<String, UUID> and call super(TypeCodec.timeUUID(), String.class); in the constructor. – Whorton

A better solution is provided here.

The correct mappings that the driver offers out of the box for temporal types are:

    DATE      <-> com.datastax.driver.core.LocalDate : use getDate()
Thermoelectricity answered 18/10, 2016 at 6:41 Comment(0)
