UnsatisfiedLinkError on librocksdbjni DLL when developing with Kafka Streams
I'm writing a Kafka Streams application on my Windows development machine. If I try to use the leftJoin and branch features of Kafka Streams, I get the error below when running the jar application:

Exception in thread "StreamThread-1" java.lang.UnsatisfiedLinkError: C:\Users\user\AppData\Local\Temp\librocksdbjni325337723194862275.dll: Can't find dependent libraries
    at java.lang.ClassLoader$NativeLibrary.load(Native Method)
    at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1941)
    at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1824)
    at java.lang.Runtime.load0(Runtime.java:809)
    at java.lang.System.load(System.java:1086)
    at org.rocksdb.NativeLibraryLoader.loadLibraryFromJar(NativeLibraryLoader.java:78)
    at org.rocksdb.NativeLibraryLoader.loadLibrary(NativeLibraryLoader.java:56)
    at org.rocksdb.RocksDB.loadLibrary(RocksDB.java:64)
    at org.rocksdb.RocksDB.<clinit>(RocksDB.java:35)
    at org.rocksdb.Options.<clinit>(Options.java:22)
    at org.apache.kafka.streams.state.internals.RocksDBStore.openDB(RocksDBStore.java:115)
    at org.apache.kafka.streams.state.internals.Segment.openDB(Segment.java:38)
    at org.apache.kafka.streams.state.internals.Segments.getOrCreateSegment(Segments.java:75)
    at org.apache.kafka.streams.state.internals.RocksDBSegmentedBytesStore.put(RocksDBSegmentedBytesStore.java:72)
    at org.apache.kafka.streams.state.internals.ChangeLoggingSegmentedBytesStore.put(ChangeLoggingSegmentedBytesStore.java:54)
    at org.apache.kafka.streams.state.internals.MeteredSegmentedBytesStore.put(MeteredSegmentedBytesStore.java:101)
    at org.apache.kafka.streams.state.internals.RocksDBWindowStore.put(RocksDBWindowStore.java:109)
    at org.apache.kafka.streams.state.internals.RocksDBWindowStore.put(RocksDBWindowStore.java:101)
    at org.apache.kafka.streams.kstream.internals.KStreamJoinWindow$KStreamJoinWindowProcessor.process(KStreamJoinWindow.java:65)
    at org.apache.kafka.streams.processor.internals.ProcessorNode$1.run(ProcessorNode.java:48)
    at org.apache.kafka.streams.processor.internals.StreamsMetricsImpl.measureLatencyNs(StreamsMetricsImpl.java:188)
    at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:134)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:83)
    at org.apache.kafka.streams.kstream.internals.KStreamFlatMapValues$KStreamFlatMapValuesProcessor.process(KStreamFlatMapValues.java:43)
    at org.apache.kafka.streams.processor.internals.ProcessorNode$1.run(ProcessorNode.java:48)
    at org.apache.kafka.streams.processor.internals.StreamsMetricsImpl.measureLatencyNs(StreamsMetricsImpl.java:188)
    at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:134)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:83)
    at org.apache.kafka.streams.kstream.internals.KStreamFilter$KStreamFilterProcessor.process(KStreamFilter.java:44)
    at org.apache.kafka.streams.processor.internals.ProcessorNode$1.run(ProcessorNode.java:48)
    at org.apache.kafka.streams.processor.internals.StreamsMetricsImpl.measureLatencyNs(StreamsMetricsImpl.java:188)
    at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:134)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:83)
    at org.apache.kafka.streams.processor.internals.SourceNode.process(SourceNode.java:70)
    at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:197)
    at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:641)
    at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:368)

It seems that Kafka cannot find a DLL, but wait... I'm developing a Java application!

What could the problem be? And why doesn't this error show up when I perform simpler streaming operations, such as a plain filter?

UPDATE:

This problem arises only when a message is present in the broker. I'm using Kafka Streams version 0.10.2.1.

This is the piece of code that triggers the problem:

// Imports added for completeness; the two Confluent imports are an
// assumption and depend on which Confluent artifacts you use.
import java.util.Properties;
import java.util.concurrent.TimeUnit;

import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.JoinWindows;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KStreamBuilder;

import io.confluent.kafka.serializers.AbstractKafkaAvroSerDeConfig;
import io.confluent.kafka.streams.serdes.avro.GenericAvroSerde;

public class KafkaStreamsMainClass {

    private KafkaStreamsMainClass() {
    }

    public static void main(final String[] args) throws Exception {
        Properties streamsConfiguration = new Properties();
        streamsConfiguration.put(StreamsConfig.APPLICATION_ID_CONFIG, "kafka-streams");
        streamsConfiguration.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka-server:9092");
        streamsConfiguration.put(AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, "schema-registry:8082");
        streamsConfiguration.put(StreamsConfig.COMMIT_INTERVAL_MS_CONFIG, 10 * 1000);
        streamsConfiguration.put(StreamsConfig.CACHE_MAX_BYTES_BUFFERING_CONFIG, 0);
        streamsConfiguration.put(StreamsConfig.KEY_SERDE_CLASS_CONFIG, GenericAvroSerde.class);
        streamsConfiguration.put(StreamsConfig.VALUE_SERDE_CLASS_CONFIG, GenericAvroSerde.class);
        KStreamBuilder builder = new KStreamBuilder();
        KStream<GenericRecord, GenericRecord> sourceStream = builder.stream(SOURCE_TOPIC);

        KStream<GenericRecord, GenericRecord> finishedFiltered = sourceStream
                .filter((GenericRecord key, GenericRecord value) -> value.get("endTime") != null);

        KStream<GenericRecord, GenericRecord>[] branchedStreams = sourceStream
                .filter((GenericRecord key, GenericRecord value) -> value.get("endTime") == null)
                .branch((GenericRecord key, GenericRecord value) -> value.get("firstField") != null,
                        (GenericRecord key, GenericRecord value) -> value.get("secondField") != null);

        branchedStreams[0] = finishedFiltered.join(branchedStreams[0],
                (GenericRecord value1, GenericRecord value2) -> {
                    return value1;
                }, JoinWindows.of(TimeUnit.SECONDS.toMillis(2)));

        branchedStreams[1] = finishedFiltered.join(branchedStreams[1],
                (GenericRecord value1, GenericRecord value2) -> {
                    return value1;
                }, JoinWindows.of(TimeUnit.SECONDS.toMillis(2)));

        KafkaStreams streams = new KafkaStreams(builder, streamsConfiguration);
        streams.setUncaughtExceptionHandler((Thread thread, Throwable throwable) -> {
            throwable.printStackTrace();
        });
        streams.start();

        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }

}

I opened the rocksdbjni-5.0.1.jar archive downloaded by Maven and it does include the librocksdbjni-win64.dll library. It seems that the loader is trying to resolve the library from outside the JAR instead of from the copy bundled inside it.

I'm developing on a Windows 7 machine.

Have you ever experienced this problem?

Linettelineup answered 2/5, 2017 at 16:1 Comment(8)
This is odd. For reference, I ran the mvn test suite of Confluent's Kafka Streams demo apps (github.com/confluentinc/examples) on Windows 10 w/ Oracle JDK 1.8 yesterday, and (with one exception due to a Kafka broker bug on Windows = unrelated to Kafka Streams) everything worked out of the box. Perhaps you can provide more detail about your environment (Windows version, Java version, etc.), the exact version of Kafka Streams you are using, and the code so that it's easier to reproduce?Kaffiyeh
I think I found the problem. In my local Maven repository I had two versions of RocksDB, version 4.4.1 and 5.0.1 (the latter being the one used by Kafka Streams 0.10.2, which I'm on). I deleted the 4.4.1 version and the problem went away. The strange thing is that Maven was using the old version of the library.Linettelineup
Nope, the problem is there again. It was not showing up because I had no messages in the broker (they were deleted by the Kafka deletion job). I will update my question with the required information.Linettelineup
See also #41292496Shotgun
@Linettelineup - is this issue resolved? I am getting exactly the same issue. The strange thing is that the same setup works on another, similar machine.Flosser
Hi @Mudit. Nope; when I need to develop on Kafka Streams I use a Linux machine and everything worksLinettelineup
It's very strange that Kafka still does not support Windows! Have you also tried Cygwin?Flosser
No, but the problem is the compilation with Maven in a Windows environment; I don't think Cygwin would do the trick hereLinettelineup

I updated my kafka-streams project to the latest released version 1.0.0.

This version suffers from this bug, but after patching it and uploading the patched version to our internal Artifactory server we were able to run our kafka-streams agent on both Windows and Linux. The upcoming 1.0.1 and 1.1.0 versions will contain the bug fix, so as soon as one of them is released we will switch to it instead of the patched version.

To sum up, the Kafka developers fixed this bug after the 1.0.0 release.

Linettelineup answered 19/1, 2018 at 11:38 Comment(0)

Recently I ran into this problem too. I managed to solve it in two steps:

  1. Delete all librocksdbjni[...].dll files from C:\Users\[your_user]\AppData\Local\Temp folder.
  2. Add the Maven dependency for RocksDB to your project; this works for me: https://mvnrepository.com/artifact/org.rocksdb/rocksdbjni/5.0.1

Compile your Kafka Streams app and run it. It should work!
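For step 2, the dependency would look roughly like this in your pom.xml (version 5.0.1 matches the artifact linked above; adjust it to whatever your Kafka Streams version expects):

```xml
<dependency>
    <groupId>org.rocksdb</groupId>
    <artifactId>rocksdbjni</artifactId>
    <version>5.0.1</version>
</dependency>
```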

Tayyebeb answered 8/1, 2018 at 10:51 Comment(7)
Hi @David, thanks for the response. Does this solution also work in a Linux development environment, or does it mess up the classpath there?Linettelineup
Hello @gvdm, I'm sorry, but I don't know, because I currently only develop on Windows.Tayyebeb
Ok @David, I will try as soon as I can and let you knowLinettelineup
Hi @David. I solved the problem by upgrading to 1.0.0; more details in answer #43742923. Thank you for your support.Linettelineup
Thank you very much; I was using Windows and upgrading the version did not help.Augment
Why does 5.0.1 work fine, while 5.7.3, pulled in by the Spring Boot 2.1 dependency-management plugin, doesn't?Raster
How do I fix it on Mac?Fantastic

My problem was permissions on the /tmp/ directory (CentOS).

RocksDB internally uses the java.io.tmpdir system property to decide where to place the librocksdbjni file, usually something like /tmp/librocksdbjni2925599838907625983.so.

I solved it by setting a different temp-dir property, with appropriate permissions, in the kafka-streams app:

System.setProperty("java.io.tmpdir", "/opt/kafka-streams/tmp");
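A slightly fuller sketch of this fix (the directory name below is a hypothetical example, not from the original answer); the property must be set before any RocksDB class is loaded, i.e. before the streams topology starts:

```java
import java.io.File;

public class TmpDirConfig {
    public static void main(String[] args) {
        // RocksDB extracts librocksdbjni into java.io.tmpdir at class-load
        // time, so repoint it before any Kafka Streams / RocksDB code runs.
        File tmp = new File(System.getProperty("user.home"), "kafka-streams-tmp");
        tmp.mkdirs(); // ensure the directory exists and is writable
        System.setProperty("java.io.tmpdir", tmp.getAbsolutePath());
        System.out.println("java.io.tmpdir = " + System.getProperty("java.io.tmpdir"));
        // ... build and start the KafkaStreams instance after this point
    }
}
```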
Soever answered 20/1, 2020 at 10:25 Comment(0)

You are missing some native libraries that the rocksdb dll depends on. See https://github.com/facebook/rocksdb/issues/1302

Thinia answered 2/5, 2017 at 19:53 Comment(2)
Yes, that much is obvious, but why is a Maven-built Java console application trying to use an external DLL? It should only use what is on the classpath!Linettelineup
Hi @Nicholas. I solved the problem by upgrading to 1.0.0; more details in answer #43742923. Thank you for your support.Linettelineup

I had the same issue while using JDK 1.8. It was resolved when I switched to the JRE.

Foch answered 20/5, 2020 at 7:48 Comment(0)

I faced a similar issue on Mac. According to https://github.com/facebook/rocksdb/issues/5064, the issue is related to the older libc installed in my version of Mac OS (10.11.6).

Strabismus answered 6/8, 2020 at 12:1 Comment(0)

I had this issue when using a distroless JRE image for my KStreams container application.

I switched from mcr.microsoft.com/openjdk/jdk:11-distroless to mcr.microsoft.com/openjdk/jdk:11-ubuntu and the error went away.
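A sketch of that image switch in a Dockerfile (the jar name app.jar is a hypothetical example); distroless images omit the extra native shared libraries that the extracted librocksdbjni .so depends on, while the Ubuntu-based image ships them:

```dockerfile
# Distroless base lacks the native libraries rocksdbjni needs:
# FROM mcr.microsoft.com/openjdk/jdk:11-distroless
FROM mcr.microsoft.com/openjdk/jdk:11-ubuntu

# app.jar is a placeholder for your Kafka Streams fat jar
COPY target/app.jar /app/app.jar
ENTRYPOINT ["java", "-jar", "/app/app.jar"]
```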

Wallford answered 4/10, 2023 at 15:34 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.