UnsatisfiedLinkError: /tmp/snappy-1.1.4-libsnappyjava.so Error loading shared library ld-linux-x86-64.so.2: No such file or directory
I am trying to run a Kafka Streams application in Kubernetes. When I launch the pod I get the following exception:

Exception in thread "streams-pipe-e19c2d9a-d403-4944-8d26-0ef27ed5c057-StreamThread-1"
java.lang.UnsatisfiedLinkError: /tmp/snappy-1.1.4-5cec5405-2ce7-4046-a8bd-922ce96534a0-libsnappyjava.so: 
Error loading shared library ld-linux-x86-64.so.2: No such file or directory 
(needed by /tmp/snappy-1.1.4-5cec5405-2ce7-4046-a8bd-922ce96534a0-libsnappyjava.so)
        at java.lang.ClassLoader$NativeLibrary.load(Native Method)
        at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1941)
        at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1824)
        at java.lang.Runtime.load0(Runtime.java:809)
        at java.lang.System.load(System.java:1086)
        at org.xerial.snappy.SnappyLoader.loadNativeLibrary(SnappyLoader.java:179)
        at org.xerial.snappy.SnappyLoader.loadSnappyApi(SnappyLoader.java:154)
        at org.xerial.snappy.Snappy.<clinit>(Snappy.java:47)
        at org.xerial.snappy.SnappyInputStream.hasNextChunk(SnappyInputStream.java:435)
        at org.xerial.snappy.SnappyInputStream.read(SnappyInputStream.java:466)
        at java.io.DataInputStream.readByte(DataInputStream.java:265)
        at org.apache.kafka.common.utils.ByteUtils.readVarint(ByteUtils.java:168)
        at org.apache.kafka.common.record.DefaultRecord.readFrom(DefaultRecord.java:292)
        at org.apache.kafka.common.record.DefaultRecordBatch$1.readNext(DefaultRecordBatch.java:264)
        at org.apache.kafka.common.record.DefaultRecordBatch$RecordIterator.next(DefaultRecordBatch.java:563)
        at org.apache.kafka.common.record.DefaultRecordBatch$RecordIterator.next(DefaultRecordBatch.java:532)
        at org.apache.kafka.clients.consumer.internals.Fetcher$PartitionRecords.nextFetchedRecord(Fetcher.java:1060)
        at org.apache.kafka.clients.consumer.internals.Fetcher$PartitionRecords.fetchRecords(Fetcher.java:1095)
        at org.apache.kafka.clients.consumer.internals.Fetcher$PartitionRecords.access$1200(Fetcher.java:949)
        at org.apache.kafka.clients.consumer.internals.Fetcher.fetchRecords(Fetcher.java:570)
        at org.apache.kafka.clients.consumer.internals.Fetcher.fetchedRecords(Fetcher.java:531)
        at org.apache.kafka.clients.consumer.KafkaConsumer.pollOnce(KafkaConsumer.java:1146)
        at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1103)
        at org.apache.kafka.streams.processor.internals.StreamThread.pollRequests(StreamThread.java:851)
        at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:808)
        at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:774)
        at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:744)

Previously I tried launching Kafka and the Kafka Streams app in plain Docker containers and they worked perfectly fine. This is the first time I am trying Kubernetes.

This is the Dockerfile for my streams app:

FROM openjdk:8u151-jdk-alpine3.7

COPY /target/streams-examples-0.1.jar /streamsApp/

COPY /target/libs /streamsApp/libs

CMD ["java", "-jar", "/streamsApp/streams-examples-0.1.jar"]

What can I do to get past this issue? Kindly help me out.

EDIT:

/ # ldd /usr/bin/java 
    /lib/ld-musl-x86_64.so.1 (0x7f03f279a000)
Error loading shared library libjli.so: No such file or directory (needed by /usr/bin/java)
    libc.musl-x86_64.so.1 => /lib/ld-musl-x86_64.so.1 (0x7f03f279a000)
Error relocating /usr/bin/java: JLI_Launch: symbol not found
Linstock answered 11/5, 2018 at 8:19 Comment(3)
I think it's a CPU architecture problem. Did you check whether the OS and Java are the same architecture, i.e. both 32-bit or both 64-bit? – Bumpkin
Could you please provide the output of ldd /usr/bin/java (executed inside this Docker container) for further diagnosis? – Dithyramb
@Dithyramb I have edited the question. – Linstock
The error message states that libsnappyjava.so cannot find ld-linux-x86-64.so.2. That file is the glibc dynamic loader, and Alpine images don't ship glibc (they use musl). You may be able to get it running by installing the libc6-compat package in your Dockerfile, e.g.:

RUN apk update && apk add --no-cache libc6-compat
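
Applied to the Dockerfile from the question, that would look something like the sketch below (the jar paths are the asker's; only the apk line is new):

```dockerfile
FROM openjdk:8u151-jdk-alpine3.7

# glibc compatibility layer so glibc-linked native libs (like snappy's) can load
RUN apk update && apk add --no-cache libc6-compat

COPY /target/streams-examples-0.1.jar /streamsApp/
COPY /target/libs /streamsApp/libs

CMD ["java", "-jar", "/streamsApp/streams-examples-0.1.jar"]
```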
Footsie answered 2/8, 2018 at 14:5 Comment(1)
This didn't solve the problem; I still had the UnsatisfiedLinkError. The solution mentioned by @Bebel solved it. – Disadvantageous
In my case, installing the missing libc6-compat didn't work. The application still threw java.lang.UnsatisfiedLinkError.

Then I found that in the Docker image /lib64/ld-linux-x86-64.so.2 exists and is a link to /lib/libc.musl-x86_64.so.1, but /lib contains only ld-musl-x86_64.so.1, not ld-linux-x86-64.so.2.

So I added a file named ld-linux-x86-64.so.2 in the /lib dir, linked to ld-musl-x86_64.so.1, and that solved the problem.
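
The effect of that fix can be sketched in plain shell. The snippet below uses a temporary directory and a stand-in file instead of the real /lib, so it is safe to run anywhere; the file names mirror the ones in the container:

```shell
# Sketch: expose the glibc loader name as a symlink to musl's loader.
tmp=$(mktemp -d)
touch "$tmp/ld-musl-x86_64.so.1"   # stand-in for musl's real loader
ln -s "$tmp/ld-musl-x86_64.so.1" "$tmp/ld-linux-x86-64.so.2"
# Lookups for the glibc loader name now resolve to the musl loader:
readlink "$tmp/ld-linux-x86-64.so.2"
```

In the container the same idea is one line: ln -s /lib/libc.musl-x86_64.so.1 /lib/ld-linux-x86-64.so.2.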

The Dockerfile I use:

FROM openjdk:8-jre-alpine
COPY entrypoint.sh /entrypoint.sh
RUN apk update && \
  apk add --no-cache libc6-compat && \
  ln -s /lib/libc.musl-x86_64.so.1 /lib/ld-linux-x86-64.so.2 && \
  mkdir /app && \
  chmod a+x /entrypoint.sh
COPY build/libs/*.jar /app
ENTRYPOINT ["/entrypoint.sh"]

In conclusion:

RUN apk update && \
  apk add --no-cache libc6-compat && \
  ln -s /lib/libc.musl-x86_64.so.1 /lib/ld-linux-x86-64.so.2
Peraea answered 8/4, 2019 at 7:37 Comment(2)
It should be sufficient to install gcompat instead of libc6-compat, which provides /lib/ld-linux-x86-64.so.2: pkgs.alpinelinux.org/… – Bebel
Saved my life, thanks. The library was already there; I just added the symbolic link. – Guttle
There are two solutions to this problem:

  1. Use a different base image that ships glibc. For example, openjdk:8-jre-slim (Debian-based) works fine for me.

  2. Keep openjdk:8-jdk-alpine as the base image, but install the gcompat glibc compatibility layer so the bundled snappy native library can load:

FROM openjdk:8-jdk-alpine
RUN apk update && apk add --no-cache gcompat
...
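
For the first option, only the base image changes; a minimal sketch (the jar path is illustrative, not from the original answer):

```dockerfile
# Debian-based image: ships glibc, so snappy's bundled .so loads as-is
FROM openjdk:8-jre-slim
COPY target/app.jar /app/app.jar
CMD ["java", "-jar", "/app/app.jar"]
```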
Houseraising answered 23/9, 2019 at 14:55 Comment(2)
Your suggestion to use slim actually worked for me. Thanks a lot! – Language
The second solution also works if your base image is openjdk:8-alpine. – Impeditive
In Docker images with an Alpine base:

RUN apk update && apk add --no-cache libc6-compat gcompat

saved my life.

Rosenkrantz answered 9/3, 2021 at 3:15 Comment(0)
If you are adding the Dockerfile through build.sbt, the correct way to do it is:

dockerfile in docker := {
  val artifact: File = assembly.value
  val artifactTargetPath = s"/app/${artifact.name}"

  new Dockerfile {
    from("openjdk:8-jre-alpine")
    copy(artifact, artifactTargetPath)
    run("apk", "add", "--no-cache", "gcompat")
    entryPoint("java", "-jar", artifactTargetPath)
  }
}

Installing gcompat will serve your purpose.

Gust answered 7/1, 2021 at 4:53 Comment(0)
FROM openjdk:19-jdk-alpine
RUN apk update && apk add --no-cache gcompat
VOLUME /tmp
#ARG JAR_FILE
COPY target/*.jar app.jar
ENTRYPOINT ["java","-jar","/app.jar"]

Running RUN apk update && apk add --no-cache gcompat saved my life

Neuromuscular answered 22/2 at 11:29 Comment(0)
It seems strange, but it looks like the Docker image you use, openjdk:8u151-jdk-alpine3.7, is inconsistent and some dynamically loaded objects are not included in the package. You could try running "ldconfig -v" in this image to update the map of shared objects, or check /etc/ld.so.conf for the paths where the OS looks for .so objects. Please consider using another Docker image providing a java binary if you don't want to lose time debugging this. Last but not least, ask for a remedy on the Alpine forum.

Dithyramb answered 11/5, 2018 at 12:47 Comment(2)
Thanks for the answer. One more thing: could there be something wrong with Kafka's Docker image? – Linstock
It is hard to tell; it requires additional investigation to find where the problem is. You could try a Docker image from another developer to get Kafka working. – Dithyramb
I have implemented a Docker image with which I run a Spring Boot microservice with a Kafka Streams topology, working perfectly.

Here I share the Dockerfile.

FROM openjdk:8-jdk-alpine
# Add Maintainer Info
LABEL description="Spring Boot Kafka Stream IoT Processor"
# Args for image
ARG PORT=8080

RUN apk update && apk upgrade && apk add --no-cache gcompat
RUN ln -s /bin/bash /usr/bin
RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app


COPY resources/wait-for-it.sh  wait-for-it.sh
COPY target/iot_processor.jar app.jar

RUN dos2unix wait-for-it.sh
RUN chmod +x wait-for-it.sh
RUN uname -a
RUN pwd
RUN ls -al

EXPOSE ${PORT}

CMD ["sh", "-c", "echo 'waiting for 300 seconds for kafka:9092 to be accessible before starting application' && ./wait-for-it.sh -t 300 kafka:9092 -- java -jar app.jar"]

Hope it can help someone.

Kaciekacy answered 11/9, 2020 at 17:51 Comment(0)
I didn't need to add libc6-compat in my Dockerfile, because /lib/libc.musl-x86_64.so.1 already existed in my container.

In the Dockerfile I added only:

RUN ln -s /lib/libc.musl-x86_64.so.1 /lib/ld-linux-x86-64.so.2

My container no longer gets this error when consuming snappy-compressed messages:

Exception in thread "streams-pipe-e19c2d9a-d403-4944-8d26-0ef27ed5c057-StreamThread-1"
java.lang.UnsatisfiedLinkError: /tmp/snappy-1.1.4-5cec5405-2ce7-4046-a8bd-922ce96534a0-libsnappyjava.so:
Error loading shared library ld-linux-x86-64.so.2: No such file or directory
(needed by /tmp/snappy-1.1.4-5cec5405-2ce7-4046-a8bd-922ce96534a0-libsnappyjava.so)
    at java.lang.ClassLoader$NativeLibrary.load(Native Method)
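
In Dockerfile form, that single-line fix might look like the sketch below (assuming an Alpine base image where /lib/libc.musl-x86_64.so.1 already exists):

```dockerfile
FROM openjdk:8-jre-alpine
# Alias the glibc loader name to musl's loader; no extra packages needed
RUN ln -s /lib/libc.musl-x86_64.so.1 /lib/ld-linux-x86-64.so.2
```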
Adina answered 28/5, 2021 at 15:43 Comment(0)
