How to use JNI in Spark?
I want to use JNI to call my C++ library from Spark. When I sbt run my program, it throws java.lang.UnsatisfiedLinkError: no hq_Image_Process in java.library.path, so obviously the program cannot find my hq_Image_Process.so.

In Hadoop, the -files option can distribute an xxx.so file to the slave nodes like this:

[hadoop@Master ~]$ hadoop jar JniTest3.jar -files /home/hadoop/Documents/java/jni1/bin/libFakeSegmentForJni.so FakeSegmentForJni.TestFakeSegmentForJni input output

Is there a way to distribute and call my hq_Image_Process.so in Spark, the way -files does in Hadoop? I would appreciate any help.
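As a side note on the error message itself: System.loadLibrary follows a platform naming rule, so on Linux the JVM searches java.library.path for libhq_Image_Process.so, with the "lib" prefix, rather than hq_Image_Process.so. A minimal, self-contained check of that mapping (the class name here is made up for illustration):

```java
// Minimal sketch of the JNI library-naming rule; the class name is hypothetical.
// System.mapLibraryName returns the exact file name that
// System.loadLibrary("hq_Image_Process") will search for on java.library.path
// (on Linux this is "libhq_Image_Process.so").
public class LibNameCheck {
    public static void main(String[] args) {
        System.out.println(System.mapLibraryName("hq_Image_Process"));
    }
}
```

If the shared object on disk is literally named hq_Image_Process.so, renaming it to match the mapped name is worth checking before debugging the distribution mechanism.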

Marigolda answered 14/1, 2014 at 5:18 Comment(0)
First of all, the native library must be preinstalled on all worker nodes, and the path to that library must be specified in spark-env.sh:

export SPARK_LIBRARY_PATH=/path/to/native/library

The SPARK_PRINT_LAUNCH_COMMAND environment variable can be used to diagnose it:

export SPARK_PRINT_LAUNCH_COMMAND=1

If everything is set correctly, you will see output like this:

Spark Command:
/path/to/java -cp <long list of jars> -Djava.library.path=/path/to/native/library <etc>
========================================
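Putting the two variables together, a launch sketch (the library path, application class, and jar name here are placeholders, not from the original question):

```shell
# Sketch only: the directory, class, and jar names are assumptions.
export SPARK_LIBRARY_PATH=/usr/local/native   # directory holding libhq_Image_Process.so on every worker
export SPARK_PRINT_LAUNCH_COMMAND=1           # print the final java command before launching
./bin/spark-submit --class example.MyApp my-app.jar
# In the printed command, verify that -Djava.library.path=... includes the directory above.
```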
Bevvy answered 15/1, 2014 at 15:10 Comment(3)
hi wildfire, I distributed my libxxx.so to /usr/lib on every node. It seems everything is ok now. Thank you! – Marigolda
Useful info if I need to distribute a model again. – Elation
Is there any way to preinstall the library on all nodes automatically? – Cle
The solution in the accepted answer was for older (pre-1.0) Spark versions.

You need to set the following properties (either or both) in your spark-defaults.conf:

spark.driver.extraLibraryPath   /path/to/native/library
spark.executor.extraLibraryPath /path/to/native/library

The property keys are documented in the Configuration section of the Spark docs.
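The same settings can also be supplied per job on the spark-submit command line instead of editing spark-defaults.conf; a sketch with placeholder class and jar names:

```shell
# One-off equivalent of the spark-defaults.conf entries above;
# the application class and jar are placeholders.
spark-submit \
  --conf spark.driver.extraLibraryPath=/path/to/native/library \
  --conf spark.executor.extraLibraryPath=/path/to/native/library \
  --class example.MyApp my-app.jar
```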

Election answered 25/3, 2021 at 9:37 Comment(0)
