java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0

I cannot solve this exception. I've read the Hadoop docs and all related Stack Overflow questions that I could find.

My fileSystem.mkdirs(***) throws:

Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Ljava/lang/String;I)V
    at org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Native Method)
    at org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode(NativeIO.java:524)
    at org.apache.hadoop.fs.RawLocalFileSystem.mkOneDirWithMode(RawLocalFileSystem.java:465)
    at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:518)
    at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:496)
    at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:316)
...

I am including the following dependencies in my app (via maven pom.xml), all in version 2.6.0-cdh5.13.0: hadoop-common, hadoop-hdfs, hadoop-client, hadoop-minicluster

My filesystem variable is a valid (hadoop-common) FileSystem (org.apache.hadoop.fs.FileSystem).
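For reference, a minimal sketch of the kind of call that triggers it; the class name and target path here are placeholders, not my actual code:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class MkdirsRepro {
    public static void main(String[] args) throws Exception {
        // With no core-site.xml on the classpath this resolves to the local
        // file system, whose mkdirs() calls into NativeIO$Windows on Windows
        FileSystem fs = FileSystem.get(new Configuration());
        fs.mkdirs(new Path("C:/Temp/test"));
        fs.close();
    }
}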

I downloaded the Hadoop binaries from https://github.com/steveloughran/winutils/tree/master/hadoop-2.6.0/bin. I stored winutils.exe and all the other files from version 2.6.0 on my local file system under C:\Temp\hadoop\bin. I set the environment variable HADOOP_HOME to C:\Temp\hadoop (yes, not the path to the bin directory).

The fallback ("using builtin-java classes") is not triggered; instead I am getting:

145 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader  - Trying to load the custom-built native-hadoop library...
147 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader  - Loaded the native-hadoop library

(See https://hadoop.apache.org/docs/r2.6.0/hadoop-project-dist/hadoop-common/NativeLibraries.html)

I understand that this exception can be caused by a Hadoop version mismatch, but I checked that the imported Hadoop matches the Hadoop I stored locally, version-wise.
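A quick way to double-check which hadoop-common jar the JVM actually loads is VersionInfo, which ships with hadoop-common; a minimal sketch (class name is mine):

import org.apache.hadoop.util.VersionInfo;

public class PrintHadoopVersion {
    public static void main(String[] args) {
        // Prints the version of the hadoop-common jar on the classpath,
        // e.g. "2.6.0-cdh5.13.0"
        System.out.println(VersionInfo.getVersion());
    }
}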

I am working on a Windows 10 x64 system, in IntelliJ.

Does anybody have an idea what I could check, or even what I am doing wrong?

UPDATE: I run my main class with the following VM options:

-Dhadoop.home.dir=C:/Temp/hadoop
-Djava.library.path=C:/Temp/hadoop/bin

Without specifying the lib path, I get:

org.apache.hadoop.util.NativeCodeLoader  - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
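For completeness, hadoop.home.dir can also be set programmatically instead of via the VM option, as long as it happens before the first Hadoop class initializes; a minimal sketch using my test path:

public class Main {
    public static void main(String[] args) {
        // Equivalent to -Dhadoop.home.dir=C:/Temp/hadoop; must run before
        // any Hadoop class is loaded
        System.setProperty("hadoop.home.dir", "C:/Temp/hadoop");
        // ... then create the Configuration / FileSystem as usual ...
    }
}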
Askance answered 11/7, 2018 at 9:42 Comment(3)
Halberd: The issue is with the version of the native library, i.e. a .DLL file. Perhaps a different version is somewhere in one of the path entries (PATH or current directory).
Askance: Thanks Holger, I checked, but there is not.
Halberd: Hmm, but your solution suggests that precisely that was the issue, as setting instead of appending means the library is not found in one of the other, preceding path components. You could use Process Monitor to find out which library a process is actually loading.

The reason for this exception was:

I am importing 2.6.0-cdh5.13.0 via my Maven POM, but I downloaded the pre-built files in version 2.6.0. Those are missing the changes made in the cdh5.13.0 variant (CDH is Cloudera's platform that includes the Hadoop ecosystem). Hence, the versions are indeed in conflict.

If I import hadoop-common, hadoop-hdfs, and hadoop-client as version 2.6.0 instead of 2.6.0-cdh5.13.0, the exception disappears (and I don't even need to set the VM options).
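For illustration, the dependency entries then look like this (hadoop-hdfs analogous; the org.apache.hadoop group id applies to both the plain and the CDH artifacts):

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.6.0</version> <!-- instead of 2.6.0-cdh5.13.0 -->
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.6.0</version>
</dependency>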

See http://archive-primary.cloudera.com/cdh5/cdh/5/hadoop-2.6.0-cdh5.13.0/hadoop-project-dist/hadoop-common/NativeLibraries.html

Askance answered 12/7, 2018 at 8:7 Comment(0)

For me, setting the VM argument -Djava.library.path=C:\devTools\winutils-master\hadoop-3.0.0 resolved the issue.

Devoirs answered 9/1, 2019 at 14:54 Comment(1)
Cryo: For me also. Thanks!

Check your Java version. If your Java is a 32-bit build, you need to uninstall it and re-install a 64-bit build, since Hadoop's native Windows libraries are 64-bit.

Check commands:

java -d32 -version (no error if it is a 32-bit JVM)

java -d64 -version (no error if it is a 64-bit JVM)
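The same check is possible from inside Java; os.arch is a standard system property, while sun.arch.data.model is HotSpot-specific and may be absent on other JVMs:

public class JvmBitness {
    public static void main(String[] args) {
        // "64" on a 64-bit HotSpot JVM, "32" on a 32-bit one
        System.out.println("sun.arch.data.model = " + System.getProperty("sun.arch.data.model"));
        // e.g. "amd64" on a 64-bit Windows JVM, "x86" on a 32-bit one
        System.out.println("os.arch = " + System.getProperty("os.arch"));
    }
}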

Adjoining answered 5/4, 2019 at 20:47 Comment(0)

Downloading the hadoop.dll and winutils.exe files from hadoop-3.0.0 resolved it for me:

https://github.com/steveloughran/winutils/tree/master/hadoop-3.0.0/bin

Hereabout answered 29/9, 2020 at 6:17 Comment(1)
Askance: How did it solve the described problem? In what situation did it occur for you? Please elaborate; see How to Answer.

In my case, the issue was that I had hadoop.dll in System32. If you run Hadoop on Windows and it finds hadoop.dll in System32 or in your HADOOP_HOME\bin, it assumes it is running on a cluster, and that cluster mode is not compatible with Windows, so it fails.

Solution: delete hadoop.dll from System32 and HADOOP_HOME\bin.

See: Full answer
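A small sketch to hunt down stray copies of the DLL; it checks System32 plus every directory on PATH (the System32 location is assumed to be the default one):

import java.io.File;

public class FindHadoopDll {
    public static void main(String[] args) {
        // Look for hadoop.dll in the Windows system directory and on every PATH entry
        String candidates = "C:\\Windows\\System32";
        String path = System.getenv("PATH");
        if (path != null) {
            candidates += File.pathSeparator + path;
        }
        for (String dir : candidates.split(File.pathSeparator)) {
            File dll = new File(dir, "hadoop.dll");
            if (dll.isFile()) {
                System.out.println("found: " + dll.getAbsolutePath());
            }
        }
    }
}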

Miseno answered 2/8, 2022 at 14:44 Comment(0)

I had this issue, and it turned out HADOOP_HOME was set to a 2.6.4 version folder. Updating it to a 3.0.0 folder made it work. In general, it seems you have to point to the 3.0.0 folder either on the calling end with -Djava.library.path or in your environment settings.

As for the environment settings: HADOOP_HOME is a system environment variable, which you can edit via the command

rundll32.exe sysdm.cpl,EditEnvironmentVariables

This can be entered into the Run dialog (⊞ Win + R) followed by Ctrl + Shift + Enter to run it elevated, or into PowerShell or cmd opened as admin.

From there I set the system environment variable HADOOP_HOME to the 3.0.0 folder and updated the system PATH to include HADOOP_HOME/bin. Make sure there are no conflicting entries in your user variables, such as in the user PATH.

After that, any terminal or program calling Spark should be restarted and checked to make sure it has picked up the new environment variables.
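To verify that a restarted JVM really sees the new values, a trivial check:

public class CheckEnv {
    public static void main(String[] args) {
        // Environment variables are captured at process start, hence the restart
        System.out.println("HADOOP_HOME = " + System.getenv("HADOOP_HOME"));
        System.out.println("PATH = " + System.getenv("PATH"));
    }
}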

Sidonius answered 28/9, 2022 at 9:28 Comment(0)

Even though my pom.xml sets version 2.6.0-cdh5.14.0, I installed both winutils.exe and hadoop.dll from hadoop-2.7.1. After that, some projects started to work, but for others the problem remained.
So I removed the file NativeIO.java from one of the src subfolders (a local copy there shadows the class inside the Hadoop jar) and then ran Maven clean, validate, and compile. This way, all the projects started to run correctly.

Petrapetracca answered 7/4 at 8:52 Comment(0)

I was having the same issue when writing Parquet files in Spark. Downloading the hadoop.dll and winutils.exe files from hadoop-3.0.0, moving hadoop.dll to the C:\Windows\System32 folder, and moving winutils.exe to the C:\hadoop\bin folder solved my problem.

Thanks to Shailendra Singh for sharing the above link.

Demesne answered 21/8, 2021 at 18:18 Comment(1)
Askance: This answer was already given by Shailendra. It should be a comment (for which you need to earn reputation) or an edit of Shailendra's answer.
