UnsatisfiedLinkError (NativeIO$Windows.access0) when submitting a MapReduce job to Hadoop 2.2 from Windows to Ubuntu
I submit my MapReduce jobs from a Java application running on Windows to a Hadoop 2.2 cluster running on Ubuntu. In Hadoop 1.x this worked as expected, but on Hadoop 2.2 I get a strange error:

java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z

I compiled the necessary Windows libraries (hadoop.dll and winutils.exe) and can access HDFS via code and read the cluster information using the Hadoop API. Only the job submission does not work.

Any help is appreciated.

Solution: I found it out myself: the path where the Windows Hadoop binaries can be found has to be added to the PATH variable of Windows.
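Since the native loader only reports the failure at job-submission time, it can help to check up front whether hadoop.dll is actually reachable through PATH. The sketch below is a minimal, hypothetical helper (the class and method names `PathCheck`, `splitPath`, and `findLibrary` are my own, not part of any Hadoop API); it just walks the PATH entries the same way the Windows loader would:

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

// Hypothetical helper: verify hadoop.dll is discoverable before submitting a job.
public class PathCheck {

    // Split a PATH-style value on ';' (the Windows separator).
    static List<String> splitPath(String pathValue) {
        List<String> dirs = new ArrayList<>();
        for (String dir : pathValue.split(";")) {
            if (!dir.isEmpty()) dirs.add(dir);
        }
        return dirs;
    }

    // Return the full path of fileName if it exists in one of the PATH dirs, else null.
    static String findLibrary(String pathValue, String fileName) {
        for (String dir : splitPath(pathValue)) {
            File candidate = new File(dir, fileName);
            if (candidate.isFile()) return candidate.getAbsolutePath();
        }
        return null;
    }

    public static void main(String[] args) {
        // On Windows, the JVM resolves hadoop.dll through PATH, so check it up front.
        String path = System.getenv("PATH");
        String hit = (path == null) ? null : findLibrary(path, "hadoop.dll");
        System.out.println(hit != null
                ? "hadoop.dll found at " + hit
                : "hadoop.dll not on PATH - expect UnsatisfiedLinkError on Windows");
    }
}
```

Running this in the same JVM that submits the job tells you immediately whether the PATH fix above has taken effect, since environment changes are only picked up by newly started processes.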

Loki answered 14/12, 2013 at 13:55 Comment(4)
Hi, add the msvcr100.dll file to the '${HADOOP_HOME}\bin' path. I faced the same problem. Tallu
I think the answer at https://mcmap.net/q/176093/-running-apache-hadoop-2-1-0-on-windows might help you here; it shows how you can check if there are some MSVC system libraries missing on your box. Baillieu
Possible duplicate of Running Apache Hadoop 2.1.0 on Windows. Baillieu
Possible duplicate of Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z. Centum
  1. Get hadoop.dll (or libhadoop.so on *nix). Make sure its bitness (32- vs. 64-bit) matches your JVM's.
  2. Make sure it is available via PATH or java.library.path.

    Note that setting java.library.path overrides PATH. If you set java.library.path, make sure it is correct and contains the Hadoop library.
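To see which directories the JVM will actually search, and whether the native library loads, a quick probe like the following can be run with the same JVM flags as the real application (the class name `LibraryPathProbe` is made up for illustration; `System.loadLibrary("hadoop")` is the same lookup Hadoop's NativeIO performs internally):

```java
public class LibraryPathProbe {
    public static void main(String[] args) {
        // When -Djava.library.path is set explicitly, the JVM searches ONLY those
        // directories and ignores PATH, so a correct PATH alone is not enough.
        System.out.println("java.library.path = "
                + System.getProperty("java.library.path"));
        try {
            // Attempts to load hadoop.dll (Windows) / libhadoop.so (*nix).
            System.loadLibrary("hadoop");
            System.out.println("native hadoop library loaded");
        } catch (UnsatisfiedLinkError e) {
            System.out.println("native hadoop library not found: " + e.getMessage());
        }
    }
}
```

If the probe prints "not found", the directory containing hadoop.dll is missing from whichever of the two settings is in effect.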

Gunflint answered 4/9, 2018 at 11:19 Comment(0)

This error generally occurs due to a mismatch between the binaries in your %HADOOP_HOME%\bin folder and your Hadoop version. So, what you need to do is get hadoop.dll and winutils.exe built specifically for your Hadoop version.

Get hadoop.dll and winutils.exe for your specific Hadoop version and copy them to your %HADOOP_HOME%\bin folder.
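A small sanity check along these lines can confirm the copy step succeeded before you retry the job (the class `HadoopHomeCheck` and its `check` method are hypothetical names for this sketch, not a Hadoop utility):

```java
import java.io.File;

// Hypothetical sketch: report whether %HADOOP_HOME%\bin contains the
// Windows binaries this answer says must be present.
public class HadoopHomeCheck {

    static String check(String hadoopHome) {
        if (hadoopHome == null) return "HADOOP_HOME is not set";
        File bin = new File(hadoopHome, "bin");
        for (String required : new String[] {"hadoop.dll", "winutils.exe"}) {
            if (!new File(bin, required).isFile()) {
                return "missing " + required + " in " + bin;
            }
        }
        return "ok";
    }

    public static void main(String[] args) {
        System.out.println(check(System.getenv("HADOOP_HOME")));
    }
}
```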

Metic answered 17/10, 2017 at 5:43 Comment(0)

I had been having issues with my Windows 10 Hadoop installation where the NameNode and DataNode would not start due to a mismatch in the binary files. The issue was resolved after I replaced the bin folder with one corresponding to my Hadoop version. Possibly the bin folder that came with the installation was for a different version; I don't know how it happened. If all your configurations are intact, you might want to replace the bin folder with a version that corresponds to your Hadoop installation.

Featherhead answered 21/11, 2019 at 4:19 Comment(0)

After putting hadoop.dll and winutils.exe in the "hadoop\bin" folder and adding that folder to PATH, we also needed to copy hadoop.dll into the "C:\Windows\System32" folder to fix this exception.
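If you automate that copy step, something like the sketch below works; it needs an elevated (Administrator) prompt to write into System32, and the class name `CopyDll` and method `copyInto` are invented for this example:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

// Hypothetical sketch: copy hadoop.dll into a target directory such as
// C:\Windows\System32. Requires Administrator rights on Windows.
public class CopyDll {

    // Returns false (instead of throwing) if the copy fails, e.g. missing source.
    static boolean copyInto(String sourceFile, String targetDir) {
        try {
            Path src = Paths.get(sourceFile);
            Files.copy(src, Paths.get(targetDir).resolve(src.getFileName()),
                    StandardCopyOption.REPLACE_EXISTING);
            return true;
        } catch (IOException e) {
            System.err.println("copy failed: " + e);
            return false;
        }
    }

    public static void main(String[] args) {
        String hadoopHome = System.getenv("HADOOP_HOME");
        if (hadoopHome != null) {
            copyInto(hadoopHome + "\\bin\\hadoop.dll", "C:\\Windows\\System32");
        }
    }
}
```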

Analogize answered 6/6, 2024 at 8:28 Comment(0)

© 2022 - 2025 — McMap. All rights reserved.