How do I make Hadoop find imported Python modules when using Python UDFs in Pig?
I am using Pig (0.9.1) with UDFs written in Python. The Python scripts import modules from the standard Python library. I have been able to run the Pig scripts that call the Python UDFs successfully in local mode, but when I run on the cluster it appears that Pig's generated Hadoop job is unable to find the imported modules. What needs to be done?

For example:

  • Does python (or jython) need to be installed on each task tracker node?
  • Do the python (or jython) modules need to be installed on each task tracker node?
  • Do the task tracker nodes need to know how to find the modules?
  • If so, how do you specify the path (via an environment variable - how is that done for the task tracker)?
Vickievicksburg answered 20/10, 2011 at 5:47

Does python (or jython) need to be installed on each task tracker node?

Yes, since the UDF is executed on the task tracker nodes.

Do the python (or jython) modules need to be installed on each task tracker node?

If you are using 3rd-party modules (like geoip, etc.), they need to be installed on the task tracker nodes as well.

Do the task tracker nodes need to know how to find the modules? If so, how do you specify the path (via an environment variable - how is that done for the task tracker)?

As the book "Programming Pig" puts it:

register is also used to locate resources for Python UDFs that you use in your Pig Latin scripts. In this case you do not register a jar, but rather a Python script that contains your UDF. The Python script must be in your current directory.
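For illustration, a minimal Pig Latin sketch of registering a Python UDF script from the current directory (the file name `my_udfs.py` and the function `to_upper` are hypothetical, not from the original question):

```pig
-- my_udfs.py is a hypothetical Python UDF file sitting in the
-- directory you invoke pig from
register 'my_udfs.py' using jython as my_udfs;

data = load 'input.txt' as (name:chararray);
upper = foreach data generate my_udfs.to_upper(name);
dump upper;
```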

This point is also important:

A caveat, Pig does not trace dependencies inside your Python scripts and send the needed Python modules to your Hadoop cluster. You are required to make sure the modules you need reside on the task nodes in your cluster and that the PYTHONPATH environment variable is set on those nodes such that your UDFs will be able to find them for import. This issue has been fixed after 0.9, but as of this writing not yet released.

And if you are using Jython:

Pig does not know where on your system the Jython interpreter is, so you must include jython.jar in your classpath when invoking Pig. This can be done by setting the PIG_CLASSPATH environment variable.
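For example, something like the following before invoking Pig (the path to jython.jar is an assumption; point it at wherever Jython actually lives on your system):

```shell
# hypothetical jython.jar location -- adjust to your actual Jython install
export PIG_CLASSPATH=/usr/share/java/jython.jar
```

Then invoke pig as usual and it will be able to find the Jython interpreter.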

In summary: if you are using streaming, you can use Pig's SHIP command, which sends your executable files to the cluster. If you are using a UDF, then as long as it can be compiled (see the note about Jython above) and has no 3rd-party dependencies (beyond what you have already put on the PYTHONPATH or installed on the cluster), the UDF will be shipped to the cluster when the script is executed. (As a tip, your life will be much easier if you put your UDF's simple dependencies in the same folder as the Pig script when registering.)
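A sketch of the streaming case with SHIP (the script name `my_script.py` is hypothetical):

```pig
-- ship() copies the local executable to every task node in the cluster
define my_cmd `my_script.py` ship('my_script.py');
data = load 'input.txt' as (line:chararray);
out = stream data through my_cmd;
```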

Hope this clears things up.

Diminutive answered 20/10, 2011 at 7:9

Adding

pig -Dmapred.child.env="JYTHONPATH=job.jar/Lib" script.pig

works. Note that you could also add the following lines to your Python script:

import sys
sys.path.append('./Lib')

Also note that you will still get numerous "module not found" warnings, but the fix works. The fact that you get these warnings even though the modules are in fact found eventually was incredibly confusing to me; I kept killing the Hadoop job before it completed correctly, believing the warnings to be a symptom of the fix not actually working...

New answered 18/8, 2014 at 18:48

I encountered the same issue using Hadoop 1.2.1 and Pig 0.11.1 and found a workaround from PIG-2433, which was to add -Dmapred.child.env="JYTHONPATH=job.jar/Lib" to my Pig arguments. Example:

pig -Dmapred.child.env="JYTHONPATH=job.jar/Lib" script.pig
Chantey answered 2/7, 2014 at 23:25 Comment(2)
@dksahuji: When Pig launches map-reduce jobs, it puts all the classes and files it needs in a job.jar file that gets sent to all the MR tasks. (Chantey)
But generally it is named like jobxxxxxx.jar, where xxxxx changes; a single Pig script launches multiple jobxxxxx.jar files. (Genteelism)
