Can I add arguments to python code when I submit spark job?

I'm trying to use spark-submit to execute my Python code in a Spark cluster.

Generally, we run spark-submit with Python code like below.

# Run a Python application on a cluster
./bin/spark-submit \
  --master spark://207.184.161.138:7077 \
  my_python_code.py \
  1000

But I want to run my_python_code.py by passing several arguments. Is there a smart way to pass arguments?

Goldina answered 26/8, 2015 at 2:43 Comment(0)

Yes: Put this in a file called args.py

import sys

print(sys.argv)  # the first entry is the script path

If you run

spark-submit args.py a b c d e 

You will see:

['/spark/args.py', 'a', 'b', 'c', 'd', 'e']
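
Note that sys.argv[0] is the script path and every other entry arrives as a string, so numeric arguments need an explicit conversion. A minimal sketch, assuming the job is launched as spark-submit args.py alice 3:

import sys

# user arguments start at index 1 and are always strings
name = sys.argv[1]
count = int(sys.argv[2])  # convert numeric arguments explicitly
print(name, count)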
Terms answered 26/8, 2015 at 2:50 Comment(0)

Even though sys.argv is a good solution, I still prefer this more proper way of handling command-line args in my PySpark jobs:

import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--ngrams", help="some useful description.")
args = parser.parse_args()
if args.ngrams:
    ngrams = args.ngrams

This way, you can launch your job as follows:

spark-submit job.py --ngrams 3

More information about the argparse module can be found in the Argparse Tutorial.
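
A slightly fuller sketch (the flag name and default here are illustrative), giving the argument a type and a default so the job still runs when the flag is omitted:

import argparse

parser = argparse.ArgumentParser()
# parsed to int; defaults to 1 when --ngrams is not supplied
parser.add_argument("--ngrams", type=int, default=1,
                    help="size of the n-grams to compute")
args = parser.parse_args()
print(args.ngrams)

Keep spark-submit's own options before the script name and the script's options after it, e.g. spark-submit --master spark://207.184.161.138:7077 job.py --ngrams 3.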

Porche answered 27/5, 2016 at 13:18 Comment(5)
Not working! The result says "[TerminalIPythonApp] CRITICAL | Unrecognized flag: '--ngrams'"Inandin
If you have configs you want to send with your spark submit job, make sure to run with config info right after spark-submit, like: spark-submit --master somemasterurl job.py --ngrams 3Catalyze
Haven't tried this solution but this sounds a better one because it can remove the dependency on argument sequence.Cheyenne
Has anybody figured how to use Pyspark with argparse? I'm continually getting an error Unrecognized flag --arg1 and it's driving me insane! (Spark 2.4.4 and Python 3.6)Myriapod
I have a whole JSON of parameters that I need to pass. Is there a simple way of getting that embedded in the spark-submit command? (See the sketch below.)Doroteya
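
For the JSON question above, a minimal sketch of one possible approach (the --params flag is hypothetical): pass the whole document as a single quoted string and let argparse deserialize it.

import argparse
import json

parser = argparse.ArgumentParser()
# json.loads acts as the type converter, so the value must be
# one quoted, well-formed JSON string
parser.add_argument("--params", type=json.loads, default={})
args = parser.parse_args()
print(args.params)

Launched, for example, as spark-submit job.py --params '{"ngrams": 3}'.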

You can pass the arguments from the spark-submit command and then access them in your code in the following way:

sys.argv[1] will get you the first argument, sys.argv[2] the second argument, and so on. Refer to the example below.

You can create code as below to take the arguments that you will pass in the spark-submit command:

import sys

# first argument: how many table names follow
n = int(sys.argv[1])
# the next n arguments are the table names themselves
tables = sys.argv[2:2 + n]
print(tables)

Save the above file as PysparkArg.py and execute the below spark-submit command,

spark-submit PysparkArg.py 3 table1 table2 table3

Output:

['table1', 'table2', 'table3']

This pattern is useful in PySpark jobs that fetch multiple tables from a database, where the user supplies the number of tables and their names when executing the spark-submit command.
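
A minimal sketch of that follow-up step, assuming the table names are registered in the cluster's metastore (spark.read.table reads a catalog table by name):

import sys

from pyspark.sql import SparkSession

n = int(sys.argv[1])
tables = sys.argv[2:2 + n]

spark = SparkSession.builder.appName("PysparkArg").getOrCreate()
# load each requested table into a DataFrame, keyed by its name
dataframes = {name: spark.read.table(name) for name in tables}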

Angelangela answered 19/9, 2019 at 16:14 Comment(0)

Aniket Kulkarni's spark-submit args.py a b c d e seems to suffice, but it's worth mentioning that we had issues with optional/named args (e.g. --param1).

It appears that a double dash -- helps signal that Python optional args follow:

spark-submit --sparkarg xxx yourscript.py -- --scriptarg 1 arg1 arg2
Lambrequin answered 20/2, 2020 at 1:17 Comment(0)

Ah, it's possible. http://caen.github.io/hadoop/user-spark.html

# Run as a Hadoop job on <your_queue>, with 10 executors,
# 12g of memory and 2 CPU cores per executor. (Comments must go
# above the command: in the shell, a comment after a trailing
# backslash breaks the line continuation.)
spark-submit \
    --master yarn-client \
    --queue <your_queue> \
    --num-executors 10 \
    --executor-memory 12g \
    --executor-cores 2 \
    job.py ngrams/input ngrams/output
Goldina answered 26/8, 2015 at 2:45 Comment(1)
I think the question is not how to pass them in but rather how to access the arguments once they were passed inDeviant
