How to pass command line parameters from a file

I have a C program that reads command line arguments from argv. Is it possible to make a pipe to redirect the contents of a file as command line arguments to my program? Suppose I have a file arguments.dat with this content:

0 0.2 302 0

And I want my program to be called with:

./myprogram 0 0.2 302 0

I tried the following:

cat arguments.dat | ./myprogram

without success.

Isia answered 5/7, 2011 at 15:20 Comment(0)

With most shells, you can insert the contents of a file into a command line with $(<filename):

./myprogram $(<arguments.dat)

If your shell doesn't support that, then one of the older ways will work:

./myprogram $(cat arguments.dat)
./myprogram `cat arguments.dat`   # need this one with csh/tcsh

(You do know the difference between command line arguments and file input, right? Why would you expect to pipe command line arguments into a program?)
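
A minimal illustration of the word splitting involved, using the arguments.dat from the question (the quoted form is shown only for contrast):

printf '0 0.2 302 0\n' >arguments.dat
./myprogram $(<arguments.dat)     # word-split: argv receives 0, 0.2, 302 and 0 as separate arguments
./myprogram "$(<arguments.dat)"   # quoted: argv receives the single string "0 0.2 302 0"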

Fem answered 5/7, 2011 at 15:24 Comment(3)
"older ways" will not split arguments. All data from file will be loaded as single argument.Seignior
@osgx: That is simply untrue. To make it a single argument, you would need to put it inside double quotes.Unmeaning
If your file contains "first argument" "second argument", this would pass "first as one argument, argument" as the second, and "second as the third. If it contains *.txt, it'll replace that with a list of text files, rather than keeping it literal. If the file contains "*.txt", the quotes will be passed as a literal part of the string (if the user doesn't have nullglob or failglob causing the prior to be treated as a failed glob... well, failed if there aren't any files with names that start and end with literal double-quote characters).Rustcolored

xargs is your answer:

cat arguments.dat | xargs ./myprogram

Or easier:

xargs -a arguments.dat ./myprogram

Check the manual for the many ways to customize xargs. For example, you can read line-by-line rather than by word, and you can use the arguments in more complex replacements.
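
For instance, a quick sketch of the line-oriented variants (GNU xargs; the -d and -a options are not available in BSD xargs):

xargs -d '\n' -a arguments.dat ./myprogram     # each input line becomes exactly one argument
xargs -a arguments.dat -I {} ./myprogram {}    # -I substitutes one whole line per invocation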

Comintern answered 5/7, 2011 at 15:21 Comment(6)
There are some caveats here -- if your file contains more arguments than fit on a single command line (and keep in mind that environment variables and command-line arguments use the same space -- so exporting too many, or too-large, environment variables reduces the maximum command-line length), xargs will run ./myprogram more than once, splitting the arguments across the invocations.Rustcolored
@CharlesDuffy: Good point. That may be either a bug or a much-wanted feature, depending on the use case. In any case, I added another mechanism.Comintern
@CharlesDuffy: Yes, indeed. Bash also has a way to let you edit a command line with an editor. And xargs has many advanced ways of operating that let you quote the arguments and read line-wise and all that.Comintern
@CharlesDuffy: Yeah, that's definitely what I'd do in a reusable script. If it's for a one-off use, though, where you roughly know what the arguments are, the simple xargs is often all you need.Comintern
Sure. I'm trying to make an argument about correctness because a StackOverflow answer is by nature reusable -- it's (ideally) going to be read and used by a whole lot of people (in a lot of different scenarios, some of whose use cases may vary from the OP's), so time and effort spent on getting the details right is well worthwhile. That said, if I should just go add a competing answer, feel free to tell me so. :)Rustcolored
@CharlesDuffy: With ls -1 I'd use xargs -d '\n'... but why don't you post an answer? That should definitely go into a proper answer.Comintern

If You Don't Want Arguments To Be Silently Split

...which is to say: the approaches below apply to cases where it wouldn't be acceptable for ./myprogram --first-argument "first value" to be silently changed into ./myprogram --first-argument; ./myprogram "first value".

If your arguments are one-to-a-line literals

That is, if your input looks like:

--first-argument
first value
--second-argument
second value

and you mean this to run:

./myprogram --first-argument "first value" --second-argument "second value"

...then you should use (with bash 4.0 or later):

readarray -t args <arguments.dat
./myprogram "${args[@]}"

...or (for bash 3.x as well):

args=( )
while IFS= read -r arg; do
  args+=( "$arg" )
done <arguments.dat
./myprogram "${args[@]}"
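
A quick, purely illustrative way to sanity-check what actually landed in the array before invoking the program:

printf '<%s>\n' "${args[@]}"   # prints each collected argument on its own line, bracketed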

If your arguments are provided with quotes or escaping to distinguish them

That is, if your file contains something like (note that newlines and unquoted spaces behave identically here):

--first-argument "first value"
--second-argument "second value"

...and you mean this to run:

./myprogram --first-argument "first value" --second-argument "second value"

...then you should use:

args=( )
while IFS= read -r -d '' arg; do
  args+=( "$arg" )
done < <(xargs printf '%s\0' <arguments.dat)
./myprogram "${args[@]}"
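
Here xargs does the quote and escape parsing and printf re-emits each parsed token NUL-terminated. To inspect the intermediate stream this produces (purely illustrative):

printf '%s\n' '--first-argument "first value"' | xargs printf '%s\0' | od -c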

If you control your argument format

Use NUL-delimited values. That is, create the file like so:

printf '%s\0' "argument one" "argument two" >arguments.dat

...and parse it as follows:

args=( )
while IFS= read -r -d '' arg; do
  args+=( "$arg" )
done <arguments.dat
./myprogram "${args[@]}"

This will work with all possible argument values, even ones with literal newlines, literal quotes, literal backslashes, or other nonprintable characters. (Literal NULs are not possible in UNIX command lines, since command lines are composed of NUL-terminated strings; thus, NUL is the only character which is completely safe to use to unambiguously separate arguments in a string).
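
As a small sketch of the round-trip safety claimed above (an argument containing a literal newline and one containing quotes both survive intact):

printf '%s\0' $'line one\nline two' 'has "quotes"' >arguments.dat
args=( ); while IFS= read -r -d '' arg; do args+=( "$arg" ); done <arguments.dat
printf '<%s>\n' "${args[@]}"   # the embedded newline and the quotes come back unchanged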


If Splitting Arguments Across Invocations Is Desired

This subsection is relevant if the desired result (when there are more arguments in your file than can be passed to an invocation of your program) is multiple distinct invocations of the program, each one receiving a subset of arguments. This is a family of cases where xargs is the right tool for the job.
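
A quick illustration of that splitting behaviour (the -n 2 limit is artificial, purely to force multiple invocations; a real run splits at the system's argument-length limit):

printf '%s\n' a b c d | xargs -n 2 echo ./myprogram
# prints:
# ./myprogram a b
# ./myprogram c d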

If on a GNU platform, you may want to run xargs -a arguments.dat instead of redirecting stdin; however, this isn't supported with BSD xargs (as on MacOS), and so is not demonstrated here.

If your arguments are one-to-a-line literals

With GNU xargs (most Linux platforms):

xargs -d $'\n' ./myprogram <arguments.dat

With BSD xargs (MacOS, FreeBSD/OpenBSD/etc):

xargs -0 ./myprogram < <(tr '\n' '\0' <arguments.dat)

If your arguments are provided with quotes or escaping to distinguish them

xargs ./myprogram <arguments.dat

If you've generated NUL-delimited inputs

xargs -0 ./myprogram <arguments.dat
Rustcolored answered 19/6, 2018 at 21:24 Comment(0)
