ARG_MAX
the maximum length of arguments for a new process
You will see the error message "Argument list too long" if you try to call a program with too many arguments, most likely in connection with pattern matching:
$ command *
command: Argument list too long
Only the exec() system call and its direct variants yield this error; they fail with the error code E2BIG. The shell is not to blame, it merely delivers the error to you.
In fact, shell expansion itself is not the problem, because no exec() is needed at that point; expansion is limited only by the resources of the virtual memory system. Thus the following commands work smoothly: instead of handing too many arguments to a new process, they only make use of a shell built-in (echo) or iterate over the arguments with a control structure (a for loop):
/dir-with-many-files$ echo * | wc -c
/dir-with-many-files$ for i in * ; do grep ARG_MAX "$i"; done
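If an external command really has to be run over all the files, the argument list can also be split into sufficiently small chunks. A brief sketch, merely illustrative and not covered above (-0 is an extension found in GNU and BSD xargs): since printf is a shell built-in here, the expansion again never reaches an exec(), and xargs then runs grep as often as necessary with argument lists below the limit:
/dir-with-many-files$ printf '%s\0' * | xargs -0 grep -l ARG_MAX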
There are different ways to learn the upper limit:
command: getconf ARG_MAX
system call: sysconf(_SC_ARG_MAX)
system header: ARG_MAX in e.g. <[sys/]limits.h>
In contrast to the headers, sysconf() and getconf report the limit that is actually in effect. This is relevant on systems which allow changing it: at run time by reconfiguration, by recompiling the kernel (e.g. Linux), or by applying patches (HP-UX 10).
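For example, on the command line (the value shown here is from a Linux system with the default 8 MB stack limit, cf. the last section):
$ getconf ARG_MAX
2097152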
Example usage of sysconf():
#include <stdio.h>
#include <unistd.h>

int main(void) {
    /* sysconf() returns -1 if the limit is indeterminate */
    printf("ARG_MAX: %ld\n", sysconf(_SC_ARG_MAX));
    return 0;
}
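Compiled and run, it might print (again on the Linux system mentioned above; the file name is arbitrary):
$ cc argmax.c -o argmax && ./argmax
ARG_MAX: 2097152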
A handy way to find the limits in your headers, if you have cpp installed (ignore complaints about headers which do not exist on your system):
cpp <<EOF
#include <limits.h>
#include <param.h>
#include <params.h>
#include <sys/limits.h>
#include <sys/param.h>
#include <sys/params.h>
arg_max: ARG_MAX
ncargs: NCARGS
EOF
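In the output of cpp, a defined macro appears replaced by its value. On Linux with glibc, for example, <limits.h> pulls in <linux/limits.h>, and one of the resulting lines reads:
arg_max: 131072
(A good reminder that a header value need not match the limit actually in effect.)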
When looking at ARG_MAX/NCARGS, you have to consider the space consumed by both argv[] and envp[] (arguments and environment). Thus, for a good estimate of the currently available space, you have to decrease ARG_MAX at least by the result of env|wc -c (the environment strings) and of env|wc -l * 4 (per-entry overhead, e.g. one pointer on a 32-bit system).
POSIX suggests subtracting an additional 2048 so that the process may safely modify its environment. A quick estimation with the getconf command:
expr `getconf ARG_MAX` - `env|wc -c` - `env|wc -l` \* 4 - 2048
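With purely hypothetical numbers, an ARG_MAX of 2097152 and an environment of 25 variables totalling 500 bytes: 2097152 - 500 - 25*4 - 2048 = 2094504 bytes remain. (The backslash only protects the * from the shell; within expr, * has the usual higher precedence, so the multiplication is evaluated first.)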
The most reliable way to determine the currently available space is to test the success of an exec() with increasing argument length until it fails. This may be expensive, but at least you need to check only once; the length of envp[] is taken into account automatically, and the result is reliable.
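A minimal sketch of such a probe (assuming a POSIX system where /bin/true exists; the chunk size and all names are arbitrary choices, not part of any standard interface). It execs /bin/true in a child process with a growing number of moderately sized arguments, because some systems also limit the length of a single argument, and binary-searches the boundary where exec() starts failing with E2BIG:

#include <errno.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/wait.h>
#include <unistd.h>

static int try_exec(size_t nargs, size_t chunk_len)
{
    pid_t pid = fork();
    if (pid == -1) {
        perror("fork");
        exit(EXIT_FAILURE);
    }
    if (pid == 0) {
        /* child: build nargs identical arguments and try to exec */
        char *chunk = malloc(chunk_len + 1);
        char **args = malloc((nargs + 2) * sizeof *args);
        size_t i;
        if (chunk == NULL || args == NULL)
            _exit(127);
        memset(chunk, 'x', chunk_len);
        chunk[chunk_len] = '\0';
        args[0] = "/bin/true";
        for (i = 1; i <= nargs; i++)
            args[i] = chunk;
        args[nargs + 1] = NULL;
        execv("/bin/true", args);
        /* reached only if exec failed; E2BIG is the expected reason */
        _exit(errno == E2BIG ? 1 : 127);
    }
    /* parent: exec succeeded iff /bin/true ran and exited with 0 */
    int status;
    if (waitpid(pid, &status, 0) == -1) {
        perror("waitpid");
        exit(EXIT_FAILURE);
    }
    return WIFEXITED(status) && WEXITSTATUS(status) == 0;
}

int main(void)
{
    /* moderate chunks, because some systems also limit
       the length of a single argument */
    const size_t chunk_len = 1024;
    size_t lo = 1, hi = 2;

    while (try_exec(hi, chunk_len)) {   /* grow until exec() fails */
        lo = hi;
        hi *= 2;
    }
    while (hi - lo > 1) {               /* binary-search the boundary */
        size_t mid = lo + (hi - lo) / 2;
        if (try_exec(mid, chunk_len))
            lo = mid;
        else
            hi = mid;
    }
    printf("exec() still succeeded with %zu args of %zu bytes (ca. %zu bytes)\n",
           lo, chunk_len, lo * (chunk_len + 1));
    return 0;
}

Since the child inherits the parent's environment, the length of envp[] is accounted for automatically, as described above.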
Alternatively, the GNU autoconf check "Checking for maximum length of command line arguments..." can be used; it works quite similarly. However, it results in a much lower value (it can be as little as a fourth of the actual value), both by intention and for reasons of simplicity: in a loop, the check tries an exec() with an argument length of 2^n for increasing n, but stops at 512 kB. The maximum it can find is therefore ARG_MAX/2 if ARG_MAX is a power of 2. Finally, the value found is divided by 2 (for safety), with the reasoning that "C++ compilers can tack on massive amounts of additional arguments".
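A hypothetical example to illustrate the numbers: if ARG_MAX is exactly 262144 (256 kB), an exec() with 256 kB of arguments already fails, because the environment and pointer overhead also count against the limit; the largest successful doubling step is 128 kB, i.e. ARG_MAX/2. Halved again for safety, the check would report 64 kB, a fourth of the actual limit.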
The actual value
Since Linux 2.6.23, it is a fourth of the stack size limit (RLIMIT_STACK); see the kernel code for reference.
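This relation can be checked directly, e.g. in bash, where ulimit -s prints the stack size limit in kB (8192 is the common default):
$ echo $(( $(ulimit -s) * 1024 / 4 ))
2097152
$ getconf ARG_MAX
2097152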