Passing shell script arguments to a subprocess

March 8th, 2012

So I want to create a shell script that ultimately exec's a command, after doing something like, say, setting an environment variable first:

export MY_VAR=MY_VAL
exec my_command $*

(The point of using `exec my_command` rather than plain `my_command` is to not leave a /bin/sh process waiting for my_command that shows up in pstree and makes my_command not see its "real" parent process and so on.)
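A quick way to see what exec buys you, in pure sh (nothing here is specific to my script): a command started via exec keeps the shell's PID instead of becoming its child.

```shell
# A shell started via exec replaces the current process instead of
# forking a child, so both lines below print the same PID:
sh -c 'echo "outer pid: $$"; exec sh -c "echo inner pid: \$\$"'
```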

Simple enough, except it doesn't work. If you run the script with `script "a b" c`, my_command's arguments will be a b c (that's three arguments instead of the original two).

(Update – as pointed out in the comments, "$@" instead of $* works fine, and is perfectly sufficient for my example where all the script does is setting an env var. "$@" isn't enough if you need to fiddle with the arguments – if you need to do that, read on, otherwise just use "$@".)
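For the record, here's the difference in action, simulating a script called as `script "a b" c`:

```shell
# Simulate being called as: script "a b" c
set -- "a b" c

echo 'unquoted $* re-splits on whitespace:'
for arg in $*; do printf '  <%s>\n' "$arg"; done      # <a> <b> <c>

echo '"$@" preserves the original word boundaries:'
for arg in "$@"; do printf '  <%s>\n' "$arg"; done    # <a b> <c>
```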

A common workaround seems to be, you iterate over the arguments and you quote them and then eval:

export MY_VAR=MY_VAL
args=""
for arg in "$@"; do
  args="$args '$arg'"
done
eval exec my_command $args

Not so simple, but works better: "a b" c will indeed be passed to my_command as "a b" c.

However, it doesn't work if the arguments contain single quotes. If you pass "'a'" (that's double quote, single quote, the character a, single quote, double quote), my_command will get plain a. If you pass "'a b'" (double, single, a, space, b, single, double), my_command will get two arguments, a b, instead of one, 'a b'.

What to do? One potential workaround is escaping quotes: replacing every ' with '\'', etc. Perhaps someone with a sufficiently thorough understanding of the Unix shell could pull it off; personally, I wouldn't trust myself to take care of all the special cases, or even to fully enumerate them.
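For completeness, here's a sketch of that escaping approach; the quote() helper name is my own, but the '\'' trick (close the quote, emit an escaped ', reopen the quote) is the standard one:

```shell
# quote: wrap a string in single quotes, turning each embedded '
# into '\'' so the result survives a later eval.
quote() {
  printf "'%s'" "$(printf '%s' "$1" | sed "s/'/'\\\\''/g")"
}

args=""
for arg in "$@"; do
  args="$args $(quote "$arg")"
done
# eval exec my_command $args   # now survives arguments containing '
```

(One caveat: command substitution strips trailing newlines, so arguments ending in newlines would still lose them.)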

So instead, what works for me (or so I hope) is, instead of creating a string of values by concatenating arguments, I make a string of references to the arguments using the variables $1, $2, etc. How many of those are needed – is the last one $3 or $7? Ah, we can use $# to figure that out:

export MY_VAR=MY_VAL
nargs=$#
args=""
while [ $nargs -gt 0 ]; do
  args="\"\$$nargs\" $args"
  nargs=`expr $nargs - 1`
done
eval exec my_command $args

This handsome code generates, given three arguments, the string "$1" "$2" "$3", and then evals it to get the three argument values, which apparently can't cause quoting complications that are dependent on the actual argument values. With five arguments, the string "$1" "$2" "$3" "$4" "$5" is generated, and so on. (You'd use more code if you wanted to mangle some of the arguments, which is the sole reason to do this sort of thing as opposed to using "$@".)

If you're good at Unix and you know a less "\"\$$ugly\" $way" to do this, and/or a more correct one, do tell.

(Why am I writing shell scripts in the first place, you ask? There are reasons for this too, and reasons for the reasons; there always are.)

Update 2: according to a comment, $1-$9 work in sh, but $10 and up do not; they do work in bash, which is what you actually get when you ask for /bin/sh on some systems but not others.

I really ought to try harder to stay away from shell scripting. I mean, I know I should stay away, but I keep coming back. I'm like those tribesmen around the world who can't resist the urge to drink alcohol despite having genes that make alcohol particularly addictive and bad for their health. I clearly don't have the genetic makeup that would make *sh reasonably harmless for me.

1. Barry Kelly, Mar 8, 2012

Hm. I use:

exec my_command "$@"

which quotes it properly for you. For more complicated scenarios, I use bash arrays and "${args[@]}" – it's not usually worthwhile avoiding bash to get this all working portably.

2. Daniel, Mar 8, 2012

See `man bash` in the section "Special Parameters". You want @, not *, in the individually quoted form, like Barry mentioned.

3. Nathan, Mar 8, 2012

When performing basic arithmetic, at least in bash, you can do: nargs=$((nargs-1)) which should be faster than expr.

4. Yossi Kreinin, Mar 8, 2012

regarding the (()) syntax: yeah, but not in sh; Ubuntu's sh is a real sh and not a bash, for some reason, which breaks loads of scripts used to it being bash...

regarding "$@": it works if you don't need to fiddle with any of the arguments, which my example doesn't do (the SO example I linked to does.)

5. Yossi Kreinin, Mar 8, 2012

updated the text to mention "$@" – thanks!

6. Lance, Mar 8, 2012

I routinely do "sudo dpkg-reconfigure dash" on Ubuntu installations to revert to bash, too much script breakage with the default "dash" shell.

7. Petr Viktorin, Mar 8, 2012

And that is why shell sucks for anything over one line. It has lots of nifty shortcuts that save your time, but they tend to get in the way when you want robust scripts.
For one-liners, though, it's perfect.

Your ending comment should come a little earlier – the general solution for this should be to use a language where the arguments are in a proper $container of strings. Only when that's not possible, invoke black quote magic.

8. Anton Kovalenko, Mar 8, 2012

I'd recommend relying on splicing and unsplicing (the latter may be emulated, see below) as black boxes instead of tampering with single or double quotes.

This one is pure sh (I hope it won't be messed up when posted):


store() { r="$@"; }

for arg in "$@" ; do
echo "Fiddling with $arg"
store $r "$arg"
done

echo "Args after fiddling: $r"

There once was a bug in an ancient sh, making "$@" expand incorrectly into "" (one empty argument). That's why you can meet ${1+"$@"} in some scripts, used instead of "$@", but I don't think you can stumble upon a shell where it's still necessary.

9. alenvers, Mar 9, 2012

One liner

args=`seq -s' ' -f '"$%1.0f"' 1 $#`
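In case the one-liner is opaque: seq prints the numbers 1 through $# using the printf-style -f format, joined by the -s separator, so it builds the same "$1" "$2" ... string as the loop in the post (GNU seq assumed). With three arguments:

```shell
args=`seq -s' ' -f '"$%1.0f"' 1 3`   # as if $# were 3
echo "$args"                         # prints: "$1" "$2" "$3"
```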

10. saurabh, Mar 9, 2012

Your workaround breaks down anyway for non-bash sh if you have more than 9 arguments.

11. Leandro Lucarella, Mar 9, 2012

About $(()), yes, you can do it in sh too, but you have to use the regular parameter expansion syntax inside it: $(($nargs-1)). And it's POSIX, in case you wonder...

12. Decklin Foster, Mar 9, 2012

In some simpler cases you can use shift and set to modify the positional parameters. For example, to munge only the first one:

first="$1"; shift
upcased="$(echo "$first" | tr a-z A-Z)"
set -- "$upcased" "$@"
exec my_command "$@"

Unfortunately I don't think (off the top of my head) there's a way to generalize this to unknown numbers of parameters.

13. Yossi Kreinin, Mar 10, 2012

OK, so you've all convinced me that my knowledge of the shell is insufficient for most practical purposes, and that I'd probably rather keep it that way than delude myself into believing that I'd ever improve... I'll try harder to use a language giving me an argument array next time.

And, as to there being no $i beyond $9 – thanks, I'll mention that...

14. Kartik Agaram, Mar 11, 2012

I'm surprised nobody's mentioned this, so I'll be that guy: use zsh.

#!/usr/bin/env zsh

15. AllanLane5, Mar 13, 2012

Not to be snarky, but isn't this why Perl was invented? Portable, robust, and much more consistent than sh.

16. robohack, Mar 14, 2012

Here's a sample script I keep in my example sources for handy use.

It shows how to squirrel away "$@" for safe keeping and re-use.

It could work with V7 sh, I think, by using expr instead of shell arithmetic to increment the counter and of course by doing it inline instead of defining a function (and also of course losing the "local" declaration). One could probably do without expr too and just use $# and shift to decrement it, but that would lead to the confusing result that the list_* variables would be numbered backwards.

A bit of trivia: with V7 sh, and probably most of its earlier derivatives, one would have to use ${1+"$@"} instead of just "$@" in order to avoid passing an empty parameter when none were originally supplied. See:

(AllanLane5: that was a joke right? You don't seriously think Perl is more consistent than sh, do you? LOL! Robust? ROTFL! maybe a bit more portable, but not since POSIX in, what, )

#! /bin/sh
# save_args -- keep a list of quoted strings for later use via "$@"
#
# Note: creates a variable for each value, and uses these in another
# variable which can then be expanded with "eval" to re-create the
# original parameters, preserving words containing whitespace.
#
# See also for shell_quote(), which can do the same, but produces its
# result on its stdout, and which does this simply using the normal
# single-quote characters.  HOWEVER, shell_quote() requires printf(1)
# and sed(1), while this variant does it all internally to the shell.

save_args() {
	local a c n

	c=0; n=$1; shift
	for a in "$@"; do
		eval $n="\"\$$n \\\"\\\$${n}_$c\\\"\""
		eval ${n}_$c=\"\$a\"
		c=`expr $c + 1`
	done
	# XXX Need to write free_args() to unset all the ${n}_$c variables!!!
}

# use:
#	save_args list -foo bar "arg with spaces" "don't panic" '"Help!"'
# or:
#	save_args list *.pdf		# filenames with whitespace
# later:
#	eval "gv ${list}"
# or, being a bit silly:
#	. ./
#	eval set -- ${list}
#	for str in "$@"; do
#		q="$(shell_quote "$str")"
#		list2="${list2}${list2:+ }${q}"
#	done

17. Aristotle Pagaltzis, Mar 23, 2012

Sure /bin/sh is a real sh – but does that also mean /bin/bash is absent? Because if you are worried only about Ubuntu, then the answer is No. And if the answer is No then you can just put /bin/bash in your shebang. (Just because Ubuntu makes you actually say that you want bash and not just sh when you do want bash and not just sh does not mean you cannot use bash at all.)

Then you can copy the args to an array variable using

local ARGS=( "$@" )

whose elements you can then molest to your heart’s desire. Once done, you run your program with

exec foo "${ARGS[@]}"

Very simple.

18. Aristotle Pagaltzis, Mar 23, 2012

Kartik Agaram:

I believe the suggestion to use zsh falls under Yossi’s “use another language” clause. :-)

19. Yossi Kreinin, Mar 24, 2012

You know, I really should stay away from the shell. My mental faculties responsible for things like ("$@") and "${ARGS[@]}" are underdeveloped, so I'm just inflicting pain upon myself.

20. Rakesh Sharma, May 20, 2013

I'd rewrite the last line as:

exec my_command ${1+"$@"};

21. nfomon, Dec 20, 2013

SUSv3: "The parameter name or symbol can be enclosed in braces, which are optional except for positional parameters with more than one digit or when parameter is followed by a character that could be interpreted as part of the name."

That is, $10 is interpreted as $1 with a 0 afterwards. For positional parameters after $9, you should refer to them as ${10} etc.
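A two-line illustration (bash agrees with POSIX on both forms here):

```shell
set -- a b c d e f g h i j
echo "$10"     # expands $1, then appends a literal 0: prints "a0"
echo "${10}"   # the braced form reaches the tenth parameter: prints "j"
```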

Some shells (e.g. dash) fail this conformance and actually allow $10 to mean ${10}.

I'm working on a new non-POSIX command shell that cannot have this category of stupidity. Check it out: though note that this is EXTREMELY early, i.e. I haven't even really launched yet, and any feedback is appreciated.....

22. Bilal, Oct 28, 2014

You saved my day!

23. ahervert, May 7, 2015

nfomon that is the solution

24. Eduardo Bustamante, May 8, 2015

This is safer and more straightforward than using eval:

dualbus@yaqui ~ % ./ foo bar baz
foo barbar bazbaz
dualbus@yaqui ~ % cat

export ENV_VAR

for arg do
case $arg in
*[abc]*) arg=$arg$arg;;

set "$@" "$arg"

shift "$nargs"

echo "$@"

# In case this doesn't show correctly here:

25. dzan, Jan 10, 2016

Previous samples with 'set' work fine, but make sure to use

set -- "$@" "$arg"

Otherwise, if the first argument happens to be a valid option for set, e.g. -x, set will consume it.

26. Yossi Kreinin, Jan 10, 2016

Thanks! Yeah, I've seen -- passed to every other command in git's shell source code... "Do one job and do it well..."

Post a comment