How do I set a variable to the output of a command in Bash?

I have a pretty simple script that is something like the following:

#!/bin/bash

VAR1="$1"
MOREF='sudo run command against $VAR1 | grep name | cut -c7-'

echo $MOREF

When I run this script from the command line and pass it the arguments, I am not getting any output. However, when I run the commands contained within the $MOREF variable, I am able to get output.

How can one take the results of a command that needs to be run within a script, save it to a variable, and then output that variable on the screen?

Urge answered 10/1, 2011 at 20:58 Comment(7)
A related question #25117021Vermiculate
As an aside, all-caps variables are defined by POSIX for variable names with meaning to the operating system or shell itself, whereas names with at least one lowercase character are reserved for application use. Thus, consider using lowercase names for your own shell variables to avoid unintended conflicts (keeping in mind that setting a shell variable will overwrite any like-named environment variable).Berners
As an aside, capturing output into a variable just so you can then echo the variable is a useless use of echo, and a useless use of variables.Soelch
As a further aside, storing output in variables is often unnecessary. For small, short strings you will need to reference multiple times in your program, this is completely fine, and exactly the way to go; but for processing any nontrivial amounts of data, you want to reshape your process into a pipeline, or use a temporary file.Soelch
Variation: "I know how to use variable=$(command) but I think "$string" is a valid command"; #37195295Soelch
@tripleee: ... I would suspect that he is NOT trying to store the output in a variable simply so he can echo it back out. I'm basing this off the fact that maybe not everyone always wants to share their code verbatim, and also/or want to keep it more focused on the core issue so they can get help AND help more people in the future. ... ALSO, there is the first line where they straight up said that they "had a pretty simple script that is SOMETHING like the following:" So there's that ;-) I guess at the same time your links sparked my curiosity so they ARE still appreciated.Submiss
The title ought to be more specific, as this question covers the case of (variable) input to the external command (held in a (Bash) variable), not just a constant string for the external command.Stewart

In addition to backticks `command`, command substitution can be done with $(command) or "$(command)", which I find easier to read, and allows for nesting.

OUTPUT=$(ls -1)
echo "${OUTPUT}"

MULTILINE=$(ls \
   -1)
echo "${MULTILINE}"

Quoting (") does matter to preserve multi-line variable values; it is optional on the right-hand side of an assignment, as word splitting is not performed, so OUTPUT=$(ls -1) would work fine.

Erin answered 10/1, 2011 at 21:4 Comment(21)
Can we provide some separator for multi-line output?Sophey
White space (or lack of whitespace) mattersPassifloraceous
so echo "${OUTPUT}" preserves line breaks, whereas echo $OUTPUT doesn't?Hippogriff
@timhc22, the curly braces are irrelevant; it's only the quotes that are important re: whether expansion results are string-split and glob-expanded before being passed to the echo command.Berners
Ah thanks! So is there any benefit to the curly braces?Hippogriff
Curly braces can be used when the variable is immediately followed by more characters which could be interpreted as part of the variable name. e.g. ${OUTPUT}foo. They are also required when performing inline string operations on the variable, such as ${OUTPUT/foo/bar}Helical
ok, what do you do if you need to quote something inside of the quoted command expansion? I have tried using 's and they don't workNarvik
@Sophey Multi line output is just output containing newline characters; They can be converted to something else, but that would be handled independently. Try OUTPUT="$(seq 3)" versus OUTPUT="$(seq 3 | tr '\n' :)"Harder
Also: no spaces are permitted between OUTPUT, =, and the quote. This tripped me up.Watchful
How would it be in a case with argument passing, like the call OUTPUT=$(./myscript "$@") (this actually does not work at all)? How to keep the structure of the arguments when the original script is run, for example, with ./script.sh "hello world" rest?Bourdon
is it possible to explain why the command works the way you suggested?Reciprocal
@nightcod3r, eh? output=$(./myscript "$@") works perfectly well.Berners
@CharlesDuffy, @nightcod3r, the difference between your two code examples is that Charles has added a space after myscript.Hypogastrium
Taking it down to the bare minimum: a=$(ls -l) && echo "$a" The quotes around the $a is the key to preserving newlines.Hemistich
I'm trying something similar with jps: out=$(jps) && echo "$out" works perfectly well in a terminal, but the line out=$(jps) && echo "$out" >> /path/to/logfile in a Bash script prints an empty line into the logfile. What might be wrong? When I replace "jps" with "date", it also works in the Bash script!Earthnut
So amazed that echo (as well as printf) can act as if it were eval, which means it can be as evil as eval - the command in the text actually gets executed, prone to code-injection attacks. text='ls -l' ; printf '%s\n' "$($text)"Nananne
But echo or printf is less capable than eval in that, if the arguments contain inner quotes around spaces, the spaces get split anyway, which is not intended. f() { printf '[%s] ' "$@"; printf '\n'; } ; c='f a " with space " c' ; printf '%s\n' "$($c)" ; eval "$c"Nananne
Can you please explain why "${OUTPUT}" over $OUTPUT?Hertzog
@Hertzog It's discussed in the previous comments.Erin
Apparently on some shells a semicolon is required between assigning the variable from the subshell's stdout and the next command: OUTPUT="$(ls -1)"; echo "${OUTPUT}"Shimmy
Why do you say "apparently"? What makes you think it is?Erin
$(sudo run command)

If you're going to use an apostrophe, you need `, not '. This character is called a "backtick" (or "grave accent"):

#!/bin/bash

VAR1="$1"
VAR2="$2"

MOREF=`sudo run command against "$VAR1" | grep name | cut -c7-`

echo "$MOREF"
Untuck answered 10/1, 2011 at 21:0 Comment(5)
The backtick syntax is obsolescent, and you really need to put double quotes around the variable interpolation in the echo.Soelch
I would add that you have to be careful with the spaces around '=' in the assignment above. You shouldn't have any spaces there, otherwise you'll get an incorrect assignmentChian
tripleeee's comment is correct. In cygwin (May 2016), `` doesn't work while $() works. Couldn't fix until I saw this page.Despicable
Elaboration such as an example on Update (2018) would be appreciated.Zeigler
The original Bourne shell supported backticks, but not $(...) notation. So you need to use backticks if you require compatibility with older Unix systems.Slumgullion


Some Bash tricks I use to set variables from commands

Sorry, this is a long answer. But as Bash is a shell, whose main goal is to run other commands and react to their result codes and/or output (commands are often piped filters, etc.), storing command output in variables is something basic and fundamental.

Therefore, depending on

  • compatibility (POSIX)
  • kind of output (filter(s))
  • number of variables to set (split or interpret)
  • execution time (monitoring)
  • error trapping
  • repeatability of the request (see long-running background process, further on)
  • interactivity (considering user input while reading from another input file descriptor)
  • parallelism (considering many inputs simultaneously, even interactively)
  • handling of special characters
    • handling multiline fields in CSV files
  • having to compute stats, rates, sums, or the like, while reading data
  • having to track/retrieve handles, then search for them further on in the same stream (SMTP mail server logs)
  • did I miss something?

You could look at the showCert function, a complex sample parsing openssl output and building one associative array (for parsing the SUBJECT field) and one standard array (for parsing the alternative names), storing dates as UNIX epoch (using a single fork to the date command for converting two dates together), in How to determine SSL cert expiration date from a PEM certificate?

First simple, old (obsolete), and compatible way

myPi=`echo '4*a(1)' | bc -l`
echo $myPi 
3.14159265358979323844

Compatible, second way

As nesting could become heavy, parentheses were implemented for this

myPi=$(bc -l <<<'4*a(1)')

Using backticks in scripts is to be avoided today.

Nested sample:

SysStarted=$(date -d "$(ps ho lstart 1)" +%s)
echo $SysStarted 
1480656334
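
For comparison, a sketch of the same nesting written with backticks, where every inner pair must be escaped (one more reason to prefer $(...)):

SysStarted=`date -d "\`ps ho lstart 1\`" +%s`    # backticks: inner pair needs escaping
SysStarted=$(date -d "$(ps ho lstart 1)" +%s)    # $(...): nests without escaping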

Bash features

Reading more than one variable (with Bashisms)

df -k /
Filesystem     1K-blocks   Used Available Use% Mounted on
/dev/dm-0         999320 529020    401488  57% /

If I just want the used value:

array=($(df -k /))

you could see an array variable:

declare -p array
declare -a array='([0]="Filesystem" [1]="1K-blocks" [2]="Used" [3]="Available" [4]="Use%" [5]="Mounted" [6]="on" [7]="/dev/dm-0" [8]="999320" [9]="529020" [10]="401488" [11]="57%" [12]="/")'

Then:

echo ${array[9]}
529020

But I often use this:

{ read -r _;read -r filesystem size using avail prct mountpoint ; } < <(df -k /)
echo $using
529020

(The first read -r _ just drops the header line.) Here, in only one command, you will populate 6 different variables (shown in alphabetical order):

declare -p avail filesystem mountpoint prct size using
declare -- avail="401488"
declare -- filesystem="/dev/dm-0"
declare -- mountpoint="/"
declare -- prct="57%"
declare -- size="999320"
declare -- using="529020"

Or

{ read -a head;varnames=(${head[@]//[K1% -]});
  read ${varnames[@],,} ; } < <(LANG=C df -k /)

Then:

declare -p varnames ${varnames[@],,} 
declare -a varnames=([0]="Filesystem" [1]="blocks" [2]="Used" [3]="Available" [4]="Use" [5]="Mounted" [6]="on")
declare -- filesystem="/dev/dm-0"
declare -- blocks="999320"
declare -- used="529020"
declare -- available="401488"
declare -- use="57%"
declare -- mounted="/"
declare -- on=""

Or even:

{ read _ ; read filesystem dsk[{6,2,9}] prct mountpoint ; } < <(df -k /)
declare -p mountpoint dsk
declare -- mountpoint="/"
declare -a dsk=([2]="529020" [6]="999320" [9]="401488")

(Note that Used and Blocks are switched there: read ... dsk[6] dsk[2] dsk[9] ...)

... will work with associative arrays too: read _ disk[total] disk[used] ...
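
A minimal sketch of that associative variant (the key names here are arbitrary):

declare -A disk
{ read -r _ ; read -r _ disk[total] disk[used] disk[avail] _ ; } < <(df -k /)
declare -p disk
declare -A disk=([avail]="401488" [used]="529020" [total]="999320" )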

A more complex sample, parsing free:

getFree() {
    local -a hline sline;
    { 
        read -ra hline;
        sline=("${hline[@]::3}");
        sline=("${sline[@]^}");
        hline=("${hline[@]/%/]}") sline=("${sline[@]/%/]}");
        read -r _ "${hline[@]/#/memInfos[}";
        read -r _ "${sline[@]/#/memInfos[swap}"
    } < <(LANG=C free --wide --kilo)
}
declare -A memInfos='()'
getFree

Then

declare -p memInfos 
declare -A memInfos=([swapTotal]="104853" [cache]="246161" [free]="32518" [shared]="925" [available]="238936" [used]="88186" [total]="386928" [swapFree]="78639" [buffers]="20062" [swapUsed]="26214" )

So

for var in total used free shared buffers cache available; do
    case $var in 
        tot*|use*|fre*) sval=${memInfos[swap${var^}]} ;;
        *) sval='' ;;
    esac
    printf ' - %-12s %12s %12s\n' "$var" "${memInfos[$var]}" "$sval"
done

could produce something like:

 - total              386928       104853
 - used                88186        26214
 - free                32518        78639
 - shared                925             
 - buffers             20062             
 - cache              246161             
 - available          238936             

Other related samples parsing xrandr output: at the end of Firefox tab by bash in a size of x% of display size? or, at AskUbuntu.com, Parsing xrandr output.

Dedicated fd using unnamed fifo:

There is an elegant way! In this sample, I will read the /etc/passwd file:

users=()
while IFS=: read -u $list user pass uid gid name home bin ;do
    ((uid>=500)) &&
        printf -v users[uid] "%11d %7d %-20s %s\n" $uid $gid $user $home
done {list}</etc/passwd

Using this way (... read -u $list; ... {list}<inputfile) leaves STDIN free for other purposes, like user interaction (see the sketch after the output samples below).

Then

echo -n "${users[@]}"
       1000    1000 user         /home/user
...
      65534   65534 nobody       /nonexistent

and

echo ${!users[@]}
1000 ... 65534

echo -n "${users[1000]}"
      1000    1000 user       /home/user
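
As a sketch of the interactivity this permits, here the loop reads its file on the dedicated fd while standard input stays available for prompting the user (the filename is hypothetical):

kept=()
while read -u $list -r line ;do
    read -p "Keep \"$line\"? [y/N] " answer   # stdin is still free for the user
    [ "$answer" = "y" ] && kept+=("$line")
done {list}<input.txt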

This could be used with static files, or even with /dev/tcp/xx.xx.xx.xx/yyy (x being an IP address or hostname, y a port number), or with the output of a command:

{
    read -u $list -a head          # read header in array `head`
    varnames=(${head[@]//[K1% -]}) # drop illegal chars for variable names
    while read -u $list ${varnames[@],,} ;do
        ((pct=available*100/(available+used),pct<10)) &&
            printf "WARN: FS: %-20s on %-14s %3d <10 (Total: %11u, Use: %7s)\n" \
                "${filesystem#*/mapper/}" "$mounted" $pct $blocks "$use"
     done
 } {list}< <(LANG=C df -k)

And of course with inline documents (here-documents):

while IFS=\; read -u $list -a myvar ;do
    echo ${myvar[2]}
done {list}<<"eof"
foo;bar;baz
alice;bob;charlie
$cherry;$strawberry;$memberberries
eof

Handling of special characters

A common problem is correctly handling filenames (for example) with special characters, like old Latin encoding mixed with UTF-8, or worse (filenames containing a newline or a tabulation).

For this, the find command could be run with -print0, separating the filenames found by a null byte, 0x00.

To correctly handle this output with Bash, you could:

while IFS='' read -r -d '' filename; do
    size=$(stat -c %s "$filename")
    printf ' %13d %q\n' $size "$filename"
done < <(
  find . \( -type f -o -type d \) -print0
) 

Handling of special characters by using mapfile

For a small number of entries, you could use mapfile (or its synonym readarray) in order to create an array before processing its elements:

mapfile -t -d '' entries < <( find . \( -type f -o -type d \) -print0)
for entry in "${entries[@]}";do
    size=$(stat -c %s "$entry")
    printf ' %13d %q\n' $size "$entry"
done

This could be used for splitting special procfs entries which are null separated, like the environ file:

mapfile -d '' env_$$ </proc/$$/environ
declare -p ${!env_*}

Practical sample parsing CSV files:

As this answer is long enough, for this paragraph I will just refer you to my answer to How to parse a CSV file in Bash?, where I read a file by using an unnamed fifo, with syntax like:

exec {FD}<"$file"   # open unnamed fifo for read
IFS=',' read -ru $FD -a headline
while IFS=',' read -ru $FD -a row ;do ...

... But as the CSV format may hold multiline fields, things are a little more complex! For using the loadable CSV module, please have a look at Parsing CSV files under bash, using loadable module.

On my website, you may find the same script, reading CSV as inline document.

Sample function for populating some variables:

#!/bin/bash

declare free=0 total=0 used=0 mpnt='??'

getDiskStat() {
    {
        read _
        read _ total used free _ mpnt
    } < <(
        df -k ${1:-/}
    )
}

getDiskStat $1
echo "$mpnt: Tot:$total, used: $used, free: $free."

Note: the declare line is not required; it is just there for readability.

About sudo cmd | grep ... | cut ...

shell=$(cat /etc/passwd | grep $USER | cut -d : -f 7)
echo $shell
/bin/bash

Please avoid the useless use of cat! So this is just one fork less:

shell=$(grep $USER </etc/passwd | cut -d : -f 7)

Every pipe (|) implies a fork, where another process has to be run, accessing the disk, making library calls, and so on.

So using sed, for example, will limit the subprocesses to only one fork:

shell=$(sed </etc/passwd "s/^$USER:.*://p;d")
echo $shell

And with Bashisms:

But for many actions, mostly on small files, Bash could do the job itself:

while IFS=: read -r -a line ; do
    [ "$line" = "$USER" ] && shell=${line[6]}
done </etc/passwd
echo $shell
/bin/bash

or

while IFS=: read -r loginname encpass uid gid fullname home shell;do
    [ "$loginname" = "$USER" ] && break
done </etc/passwd
echo $shell $loginname ...

Going further about variable splitting...

Have a look at my answer to How do I split a string on a delimiter in Bash?

Alternative: reducing forks by using backgrounded long-running tasks

In order to prevent multiple forks like

myPi=$(bc -l <<<'4*a(1)')
myRay=12
myCirc=$(bc -l <<<" 2 * $myPi * $myRay ")

or to obtain the system start time and the current shell start time, both as UNIX epoch, I could do two nested forks:

myStarted=$(date -d "$(ps ho lstart 1)" +%s)
mySessStart=$(date -d "$(ps ho lstart $$)" +%s)

This works fine, but running many forks is heavy and slow.

And commands like date and bc can perform many operations, line by line!!

See:

bc -l <<<$'3*4\n5*6'
12
30

date -f - +%s < <(ps ho lstart 1 $$)
1516030449
1517853288

So building my two variables $myStarted and $mySessStart could be done in one operation:

{
    read -r myStarted
    read -r mySessStart
} < <(
    date -f - +%s < <(
        ps ho lstart 1 $$
    )
)

could be written on one line:

{ read -r myStarted;read -r mySessStart;}< <(date -f- +%s< <(ps ho lstart 1 $$))

Backgrounded tasks

But we could use a long-running background process to serve as many requests as we need, without having to initiate a new fork for each request.

You could have a look at how reducing forks makes Mandelbrot bash improve from more than eight hours to less than five seconds.

Under Bash, there is a built-in function: coproc:

coproc bc -l
echo 4*3 >&${COPROC[1]}
read -u $COPROC answer
echo $answer
12

echo >&${COPROC[1]} 'pi=4*a(1)'
ray=42.0
printf >&${COPROC[1]} '2*pi*%s\n' $ray
read -u $COPROC answer
echo $answer
263.89378290154263202896

printf >&${COPROC[1]} 'pi*%s^2\n' $ray
read -u $COPROC answer
echo $answer
5541.76944093239527260816

As bc is ready, running in the background, and its I/O are ready too, there is no delay, nothing to load, open, or close, before or after the operation. Only the operation itself! This becomes a lot quicker than having to fork to bc for each operation!

The little extra (little but powerful!): while bc stays running, it holds all its registers. So variables or functions could be defined at the initialisation step, as the first write to ${COPROC[1]}, just after starting the task (... or even at any time).
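
As an illustration only, a simplified sketch of such a connector (the real newConnector function below is more robust):

coproc bc -l
myBc() {    # usage: myBc '<expression>' <variable name>
    local result
    echo "$1" >&${COPROC[1]}       # send the request to the running bc
    read -r -u $COPROC result      # read the reply back
    printf -v "$2" '%s' "$result"  # store it in the named variable
}
myBc '4*a(1)' PI
declare -p PI
declare -- PI="3.14159265358979323844"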

Into a function newConnector

You may find my newConnector function on GitHub.Com or on my own site (note: on GitHub, there are two files; on my site, function and demo are bundled into one file which could be sourced for use or just run for demo).

Sample:

source shell_connector.sh

tty
/dev/pts/20

ps --tty pts/20 fw
    PID TTY      STAT   TIME COMMAND
  29019 pts/20   Ss     0:00 bash
  30745 pts/20   R+     0:00  \_ ps --tty pts/20 fw

newConnector /usr/bin/bc "-l" '3*4' 12

ps --tty pts/20 fw
    PID TTY      STAT   TIME COMMAND
  29019 pts/20   Ss     0:00 bash
  30944 pts/20   S      0:00  \_ /usr/bin/bc -l
  30952 pts/20   R+     0:00  \_ ps --tty pts/20 fw

declare -p PI
bash: declare: PI: not found

myBc '4*a(1)' PI
declare -p PI
declare -- PI="3.14159265358979323844"

The function myBc lets you use the background task with simple syntax.

Then for date:

newConnector /bin/date '-f - +%s' @0 0
myDate '2000-01-01'
  946681200
myDate "$(ps ho lstart 1)" boottime
myDate now now
read utm idl </proc/uptime
myBc "$now-$boottime" uptime
printf "%s\n" ${utm%%.*} $uptime
  42134906
  42134906

ps --tty pts/20 fw
    PID TTY      STAT   TIME COMMAND
  29019 pts/20   Ss     0:00 bash
  30944 pts/20   S      0:00  \_ /usr/bin/bc -l
  32615 pts/20   S      0:00  \_ /bin/date -f - +%s
   3162 pts/20   R+     0:00  \_ ps --tty pts/20 fw

From there, if you want to end one of background processes, you just have to close its fd:

eval "exec $DATEOUT>&-"
eval "exec $DATEIN>&-"
ps --tty pts/20 fw
    PID TTY      STAT   TIME COMMAND
   4936 pts/20   Ss     0:00 bash
   5256 pts/20   S      0:00  \_ /usr/bin/bc -l
   6358 pts/20   R+     0:00  \_ ps --tty pts/20 fw

which is not needed, because all fds are closed when the main process finishes.

Downes answered 20/12, 2016 at 7:6 Comment(10)
The nested sample above is what I was looking for. There may be a simpler way, but what I was looking for was the way to find out if a docker container already exists given its name in an environment variable. So for me: EXISTING_CONTAINER=$(docker ps -a | grep "$(echo $CONTAINER_NAME)") was the statement I was looking for.Elastomer
@capricorn1 That's a useless use of echo; you want simply grep "$CONTAINER_NAME"Soelch
See this sample using tput as background task for color rendering on terminal!!Downes
I probably miss something here: kubectl get ns | while read -r line; do echo $line | grep Term | cut -d' ' -f1; done prints out for each $line an empty line and then bash: xxxx: command not found. However I would expect that it prints out just xxxLacefield
Instead of all the "Edits" notes and strikeovers (that is what the revision history is for), it would be better to have it as if this answer was written today. If there are some lessons to be learned it could be documented in a section, e.g. "Things not to do".Stewart
@Cadoiz Yes, there were some typos... read _ drops, not only skips... and so on... Answer edited; added a link to the CSV parser sample. Thanks!Downes
echo '4*a(1)' | bc -l: why is it PI? I searched on Google, but did not get a good answer.Hypocrite
@Mark see man bc. with -l switch, a() is arctangent ( arctangent(1) = ¶ / 4 )Downes
thanks. @F.Hauri I finally found out in here pubs.opengroup.org/onlinepubs/009695399/utilities/bc.htmlHypocrite
@F.Hauri ps ho lstart 1: so far, I know ps and lstart, but I don't know the other two symbols...Hypocrite

As they have already indicated to you, you should use `backticks`.

The alternative proposed, $(command), works as well, and it is also easier to read, but note that it is valid only with Bash or KornShell (and shells derived from those), so if your scripts have to be really portable on various Unix systems, you should prefer the old backticks notation.
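
Both notations side by side, in an otherwise identical snippet (a minimal sketch):

now=`date +%s`     # old Bourne-compatible notation
now=$(date +%s)    # modern notation; easier to read and to nest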

Mariehamn answered 11/1, 2011 at 22:14 Comment(7)
They are overtly cautious. Backticks have been deprecated by POSIX a long time ago; the more modern syntax should be available in most shells from this millennium. (There are still legacy environments coughHP-UXcough which are stuck firmly in the early nineties.)Soelch
Incorrect. $() is fully compatible with POSIX sh, as standardized over two decades ago.Berners
Note that /bin/sh on Solaris 10 still does not recognize $(…) — and AFAIK that's true on Solaris 11 too.Animalism
@JonathanLeffler It is actually no more the case with Solaris 11 where /bin/sh is ksh93.Hershel
@Soelch - response three years late :-) but I've used $() in the POSIX shell on HP-UX for the past 10+ years.Middleoftheroad
Good to hear! A number of vendor-specific Unices (HP, AIX, SunOS/Solaris) used to be notorious for having extremely quirky userspace utilities with legacy behaviors which prevented any attempt at writing portable code; if HP (and Solaris!) are now finally out of there, the world is a better place.Soelch
just wanted to add that $(`command`) will interpret the command's output as a commandDerogative

I know three ways to do it:

  1. Functions are suitable for such tasks:

    func (){
        ls -l
    }
    

    Invoke it by saying func.

  2. Another suitable solution could be eval:

    var="ls -l"
    eval $var
    
  3. The third one is using variables directly:

    var=$(ls -l)
    
        OR
    
    var=`ls -l`
    

You can get the output of the third solution in a good way:

echo "$var"

And also in a nasty way:

echo $var
Uhland answered 13/2, 2014 at 7:31 Comment(3)
The first two do not seem to answer the question as it currently stands, and the second is commonly held to be dubious.Soelch
As someone who is entirely new to bash, why is "$var" good and $var nasty?Prescience
@Prescience #10067766Soelch

Just to be different:

MOREF=$(sudo run command against $VAR1 | grep name | cut -c7-)
Shreveport answered 10/1, 2011 at 21:7 Comment(0)

When setting a variable make sure you have no spaces before and/or after the = sign. I literally spent an hour trying to figure this out, trying all kinds of solutions! This is not cool.

Correct:

WTFF=`echo "stuff"`
echo "Example: $WTFF"

This will fail with the error "stuff: not found" or similar:

WTFF= `echo "stuff"`
echo "Example: $WTFF"
Gytle answered 18/7, 2017 at 11:42 Comment(2)
The version with the space means something different: var=value somecommand runs somecommand with var in its environment having the value value. Thus, var= somecommand is exporting var in the environment of somecommand with an empty (zero-byte) value.Berners
Yes, a Bash gotcha.Stewart

If you want to do it with a multiline command or multiple commands, then you can do this:

output=$( bash <<EOF
# Multiline/multiple command/s
EOF
)

Or:

output=$(
# Multiline/multiple command/s
)

Example:

#!/bin/bash
output="$( bash <<EOF
echo first
echo second
echo third
EOF
)"
echo "$output"

Output:

first
second
third

Using a heredoc, you can simplify things pretty easily by breaking your long single-line code down into a multiline one. Another example:

output="$( ssh -p $port $user@$domain <<EOF
# Breakdown your long ssh command into multiline here.
EOF
)"
Broomrape answered 1/6, 2015 at 17:38 Comment(15)
What's with the second bash inside the command substitution? You are already creating a subshell by the command substitution itself. If you want to put multiple commands, just separate them by newline or semicolon. output=$(echo first; echo second; ...)Soelch
Then similarly 'bash -c "bash -c \"bash -c ...\""' would be "different", too; but I don't see the point of that.Soelch
@Soelch heredoc means something more than that. You can do the same with some other commands, like ssh, or sudo -s, or executing mysql commands inside, etc. (instead of bash)Broomrape
I don't feel we are communicating properly. I am challenging the usefulness over variable=$(bash -c 'echo "foo"; echo "bar"') over variable=$(echo "foo"; echo "bar") -- the here document is just a quoting mechanism and doesn't really add anything except another useless complication.Soelch
@Soelch that's right if you think only of those cases. Add other things like executing multiple sudo commands or ssh commands, and I am sure you will see the differences. For example sudo command; sudo command; isn't helpful at all.Broomrape
But then you'd wrap that in sudo bash -c '...' in which the bash is useful precisely for that reason.Soelch
@Soelch heredoc is safer than bash -c. Not to mention all the different quotes it requires. It makes things very confusing and it's pretty easy to mess things up with bash -c. And I generally try to avoid using bash -c. It's not safe.Broomrape
Then don't. You still have not explained how variable=$(bash <<<"echo 'moo'; echo 'bar'") is useful compared to variable=$(echo 'moo'; echo 'bar') and I can't help but point out that the former, too, introduces various quoting complications.Soelch
@Soelch the bash is just an example. The main thing is the use of heredoc. For example you can get the output of a long ssh command by dividing it into a multi-line command with heredoc var=$(ssh <<EOF...). It simplifies the command a lot compared to bash -c or putting all in the same line.Broomrape
Yes, but that's still apples and oranges. ssh adds something (a secure remote connection) while bash does not. The here document is not necessary for breaking the command over multiple lines. Random demo: ideone.com/VbVinBSoelch
@Soelch that's apples and oranges; run a multiline ssh command with a single login like that. Let's see how you do it. I would be happy to know a method easier than using heredoc.Broomrape
You can use a multiline string as the argument to ssh which has some benefits, such as not tying up standard input for reading commands. I continue to fail to see any positive relevance to the actual question.Soelch
See e.g. https://mcmap.net/q/11750/-what-is-the-cleanest-way-to-ssh-and-run-multiple-commands-in-bash although many of the other answers (including the accepted answer) inexplicably prefer here documents.Soelch
@Soelch yes, using a string would do too. But I like the heredoc better. It feels simpler than putting commands in a string.Broomrape
When I use heredoc with ssh, I specify the command to run, ssh -p $port $user@$domain /bin/bash <<EOF, in order to prevent the Pseudo-terminal will not be allocated because stdin is not a terminal. warningDownes

You need to use either

$(command-here)

or

`command-here`

Example

#!/bin/bash

VAR1="$1"
VAR2="$2"

MOREF="$(sudo run command against "$VAR1" | grep name | cut -c7-)"

echo "$MOREF"
Thoughtless answered 28/12, 2017 at 23:44 Comment(2)
$() is much better than backticks. See: What is the benefit of using $() instead of backticks in shell scripts?Oatcake
I didn't know you could nest but it makes perfect sense, thank you very much for the info!Thoughtless

If the command that you are trying to execute fails, its output is written to the error stream, and it is then printed to the console.

To avoid it, you must redirect the error stream:

result=$(ls -l something_that_does_not_exist 2>&1)
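
A sketch combining the capture with an explicit exit-status check, so failures can be handled separately:

if result=$(ls -l something_that_does_not_exist 2>&1); then
    echo "$result"
else
    echo "command failed: $result" >&2
fi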
Bhopal answered 24/6, 2019 at 10:35 Comment(0)

Mac/OSX nowadays comes with old Bash versions, i.e. GNU bash, version 3.2.57(1)-release (arm64-apple-darwin21). In this case, one can use:

new_variable="$(some_command)"

A concrete example:

newvar="$(echo $var | tr -d '123')"

Note the parentheses of $( ) (command substitution), as opposed to the braces of ${ } (parameter expansion).

Coition answered 3/10, 2022 at 12:11 Comment(0)

This is another way, and it is good to use with some text editors that are unable to correctly highlight intricate code:

read -r -d '' str < <(cat somefile.txt)
echo "${#str}"
echo "$str"
Trod answered 10/5, 2015 at 21:44 Comment(1)
This doesn't deal with OP's question, which is really about command substitution, not process substitution.Oatcake

You can use backticks (also known as grave accents) or $().

Like:

OUTPUT=$(x+2);
OUTPUT=`x+2`;

Both have the same effect. But OUTPUT=$(x+2) is more readable and is the more recent notation.

Swane answered 9/2, 2016 at 8:3 Comment(2)
Parenthesis was implemented in order to permit nesting.Downes
x+2 is not a valid command, most places. To the extent that this isn't misleading beginners to think this is how you do arithmetic, this duplicates existing answers.Soelch

Here are two more ways:

Please keep in mind that spacing is very important in Bash. So, if you want your command to run, use it as is, without introducing any more spaces.

  1. The following assigns harshil to L and then prints it

    L=$"harshil"
    echo "$L"
    
  2. The following assigns the output of the command tr to L2. tr operates on another variable, L1.

    L2=$(echo "$L1" | tr '[:upper:]' '[:lower:]')
    
Campanulaceous answered 22/6, 2016 at 10:9 Comment(2)
1. $"..." probably doesn't do what you think it does. 2. This is already given in Andy Lester's answer.Propylene
@Propylene is right: see bash localization won't work with multilines. But under bash, you could use echo ${L1,,} to downcase, or echo ${L1^^} to upcase.Downes

Some may find this useful: integer values in variable substitution, where the trick is using the $(( )) double parentheses:

N=3
M=3
COUNT=$N-1
ARR[0]=3
ARR[1]=2
ARR[2]=4
ARR[3]=1

while (( COUNT < ${#ARR[@]} ))
do
  ARR[$COUNT]=$((ARR[COUNT]*M))
  (( COUNT=$COUNT+$N ))
done
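
With the values above, only every Nth element (starting at index N-1) is multiplied by M; a quick check of the result, assuming the snippet ran as written:

declare -p ARR
declare -a ARR=([0]="3" [1]="2" [2]="12" [3]="1")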
Hara answered 22/11, 2015 at 11:59 Comment(4)
This does not seem to have any relevance for this question. It would be a reasonable answer if somebody were to ask how to multiply a number in an array by a constant factor, though I don't recall ever seeing anyone asking that (and then a for ((...)) loop would seem like a better match for the loop variable). Also, you should not use uppercase for your private variables.Soelch
I disagree with the "relevance" part. The question clearly reads: How to set a variable equal to the output from a command in Bash? And I added this answer as a complement because I got here looking for a solution which helped me with the code I later posted. Regarding the uppercase vars, thanks for that.Hara
This could be written ARR=(3 2 4 1);for((N=3,M=3,COUNT=N-1;COUNT < ${#ARR[@]};ARR[COUNT]*=M,COUNT+=N)){ :;} but I agree with @tripleee: I don't understand what this does, there!Downes
@F.Hauri... bash is getting more & more like perl the deeper you go into it!Diahann

If you use Bash: for the case where you have multiple commands in a pipeline and need both their results and their exit statuses, there is another great solution. It should be clear from the following example:

# This saves all the results into the RES variable
grep -E "\S" "file.txt" | sort | uniq | read -d '' RES
# 'read' exit status 1 means all input was read till EOF, we're OK with that
if (( PIPESTATUS[0] > 1 || PIPESTATUS[1] > 0 || PIPESTATUS[2] > 0 || PIPESTATUS[3] > 1 )); then
  echo "ERROR"
else
  echo "$RES"
fi

In order for this to work you need to enable shopt -s lastpipe (works just for bash AFAIK!), plus if you're in an interactive session you need to disable job control via set +m (not needed in scripts by default).

See this post in another thread for more details.

Nude answered 3/3, 2023 at 11:18 Comment(0)
