
I need 3 commands to be run and their (single-line) outputs assigned to 3 different variables, which I then use to write to a file. I want to wait until the variable assignment is complete for all 3 before I echo the variables to the file. I am running these in a loop within a bash script.

This is what I have tried -

var1=$(longRunningCommand1) &
var2=$(longRunningCommand2) &
var3=$(longRunningCommand3) &

wait %1 %2 %3
echo "$var1,$var2,$var3">>$logFile

This gives no values at all, for the variables. I get -

,,
,,
,,
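A quick way to see why the first form prints nothing (a minimal demo, with echo standing in for longRunningCommand1): the whole assignment `var=$(cmd) &` runs in a background subshell, so the variable is set only in that subshell and the parent shell never sees it.

```shell
#!/bin/bash
# 'var1=$(...) &' backgrounds the entire assignment in a subshell;
# the assignment happens there, not in the parent shell.
var1=$(echo "hello") &

wait            # the background job does finish...
echo "[$var1]"  # ...but this prints [] - var1 is empty in the parent
```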

However, if I try this -

var1=$(longRunningCommand1 &)
var2=$(longRunningCommand2 &)
var3=$(longRunningCommand3 &)

wait %1 %2 %3
echo "$var1,$var2,$var3">>$logFile

I get the desired output,

o/p of longRunningCommand1, o/p of longRunningCommand2, o/p of longRunningCommand3
o/p of longRunningCommand1, o/p of longRunningCommand2, o/p of longRunningCommand3
o/p of longRunningCommand1, o/p of longRunningCommand2, o/p of longRunningCommand3

but the nohup.out for this shell script indicates that there was no background job to wait for -

netmon.sh: line 35: wait: %1: no such job
netmon.sh: line 35: wait: %2: no such job
netmon.sh: line 35: wait: %3: no such job

I would not have bothered much about this, but I definitely need to make sure that my script waits for all 3 variables to be assigned before attempting the write, whereas nohup.out tells me otherwise! I want to know whether the 2nd approach is still correct in a situation where any of those 3 commands runs for more than a few seconds. I have not yet been able to get a really long-running command, or resource contention on the box, to actually resolve this doubt of mine.
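A timing check may settle the doubt (a small sketch, with sleep/echo standing in for a long-running command): in `var=$(cmd &)` the command substitution keeps reading its pipe until the inner command closes its stdout, so the assignment still blocks until the command finishes. That is why the values appear, and also why `wait %1` finds no job: nothing is actually left running in the background.

```shell
#!/bin/bash
# The inner '&' backgrounds the command inside the substitution, but the
# substitution itself cannot return until the pipe closes, i.e. until
# the backgrounded command exits. The assignment is effectively serial.
start=$SECONDS
var1=$( (sleep 1; echo "done") & )
elapsed=$(( SECONDS - start ))

echo "$var1 after ${elapsed}s"   # the assignment waited ~1 second
```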

Thank you very much for any helpful thoughts.

-MT

  • Why not ( (longRunningCommand1) >> $logfile; (longRunningCommand2) >> $logfile; (longRunningCommand3) >> $logfile) & then, if you want to wait until the writing is done, wait $!. (but there is no need to wait at this point, as each process will complete the write) Commented Feb 14, 2018 at 23:54
  • There is no way to assign variables in the background. var=$(cmd &) does not work the way you might think, it waits for the command to finish before moving on to the next. You can have your commands write to files, wait for them to finish, and then read back data from the files in the foreground. Commented Feb 15, 2018 at 0:55
  • Yes -- it will, notice the ';' between the commands, from man bash (and I don't know of any variations here between the shells) "Commands separated by a ; are executed sequentially; the shell waits for each command to terminate in turn." Commented Feb 15, 2018 at 3:45
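The temp-file approach suggested in the comments can be sketched as follows (a hypothetical example; `sleep 1; echo ...` stands in for the real longRunningCommand1..3): each command runs in parallel, writes its single-line output to its own temp file, and the parent shell reads the values back only after all three have finished.

```shell
#!/bin/bash
# Run all three commands in parallel, each writing to its own temp file.
tmpdir=$(mktemp -d)

( sleep 1; echo "out1" ) > "$tmpdir/1" &
( sleep 1; echo "out2" ) > "$tmpdir/2" &
( sleep 1; echo "out3" ) > "$tmpdir/3" &

wait   # with no arguments, wait blocks until every background job exits

# Read the results back in the foreground, where assignments stick.
var1=$(<"$tmpdir/1")
var2=$(<"$tmpdir/2")
var3=$(<"$tmpdir/3")
rm -rf "$tmpdir"

echo "$var1,$var2,$var3"   # out1,out2,out3
```

Because the three subshells start back to back and all sleep concurrently, the whole block takes roughly 1 second rather than 3.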

1 Answer


Your goal of writing the output of echo "$var1,$var2,$var3">>$logFile while backgrounding the actual longRunningCommand1, ..2, ..3 processes can be accomplished using a list and redirection. As @that_other_guy notes, you cannot assign the result of a command substitution to a variable in the background in the first place. However, in a shell that provides process substitution, such as bash, you can write the output of a process to a file in the background, and separating your commands and redirections with ';' will ensure the sequential write of command1, ..2, ..3 to the log file, e.g.:

Commands that are separated by a <semicolon> ( ';' ) 
shall be executed sequentially.

POSIX Specification - lists

Putting those pieces together, you would sequentially write the results of your commands to $logfile with something similar to the following,

( (longRunningCommand1) >> $logfile; (longRunningCommand2) >> $logfile; \
  (longRunningCommand3) >> $logfile) &

(note: the ';' between commands writing to $logfile)

While not required, if you wanted to wait until all commands had been written to $logfile within your script (and your shell supports $! as the PID of the last backgrounded process), you could simply wait $!, though that is not needed to ensure the write to the file completes.
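A runnable version of this pattern (with echo standing in for the long-running commands): the ';'-separated list runs sequentially inside one backgrounded subshell, and wait $! blocks until the whole list, and therefore every write to the log, has finished.

```shell
#!/bin/bash
# The ';'-separated commands run in order inside the backgrounded
# subshell, so the log lines land sequentially even though the list
# as a whole runs in the background.
logfile=$(mktemp)

( echo "first" >> "$logfile"; echo "second" >> "$logfile"; \
  echo "third" >> "$logfile" ) &

wait $!            # $! holds the PID of the backgrounded list
cat "$logfile"     # first, second, third - each on its own line, in order
```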


4 Comments

Thank you David. I will read up on process substitution and use it to try to solve this problem. I may take the easy way out of running the 3 commands as asynchronous lists, writing to their own temp files, and then reading those to write sequentially to the final output file using ;.
That works too, but since POSIX defines the sequential execution of commands separated by ';' you can really just do ( longRunningCommand1 >> $logfile; ... ) & and you are guaranteed the behavior you want without relying on process substitution. That was just included as a counterpart to your command substitution.
Thank you. My emphasis on background execution is because I want to run all the 3 longRunningCommand in parallel. They are each actually a piped combination of several commands and all 3 sets need to start at the same instant. And then the results need to be written sequentially to the $logFile. P.S. - Deleted an earlier comment of mine where I mistook ; with parallel execution. To avoid confusion for other readers.
Then I like your temp file idea :)
