TransWikia.com

How to assign environment variables in parallel in bash

Unix & Linux Asked by kag0 on November 30, 2020

I’m trying to set several environment variables with the results of command substitutions. I want to run the commands in parallel with & and wait. What I’ve got currently looks something like this:

export foo=`somecommand bar` &
export fizz=`somecommand baz` &
export rick=`somecommand morty` &
wait

But when using &, each command runs in a background subshell, so the assignment happens in the child process and never propagates back to the parent shell. After the wait, all those variables are unassigned.
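The problem can be reproduced minimally:

```shell
# The trailing & forks a subshell; the assignment happens there and
# is discarded when that subshell exits.
foo=$(echo hello) &
wait
echo "foo='${foo}'"   # foo is still empty in the parent shell
```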

How can I assign these variables in parallel?

UPDATE: Here’s what I ended up using, based on the accepted answer:

declare -a data
declare -a output
declare -a processes

var_names=(
    foo
    fizz
    rick
)

for name in "${var_names[@]}"
do
    processes+=("./get_me_a_value_for $name")
done

index=0
for process in "${processes[@]}"; do
    output+=("$(mktemp)")
    ${process} > "${output[$index]}" &
    index=$((index+1))
done
wait

index=0
for out in "${output[@]}"; do
    val="$(<"${out}")"
    rm -f "${out}"

    export "${var_names[index]}=$val"

    index=$((index+1))
done

unset data
unset output
unset processes

2 Answers

After some ruminations, I came up with an ugly workaround:

#!/bin/bash
proc1=$(mktemp)
proc2=$(mktemp)
proc3=$(mktemp)

/path/to/longprocess1 > "$proc1" &
pid1=$!
/path/to/longprocess2 > "$proc2" &
pid2=$!
/path/to/longprocess3 > "$proc3" &
pid3=$!

wait "$pid1" "$pid2" "$pid3"
export var1="$(<"$proc1")"
export var2="$(<"$proc2")"
export var3="$(<"$proc3")"
rm -f "$proc1" "$proc2" "$proc3"
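A variant that avoids managing temp files by hand is to start each command inside a process substitution, which begins running in the background immediately, and then read the results; the `sleep`/`echo` pairs below are stand-ins for the real long-running commands:

```shell
#!/bin/bash
# Each <(...) starts concurrently; the reads below just block until
# the corresponding command has finished writing.
exec 3< <(sleep 1; echo bar-result) \
     4< <(sleep 1; echo baz-result) \
     5< <(sleep 1; echo morty-result)

foo=$(cat <&3)
fizz=$(cat <&4)
rick=$(cat <&5)
exec 3<&- 4<&- 5<&-   # close the descriptors

export foo fizz rick
```

Since the three substitutions run concurrently, the whole block takes roughly as long as the slowest command rather than the sum of all three. Note that process substitution is a bash feature, not POSIX sh.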

As requested in a comment, here is how to make this more extensible for an arbitrarily large list:

#!/bin/bash
declare -a pids
declare -a data
declare -a output
declare -a processes

# Generate the list of processes for demonstrative purposes
processes+=("/path/to/longprocess1")
processes+=("/path/to/longprocess2")
processes+=("/path/to/longprocess3")

index=0
for process in "${processes[@]}"; do
    output+=("$(mktemp)")
    $process > "${output[$index]}" &
    pids+=("$!")
    index=$((index+1))
done
wait "${pids[@]}"
index=0
for process in "${processes[@]}"; do
    data+=("$(<"${output[index]}")")
    rm -f "${output[index]}"
    index=$((index+1))
done
export data  # note: bash cannot export arrays to child processes; data is only usable in the current shell

The resultant output will be in the data array.
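To turn the collected outputs back into named, exported variables (as the question's update does), the data array can be paired with a list of names; the `var_names` list and the stand-in values below are illustrative:

```shell
#!/bin/bash
var_names=(foo fizz rick)
data=(bar-result baz-result morty-result)  # stand-in for the collected outputs

index=0
for name in "${var_names[@]}"; do
    # printf -v assigns to the variable whose name is in $name
    printf -v "$name" '%s' "${data[$index]}"
    export "$name"
    index=$((index+1))
done

echo "$foo $fizz $rick"
```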

Correct answer by DopeGhoti on November 30, 2020

If you have more jobs than can safely be run in parallel at the same time, you can use parset from GNU Parallel:

parset foo,fizz,rick somecommand ::: bar baz morty
export foo
export fizz
export rick

See details: https://www.gnu.org/software/parallel/parset.html

Answered by Ole Tange on November 30, 2020


© 2024 TransWikia.com. All rights reserved.