Multithreaded programming using Bash scripts
I’m running a bash script like this:
for i in {0..3250000..50000}
do
wget "http://xxx/select?q=*:*&row_size=50000&start=$i" -O "$i.csv"
done
Right now each request has to finish and write its file before the loop continues. I want to do this asynchronously: send each request and move on with the loop without waiting for the response, but still have each response written to the correct file when it arrives.
What should I do?
Solution
You can use xargs:
printf '%s\0' {0..3250000..50000} |
xargs -0 -I {} -P 20 \
wget 'http://xxx/select?q=*:*&row_size=50000&start={}' -O {}.csv
-0 selects the NUL character as the input delimiter, matching the printf '%s\0' format; -I {} substitutes each input value for every occurrence of {} in the command (with GNU xargs this also implies one value per invocation, so a separate -n 1 is unnecessary); and -P 20 runs up to 20 wget processes in parallel.
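You can try the same pattern locally by substituting a harmless command for wget; this sketch (touch and a smaller range stand in for the real download) just shows that each NUL-delimited value becomes one parallel invocation:

```shell
#!/usr/bin/env bash
# Stand-in for the wget pipeline: each value from the brace expansion
# is fed to xargs NUL-delimited and produces one file, up to 4 at a time.
printf '%s\0' {0..300..100} |
  xargs -0 -I {} -P 4 touch "chunk_{}.txt"
```

After it runs, chunk_0.txt, chunk_100.txt, chunk_200.txt, and chunk_300.txt exist, one per input value.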
Alternatively, you can append & to the wget command inside the loop to run each request in the background, then call the wait builtin after the loop to block until every background download has completed.
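A minimal sketch of the & plus wait approach, with the downloads launched in batches so no more than a fixed number run at once. The fetch function is a hypothetical stand-in for the wget call from the question, and the small range keeps the example quick to run:

```shell
#!/usr/bin/env bash
# fetch stands in for:
#   wget "http://xxx/select?q=*:*&row_size=50000&start=$1" -O "$1.csv"
fetch() {
    sleep 0.1
    echo ok > "out_$1.txt"
}

max_jobs=4
running=0
for i in $(seq 0 7); do
    fetch "$i" &           # launch in the background, don't wait
    running=$((running + 1))
    if [ "$running" -ge "$max_jobs" ]; then
        wait               # batch is full: wait for all of it to finish
        running=0
    fi
done
wait                       # wait for any jobs left in the final batch
```

Note this throttles in whole batches rather than keeping a sliding window of 20 jobs the way xargs -P does; bash 4.3+ users could replace the inner wait with wait -n to refill the pool as each job exits.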