Read a curl output stream line by line and act on matching conditions
I have an API URL for monitoring certain events. I can curl "https://theurl.events/logs" and the logs arrive as plain text. The stream never ends, so I just run the curl
command and leave it running.
Now I want to set some conditions: if a log line contains a keyword,
then do something.
The log looks like this. It resembles JSON, but it is plain text, not JSON:
action=machinestarted,
data={
"location": "Place A"
"lag": "033"
"size": "5543"
"id": "11",
.....
}
action=Error,
data={
"location": "Place B"
"lag": "033"
"size": "5543"
"id": "11",
.....
}
So far, I can filter the log with curl "https://theurl.events/logs" 2>&1 | grep Error | ./runbash.sh
As the number of events increases, I want to grep for more keywords, e.g. grep WrongOperation
and grep WrongButton,
and then run a different bash file for each one.
I don’t think it’s a good idea to run them separately, e.g.
curl "https://theurl.events/logs" 2>&1 | grep Error | ./runbash1.sh
curl "https://theurl.events/logs" 2>&1 | grep WrongOperation | ./runbash2.sh
curl "https://theurl.events/logs" 2>&1 | grep WrongButton | ./runbash3.sh
So I’m wondering whether it’s possible to feed the curl output into a while
loop and check multiple conditions, like:
while IFS= read -r line; do
    if [[ "$line" == *"WrongOperation"* ]]; then
        # do something
    elif [[ "$line" == *"WrongButton"* ]]; then
        # ...
    fi
done < <(curl "https://theurl.events/logs" 2>&1)
Solution
No while + read loops, nothing like that.
#!/usr/bin/env bash

output=$(
    curl "https://theurl.events/logs" 2>&1 |
        grep -E 'Error|WrongOperation|WrongButton'
)

printf '%s\n' "$output"

if [[ $output =~ Error ]]; then
    echo ./runbash1.sh
elif [[ $output =~ WrongOperation ]]; then
    echo ./runbash2.sh
elif [[ $output =~ WrongButton ]]; then
    echo ./runbash3.sh
fi
If you are satisfied with the output, remove the echo.
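Note that a command substitution only completes once curl exits, and the question says the stream never ends. For that case, a per-line variant of the same keyword dispatch can be sketched as follows (the URL and the runbash1.sh/runbash2.sh/runbash3.sh names are the ones from the question; the printf at the bottom only simulates a few stream lines for demonstration):

```shell
#!/usr/bin/env bash

# dispatch reads one line at a time from stdin and picks a handler
# based on which keyword the line contains.
dispatch() {
    while IFS= read -r line; do
        case $line in
            *Error*)          echo ./runbash1.sh ;;  # remove echo when satisfied
            *WrongOperation*) echo ./runbash2.sh ;;
            *WrongButton*)    echo ./runbash3.sh ;;
        esac
    done
}

# Simulated input standing in for the live stream:
printf '%s\n' 'action=Error,' 'action=WrongButton,' | dispatch
# prints:
# ./runbash1.sh
# ./runbash3.sh
```

In the real script, replace the printf demo with curl -sN "https://theurl.events/logs" 2>&1 | dispatch; the -N flag disables curl's output buffering so each line reaches the loop as soon as the server sends it.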