Help with validating URLs in a shell script

Hello everyone,

I wrote a script to validate all the URLs on my server. It reads a text file containing the list of URLs, checks every URL, and prints whether the URL is up or not based on the response. The issue: when I run the curl check against a single URL directly, it reports the URL as up, but inside the loop every service is reported as not responding. Can anyone help?

#!/bin/bash
urlfile=/root/scripts/url.properties
#sed -i '/^$/d' "$urlfile"   # remove blank rows from the URL file before running
while IFS='=' read -r service url
do
    # echo "$service"
    # echo "$url"

    # HEAD request; --fail makes curl return non-zero on HTTP errors (4xx/5xx)
    if curl -o /dev/null --silent --head --fail "$url"; then
    # if curl -o /dev/null --silent --head --write-out '%{http_code}' "$url"; then
        echo "$service:  is up"
    else
        echo "$service:  is not responding."
    fi
done < "$urlfile"
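
The url.properties file has one entry per line in service=url form, roughly like this (the entries below are made up, not my real URLs):

# url.properties -- service name and URL separated by '='
app1=https://example.com/health
app2=https://example.org/status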

What exactly does "all services are not responding" mean — what output do you see?

I have not seen a case where all the services are down; when I check manually from one server, all the services are up :slight_smile:
I need to run the same check from multiple servers, so I thought of automating it.

Right now I am checking a few sample URLs from that server.
I am not able to post the URLs here; the forum gives an error because I only have 2 posts and cannot put links in this reply.

Please get into the habit of providing decent context information about your problem.

It is always helpful to phrase a request carefully and in detail, and to support it with system information such as the OS and shell, the relevant environment (variables, options), preferred tools, adequate (representative) sample input and desired output data, the logic connecting the two, your own attempts at a solution, and, if they exist, any system (error) messages verbatim. This avoids ambiguity and keeps people from having to guess.
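
For instance, it would help to show exactly what the loop receives from url.properties. A minimal sketch (assuming the same file path as in your script) could be:

# dump hidden characters, e.g. ^M for a trailing carriage return
cat -A /root/scripts/url.properties

# print each parsed field; %q makes stray whitespace or \r visible
while IFS='=' read -r service url
do
    printf 'service=%q url=%q\n' "$service" "$url"
done < /root/scripts/url.properties

A trailing carriage return (from a file edited or copied on Windows) is a common reason a URL works when typed on the command line but fails for every line read from a file.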
