I don't really know anything about scripting, but out of necessity I have to put this script together.
#!/bin/bash
while read -r line; do   # I'm reading from a big wordlist
    instructions using "$line"   # placeholder for the real commands
done < wordlist   # "wordlist" stands for the actual file name
Is there a way to automatically write the line number the script is at into a separate text file, and then read that number back, so that I won't have to start everything from scratch in case I have to stop the script?
#!/bin/bash
CPT=0
while read -r LINE; do   # I'm reading from a big wordlist
    CPT=$((CPT + 1))
    instructions using "$LINE"
    echo "CPT=$CPT" > ~/interrupted_at_line.log   # will be in your $HOME
done < wordlist
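The snippet above only records the counter; to actually resume, the saved number has to be read back and the already-processed lines skipped. One way to do that is with `tail -n +N` to skip ahead. This is a sketch under my own assumptions: the wordlist name, the temp-file paths, and the `CPT=` parsing are example choices, not from the original post.

```shell
#!/bin/bash
# Sketch: resume from the logged CPT value by skipping ahead with tail.
# All file names here are throwaway examples created just for the demo.
WORDLIST=$(mktemp); LOG=$(mktemp); OUT=$(mktemp)
printf 'one\ntwo\nthree\n' > "$WORDLIST"
echo "CPT=1" > "$LOG"          # pretend a previous run completed line 1

CPT=0
[ -s "$LOG" ] && CPT=$(sed 's/^CPT=//' "$LOG")    # read the saved counter
tail -n "+$((CPT + 1))" "$WORDLIST" | while read -r LINE; do
    CPT=$((CPT + 1))
    echo "$LINE" >> "$OUT"     # stand-in for "instructions using $LINE"
    echo "CPT=$CPT" > "$LOG"   # keep the checkpoint current
done
cat "$OUT"                     # line one was skipped; only "two" and "three" print
```

Note that the loop runs in a subshell because of the pipe, so `CPT` changes don't survive past `done`; the log file does, which is all the resume logic needs.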
---------- Post updated at 11:32 AM ---------- Previous update was at 11:23 AM ----------
Somebody helped me out (Santhosh). I'm going to copy-paste the code here just in case someone needs it in the future.
#!/bin/bash
TMP_FILE="/tmp/currentLineNumber"   # a constant

current_line_count=0     # track the current line number
processed_lines_count=0

# Check whether we have already processed some lines on a previous run.
if [ -r "${TMP_FILE}" ]; then
    processed_lines_count=$(cat "${TMP_FILE}")
fi

while read -r line; do   # I'm reading from a big wordlist
    current_line_count=$((current_line_count + 1))

    # Skip lines that were already processed on a previous run.
    if [ "$current_line_count" -le "$processed_lines_count" ]; then
        continue
    fi

    # perform your operations
    command1 using "$line"
    command2 using "$line"

    # Cache the line number only once the line is fully processed,
    # so an interrupted line gets redone on the next run.
    echo "$current_line_count" > "${TMP_FILE}"
done < wordlist   # "wordlist" stands for the actual input file
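A quick way to convince yourself the checkpoint-and-skip pattern works is a self-contained dry run that simulates an interruption halfway through. Everything below (temp-file paths, the four-word list, the `stop_after` knob) is invented for the demo, not part of the original script.

```shell
#!/bin/bash
# Dry run of the checkpoint/skip pattern with throwaway temp files,
# simulating an interruption halfway through the wordlist.
WORDLIST=$(mktemp); CHECKPOINT=$(mktemp); OUT=$(mktemp)
printf 'alpha\nbeta\ngamma\ndelta\n' > "$WORDLIST"

process_list() {
    stop_after=$1      # simulate an interruption after N lines (0 = run to the end)
    done_count=0
    n=0
    [ -s "$CHECKPOINT" ] && done_count=$(cat "$CHECKPOINT")
    while read -r line; do
        n=$((n + 1))
        [ "$n" -le "$done_count" ] && continue   # already processed earlier
        echo "$line" >> "$OUT"                   # stand-in for the real work
        echo "$n" > "$CHECKPOINT"                # checkpoint after the work is done
        [ "$stop_after" -gt 0 ] && [ "$n" -ge "$stop_after" ] && return
    done < "$WORDLIST"
}

process_list 2    # first run, "interrupted" after two lines
process_list 0    # second run resumes at line 3
cat "$OUT"        # each word appears exactly once, in order
```

Because the checkpoint is written after the work for a line finishes, an interrupted line is redone rather than lost on the next run.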