Small script for wget

Hello, All

I'm trying to download some audio files.
Using wget:
wget //http.server.name/audio/file/category/
gives me a 404 error.

However, if I point the command at a specific .mp3 file, the download works flawlessly, e.g.
wget //http.server.name/audio/file/category/p001.mp3

The audio files are numbered sequentially from 318 to 1070.

How can I script something to run the wget command in a loop, incrementing the number in the file name by 1 and starting the next download each time?
Sorry for the noob question; if someone could point me to a tutorial or some sample code I would be stoked.

Ty

Welcome to the forums.

We encourage users to post their own attempts and to use the search.
Have you tried anything?

Regards
Peasant.


Hello, Yes

I have a vague idea of how to write the loop. I've made one for HandBrake in the past and I think I can reuse most of it. I need some documentation on the variable (not sure that's what it's called) needed to change the file name by +1.

If you could give me the proper name for what I'm trying to do, I could google-fu it and take a whack at writing something. Right now I'd just be fuddling around because I'm not quite sure what to search for.

Ty!

How about this search, and this example to start from:

#!/bin/bash

# start and end of the range, declared as integers
typeset -i s=5
typeset -i e=12

# count from s to e inclusive
for (( i=s; i<=e; ++i ))
do
   printf "[%d]\n" "${i}"
done
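
If the file names are zero-padded like the p001.mp3 in your first post, printf can build the name from the counter. A dry-run sketch, reusing the placeholder host and path from your post, with echo in front of wget so nothing is actually fetched:

#!/bin/bash

typeset -i s=318
typeset -i e=1070

for (( i=s; i<=e; ++i ))
do
   # p318.mp3 ... p1070.mp3, padded to at least three digits like p001.mp3
   printf -v name "p%03d.mp3" "${i}"
   echo wget "//http.server.name/audio/file/category/${name}"
done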

OK here we go boys


typeset -i s=318
typeset -i e=1479

for (( i=s; i<=e; ++i ))
do
 wget '
-P /mnt/local/folder
--limit-rate 1000k 
-w 2 
http://random.host.com/audio/"${i}".mp3 
&&'
done

This is working like a charm.
It's running in the background. I set the limits to fly below the error and throttling triggers on the remote server, and it's humming right along.

Thanks for all the help.

Strange that it's working with single quotes ...

It didn't. I had to work with it some more; this is what I have now.


typeset -i s=333
typeset -i e=1479

for (( i=s; i<=e; ++i ))

do

        wget \
        -P /mnt/local/wget/audio \
        --limit-rate 1000k \
        -w 2 \
        http://traffic.server.com/podcast.name/p"${i}".mp3 \
        & \
        /

done

It's working, kinda....

I need it to start and run in the background.
I've been experimenting. Right now it starts the process in the background, but all the processes start at once as opposed to one at a time. This is problematic. I need the whole thing to download one ep at a time in the background...

Any pointers??

Then lose the trailing @ for each wget in the script and run the whole script in the background: myScript.sh &.
Unless I'm missing something basic in what you want...
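
Something along these lines, reusing your paths from above (myScript.sh is just a placeholder name):

#!/bin/bash

typeset -i s=333
typeset -i e=1479

for (( i=s; i<=e; ++i ))
do
        # no & here: each wget finishes before the next one starts
        wget -P /mnt/local/wget/audio --limit-rate 1000k -w 2 \
                http://traffic.server.com/podcast.name/p"${i}".mp3
done

# then run the whole script in the background:
# ./myScript.sh &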

It is possible to send a ( subshell ) to the background:

typeset -i s=333
typeset -i e=1479

# subshell starts ...
(
for (( i=s; i<=e; ++i ))
do

        wget \
        -P /mnt/local/wget/audio \
        --limit-rate 1000k \
        -w 2 \
        http://traffic.server.com/podcast.name/p"${i}".mp3

done
) &
# ... ends, and is sent to the background

Ok, Ok, I'm seeing what you've got going there. If I change the script and run it with
./script &
I can close the session and the process will continue to run without me.

I'll give it a shot.

Remove the @?? I don't have an @ that I'm aware of.

When I start the script with
./script &
the script itself runs in the background. However, the wget processes it starts are tied to the session: when I close the PuTTY session the whole thing stops. When I add the & to the end of the wget line inside the script, the script and the wget both start in the background, but then it starts all 1k+ processes at the same time. ...

This is quite expected when using & at the end of the wget command.

It will start the wget for one loop member, put it in the background, and immediately start the next one, etc.,
causing 1k+ processes in the background.

If && is used, the loop waits till one wget has finished; if it finished successfully (exit code 0), it spawns the next one, and so on.

Spawning multiple processes in parallel would require much more code to run safely.
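
A quick way to see the difference, with sleep standing in for wget:

#!/bin/bash

# foreground + &&: strictly one at a time; the echo only runs if the sleep exited with 0
for i in 1 2 3
do
        sleep 2 && echo "job ${i} finished OK"
done

# trailing &: every sleep is put in the background immediately, so all of them run at once
for i in 1 2 3
do
        sleep 2 &
done
wait    # block here until the backgrounded jobs are done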

Regards
Peasant.

I'm sorry, but to clarify, I need to end the wget command with &&

#!/bin/bash
typeset -i s=333
typeset -i e=1479

for (( i=s; i<=e; ++i ))

do

        wget \
        -P /mnt/local/wget/audio \
        --limit-rate 1000k \
        -w 2 \
        http://traffic.server.com/podcast.name/p"${i}".mp3 \
        && \
        /

done  

and run the script with ./script &, I should be in business. Or no?

On Linux systems: ./script & disown
You are then free to close the session.

Other alternatives are using screen or tmux to run the script, and of course nohup.
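
Rough invocations for those, with ./script standing in for your script and wgetjob as an arbitrary session name:

# nohup: survives the terminal closing; output goes to a log file
nohup ./script > script.log 2>&1 &

# screen: start detached, reattach later with  screen -r wgetjob
screen -dmS wgetjob ./script

# tmux: start detached, reattach later with  tmux attach -t wgetjob
tmux new-session -d -s wgetjob ./script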

Regards
Peasant.


I'm still not having success.

typeset -i s=333
typeset -i e=1479

for (( i=s; i<=e; ++i ))

do

        wget \
        -P /mnt/local/wget/audio \
        --limit-rate 1000k \
        -w 2 \
        http://traffic.server.com/podcast.name/p"${i}".mp3 \
        && \
        /

done 

"
This is not working. The above starts multiple downloads at the same time, and when I close the session the downloads stop. I have no clue what to do now. I started the process with
./script & disown and it made no difference.

I changed it up. This version started all 1000+ downloads concurrently when started with
./script & disown


typeset -i s=331
typeset -i e=1479

(
for (( i=s; i<=e; ++i ))

do
        wget  '-P /mnt/local/wget/audio \
        --limit-rate 1000k \
        -w 2 \
        http://rando-site.com/audio/p"${i}".mp3' /
done
) &

So I changed it to this, which is what ended up working:

typeset -i s=552
typeset -i e=1479

for (( i=s; i<=e; ++i ))

do

        wget -P /mnt/Local/pods --limit-rate=1000k -w 2 http://traffic/p"${i}".mp3 && disown

done

"
Then I started the script with ./script & disown

the script is laying down quite flat.

Gracias, Amigos

Thanks for showing the solution.
I guess it must be wget ... & disown, i.e. one ampersand?

I'll be direct with you. The code is functional, but I don't think it would stand up to peer review. lol

I had to trial 'n' error this together.

wget ... & disown starts the next DL in the loop before the previous DL ends. The script starts (in my case) 1k wget processes, which is no bueno for a myriad of reasons. However, this does divorce the wget processes from the session: you can close out your ssh session and all 1k wget processes will keep trucking along.

wget ... && disown retains the loop functionality. The script loops and starts the wget processes in succession. This also starts the process in the background, divorced from the ssh session.

./script & disown runs the script in the background, divorced from the ssh session. So, using this command to start the script and having the && disown in the script keeps everything running in the background on autopilot.

Setting the rate limit to 1000k and putting a 2 sec. wait between downloads avoids the rate-limit filters on the host.

I've had it running for 18+ hours, no hiccups.

I had to start and stop a bunch of times. I did a good amount of travel in June.

Then I wonder if the && disown does anything at all (other than confusing me).
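
If that is the case, the loop can be trimmed down to this and launched the same way, with the same placeholders as before; the one-at-a-time sequencing comes from each wget running in the foreground, and the detachment from the session comes from starting the script with ./script & disown:

#!/bin/bash

typeset -i s=552
typeset -i e=1479

for (( i=s; i<=e; ++i ))
do
        # one download at a time; rate-limited and paused 2 s between files
        wget -P /mnt/Local/pods --limit-rate=1000k -w 2 http://traffic/p"${i}".mp3
done

# launched with:
# ./script & disown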