Automator, Hazel, Alfred

Hello,
I am running macOS Monterey.
I'm trying to create a cron-job-style action that replicates my actions every N minutes on my Mac.
I watched Automator, Hazel, and Alfred tutorials, and Automator seems like the best match.
In an Application workflow in Automator, I have only managed to:

  1. Open Safari (ok)
  2. Go to url (ok)

What I cannot do:

  1. Run a website-downloader add-on in Safari (?)
    (a single-HTML downloader add-on)
    (Since the website requires a username and password, I can only download the content via the web browser, not from the command line.)
  2. Run a shell script from Terminal (?)
    (I suppose I need to add a new Run Shell Script action to the workflow [from the Utilities section] and paste my code into that field.)
  3. Quit the app
  4. Repeat all those actions every N minutes
    (the Loop action under Utilities -> Loop, automatically?)

I appreciate your recommendations.
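For step 4, an alternative to Automator's Loop action is a launchd LaunchAgent, which is macOS's native way of running a job on a schedule. A minimal sketch, assuming a script at /Users/boris/bin/fetch.sh and a 10-minute interval (both placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>Label</key>
  <string>com.example.fetch</string>
  <key>ProgramArguments</key>
  <array>
    <string>/bin/sh</string>
    <string>/Users/boris/bin/fetch.sh</string>
  </array>
  <!-- Run every N minutes: 600 seconds = 10 minutes -->
  <key>StartInterval</key>
  <integer>600</integer>
</dict>
</plist>
```

Saved as ~/Library/LaunchAgents/com.example.fetch.plist, it can be activated with `launchctl load ~/Library/LaunchAgents/com.example.fetch.plist`.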

Thanks
Boris

Maybe ditch Safari and use something like Lynx.


or wget/curl/lftp/etc...
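To expand on the curl route: since the site needs a login, one approach is to log in once in a real browser, export the session cookie to a Netscape-format cookie file, and reuse it from the command line. A minimal sketch, where the URL and cookie-file path are placeholders:

```shell
#!/bin/sh
# Hypothetical values -- replace with your own page and cookie file.
URL="https://example.com/protected/page.html"
COOKIEJAR="$HOME/.site-cookies.txt"   # Netscape cookie-file format

# Fetch the page, sending the saved session cookie and updating the
# jar if the server refreshes it; -L follows any login redirects.
fetch_page() {
  curl -sS -L --cookie "$COOKIEJAR" --cookie-jar "$COOKIEJAR" "$1"
}

# Usage (commented out so the sketch has no side effects):
# fetch_page "$URL" > page.html
```

Whether this works depends on the site: if the session cookie expires quickly or the login is guarded by a captcha, the cookie has to be refreshed from the browser first.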


Thank you both,
I need to understand how the "dump with cookie" approach you explained here works.
The solution may differ depending on the code of the target URL.
Lynx with the simple -dump method returns nothing.
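An empty -dump usually means lynx was bounced to the login page. Lynx can send a saved session cookie too; a minimal sketch, assuming the cookie file was exported from a logged-in browser session (path and URL are placeholders):

```shell
#!/bin/sh
# Hypothetical paths -- replace with your own.
COOKIES="$HOME/.site-cookies.txt"   # Netscape cookie-file format
URL="https://example.com/protected/page.html"

# -dump renders the page as plain text; the cookie options send the
# saved session cookie so the server treats the request as logged in.
dump_page() {
  lynx -accept_all_cookies -cookie_file="$COOKIES" -dump "$1"
}

# Usage (commented out so the sketch has no side effects):
# dump_page "$URL" > page.txt
```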

It is not clear what you are actually doing, but have a read of the expect man page; it is built for interaction between programs and may fit.
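To illustrate what expect does: it spawns a program, waits for output matching a pattern, then "types" a response, which is why it suits programs that only accept input interactively (for example a password prompt). A minimal self-contained sketch that just drives a plain sh session, not any real site:

```shell
#!/bin/sh
# expect waits for a pattern in the spawned program's output, then
# sends keystrokes -- here it drives an interactive sh as a demo.
out=$(expect <<'EOF'
spawn sh
send "echo hello-from-expect\r"
expect "hello-from-expect"
send "exit\r"
expect eof
EOF
)
printf '%s\n' "$out"
```

Note that expect can only automate prompts a program prints as text; it cannot solve a captcha rendered in a browser.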

Hello @munkeHoller,
I am trying to download a website from the terminal, where the login process requires captcha confirmation.
I am now reading the expect manual, as you suggested.

Thank you
Boris