Auto-download certain URLs, hourly?



  • I just might be able to figure this one out with a lot of trial and error, but I'm hoping the good people here can save me a good deal of effort. 🙂

    I want to set up my Omega2+ so that, once an hour, it creates a directory named for the date and time, cd's into it, and then runs wget to download local copies of a few hundred short files (RSS feeds, if you're curious). I /almost/ know enough scripting to do this on a Linux laptop, but the Omega's limited toolset makes it just tricky enough that I'd appreciate some help sorting out the details.
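
    For the "few hundred short files" part, here is a minimal sketch of the batch-download step, assuming the feed URLs are kept one per line in a plain text file (the paths /root/feeds.txt and /root/feeds below are only illustrative, not from this thread):

    #!/bin/sh
    # Sketch: fetch every URL listed in /root/feeds.txt (one URL per line)
    # into a directory named after the current date and time.
    # /root/feeds.txt and /root/feeds are assumed paths -- adjust to taste.
    datetime=$(date +%d-%m-%y--%H-%M)
    outdir="/root/feeds/$datetime"
    mkdir -p "$outdir" || exit 1
    while read -r url; do
        # Skip blank lines, then download into the timestamped directory
        [ -n "$url" ] && ( cd "$outdir" && wget "$url" )
    done < /root/feeds.txt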



  • @Daniel-Boese Something along the lines of this should do:

    #!/bin/sh
    # Once an hour: create a directory named after the current date and
    # time, then download into it.
    while true; do
        datetime=$(date +%d-%m-%y--%H-%M)
        if mkdir -p "$datetime"; then
            cd "$datetime"
            wget http://whatever.url/some.file
            cd ..
        fi
        sleep 3600   # one hour; plain seconds is safest with BusyBox sleep
    done
    

    Note that it doesn't do any error checking; add whatever error handling suits your needs.
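
    As an alternative to the while/sleep loop, the hourly scheduling could be handed to cron. On OpenWrt-based firmware such as the Omega2's, crond reads /etc/crontabs/root; the script path below is an assumption for illustration, not something from this thread:

    # /etc/crontabs/root -- run the fetch script at the top of every hour.
    # Assumes the loop body above was saved as /root/fetch-feeds.sh and made
    # executable with: chmod +x /root/fetch-feeds.sh
    0 * * * * /root/fetch-feeds.sh

    After editing the crontab, restarting cron (on most OpenWrt builds: /etc/init.d/cron restart) makes the new entry take effect.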

