Auto download/checker script




Posted by Wingsgb, 07-11-2012, 01:03 PM
Hi all. Basically, there is a new game due out soon, and the official website has put up a new page with the download button and link, but obviously no client yet as it's not yet available. Because of the time zone, this will most probably be released while I am tucked up in bed, so I'm after some sort of script that will auto refresh/check the link every 5 minutes and automatically start the download when it becomes available. Does anyone know of anything out there, or where I'd start in creating something like this? I'm familiar with HTML/XHTML, CSS and very basic PHP. Thanks

Posted by WhIteSidE, 07-13-2012, 07:21 PM
You can almost certainly build a script like this using simple PHP (see cURL) and automate it with cron. HOWEVER, this sort of high-frequency refreshing of a page (or "attack") is generally frowned upon by websites and probably violates the site's terms of service, so I would strongly suggest against such a course of action.
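
For what it's worth, the availability check itself is only a few lines. A minimal sketch in shell (to match the bash scripts later in the thread), with a made-up placeholder URL: curl fetches just the HTTP status code, and the real download only fires on a 200.

```shell
#!/bin/bash
# Hypothetical URL: replace with the real download link once you know it.
url="http://example.com/game-client.zip"

# Ask the server for the HTTP status code only, without downloading the body.
http_status () {
    curl -s -o /dev/null -w '%{http_code}' "$1"
}

# The file counts as live only when the server answers 200 OK.
is_available () {
    [ "$1" = "200" ]
}

# Call this (e.g. from cron) to grab the file once it appears.
check_and_download () {
    if is_available "$(http_status "$url")"; then
        curl -s -O "$url" && echo "downloaded"
    else
        echo "not available yet"
    fi
}
```

You could then call check_and_download from a cron job every five minutes instead of sleeping inside the script.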

Posted by JustinAY, 07-14-2012, 01:50 AM
I find it odd they would put up a link to a 404 file. Most likely it's redirecting somewhere else, which will mess this up. However, if it's as you've stated, the script below will get it done. It works if the link returns a successful page download (i.e. HTTP status 200); it fails on 404, etc.

#!/bin/bash
# Put the URL to the site here.
url=google.com
# Put the filename of the file you want.
filename=index.html

downloadstuff () {
    wget "$url/$filename"
    if [ $? -eq 0 ]; then
        exit 0
    else
        sleep 300
        downloadstuff
    fi
}
downloadstuff

If the link 404s, the script will keep retrying every 5 minutes (300 seconds). The version below instead attempts the download and, if that file name does not exist after the attempt, tries again in 5 minutes.

#!/bin/bash
url=google.com
filename=index.html

downloadstuff () {
    wget "$url/$filename"
    if [ -e "$filename" ]; then
        exit 0
    else
        sleep 300
        downloadstuff
    fi
}
downloadstuff

Posted by JustinAY, 07-14-2012, 01:52 AM
BTW, that is bash. You can run it from a Linux shell or set up a cron job if your host allows you to run bash scripts.
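
For example, assuming the script is saved at the hypothetical path /home/user/checkdownload.sh and made executable, the crontab entries might look like:

```shell
# Start the self-looping script once at boot; its own "sleep 300" handles retries.
@reboot /home/user/checkdownload.sh

# Alternatively, strip the sleep/recursion out of the script and let cron
# itself fire it every 5 minutes.
*/5 * * * * /home/user/checkdownload.sh
```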

Posted by josephgarbett, 07-14-2012, 02:57 PM
You can use cURL with the proxy setopt, while using XPath to work out the download link. Have a predefined array of IPs and ports it can use, and each time you run it, select a random item from the array and use that. That's what I do; it's very effective and prevents sites from blocking you.
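
A rough shell sketch of that rotation idea, with made-up proxy addresses (TEST-NET ranges) standing in for a real list:

```shell
#!/bin/bash
# Hypothetical proxy list: replace with your own ip:port pairs.
proxies=("203.0.113.10:8080" "198.51.100.7:3128" "192.0.2.99:8888")

# Select a random entry from the proxies array.
pick_proxy () {
    echo "${proxies[RANDOM % ${#proxies[@]}]}"
}

# Fetch a URL through a randomly chosen proxy, so repeated checks
# do not all come from the same address.
fetch_via_proxy () {
    curl -s --proxy "$(pick_proxy)" "$1"
}
```

Each run picks a proxy at random, so repeated checks are spread across the whole list.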


