I have a set of urls that are like this:
[url removed, login to view]
(which redirects to [url removed, login to view])
The valid values for "file" are not incremental. I have a list (in the millions) of valid ones, and I can supply it as a text file, SQLite database, or whatever format works. The idea is for the script to run in a threaded manner: download 10,000 files, zip them, download the next 10,000, zip those, and so on until the list is exhausted.
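The batch-and-zip flow described above could be sketched roughly as follows. This is only an illustration under stated assumptions: the real URL was removed from the post, so BASE_URL is a placeholder, and the batch size, worker count, and output naming are all made up for the example. The fetch function is injectable so failed downloads can be collected for a later retry.

```python
import concurrent.futures
import itertools
import zipfile
from pathlib import Path

# Placeholder: the actual URL was removed from the original post.
BASE_URL = "https://example.com/get?file={}"

def batched(ids, size):
    """Yield successive lists of at most `size` ids from any iterable."""
    it = iter(ids)
    while chunk := list(itertools.islice(it, size)):
        yield chunk

def fetch(file_id):
    """Download one file and return its bytes (swap in auth/retries as needed)."""
    import urllib.request
    with urllib.request.urlopen(BASE_URL.format(file_id)) as resp:
        return resp.read()

def archive_batch(ids, zip_path, fetch=fetch, workers=16):
    """Download `ids` concurrently, write successes into one zip,
    and return the list of ids that failed (for a retry pass)."""
    failed = []
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
            futures = {pool.submit(fetch, fid): fid for fid in ids}
            for fut in concurrent.futures.as_completed(futures):
                fid = futures[fut]
                try:
                    # writestr runs only in this main thread, so the
                    # ZipFile is never touched concurrently.
                    zf.writestr(str(fid), fut.result())
                except Exception:
                    failed.append(fid)
    return failed

def run(ids, out_dir=".", batch_size=10_000, fetch=fetch):
    """Process the whole id list in zip-per-batch chunks; return all failures."""
    all_failed = []
    for n, chunk in enumerate(batched(ids, batch_size), start=1):
        zip_path = Path(out_dir) / f"batch_{n:05d}.zip"
        all_failed += archive_batch(chunk, zip_path, fetch=fetch)
    return all_failed
```

Because only the downloads run on worker threads while the zip writes stay on the main thread, this avoids locking around the archive; the returned failure list can be written out and fed back into `run` for a retry pass.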
I want to get this done ASAP.
23 freelancers are bidding an average of $30/hour for this job
Hello, this shouldn't be a problem. What should happen with URLs that fail to download? Written to another file/database to retry later? Is any authentication necessary? KR, Oliver