The original post: /r/datahoarder by /u/deekaph on 2024-11-21 22:31:38.
Thought I’d share a little script I put together for scraping videos from webpages. Two scripts, actually. I keep two terminal windows open: one running "gather.sh", which waits for input (copy the address bar of any page you want to scrape and paste it in), and one running "download.sh", which downloads 4 files at a time from the queue the gather script builds. You can modify the code to run as many downloads at once as you like, at the risk of getting throttled or blocked by the sites.
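I didn't paste the scripts inline here, but the gist of the setup is simple enough to sketch. This is a rough approximation, not the exact code: the queue file path, the `QUEUE`/`DL_CMD` names, and the use of yt-dlp as the downloader are my assumptions; swap in whatever fetcher you actually use.

```shell
#!/usr/bin/env bash
# Sketch of the two-terminal queue setup (names and downloader are assumptions).
QUEUE="${QUEUE:-/tmp/video-queue.txt}"   # shared queue file between the two scripts
DL_CMD="${DL_CMD:-yt-dlp}"               # assumed downloader; could be wget/curl

# gather.sh side: append each URL pasted on stdin to the queue file
gather() {
    while IFS= read -r url; do
        [ -n "$url" ] && printf '%s\n' "$url" >> "$QUEUE"
    done
}

# download.sh side: drain the queue in batches, 4 concurrent downloads at a time
download() {
    while [ -s "$QUEUE" ]; do
        # take the next 4 URLs off the top of the queue
        head -n 4 "$QUEUE" > "$QUEUE.batch"
        tail -n +5 "$QUEUE" > "$QUEUE.tmp" && mv "$QUEUE.tmp" "$QUEUE"
        # fetch them in parallel; -P 4 is the knob to turn for more at once
        xargs -n 1 -P 4 "$DL_CMD" < "$QUEUE.batch"
        rm -f "$QUEUE.batch"
    done
}
```

Run the gather loop in one terminal and the download loop in the other; bumping the `-P 4` gets you more simultaneous downloads, with the throttling risk mentioned above.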
It just uses common Linux utils, so there's nothing fancy to install: grab the scripts (or copy the code into a file), chmod +x them, and Bob's your uncle.
It’s pretty simple and certainly won’t work everywhere (it’s not really meant to), but I hope someone finds it useful for archiving :)