Hello everyone,
I would like some advice on how to archive a large number of download links from the https://uloz.to website.
It’s a public file-sharing website that is unfortunately disabling search across its hosted files, so in the future it will only be possible to download a file if you already have a direct link to it.
Is there any way to back up the links to the files hosted there? Is there a script that can go through the pages one by one and write the file links into an Excel-readable file?
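To be concrete, something like the sketch below is what I have in mind. It is only a minimal example: the page URLs and the link pattern are placeholders I made up, since I don't know the site's actual page structure, and it writes a CSV file (which Excel can open) rather than a native .xlsx.

```python
import csv
import re
from urllib.request import urlopen

# Placeholder pages I would already have links to (not real paths)
PAGE_URLS = [
    "https://uloz.to/file/EXAMPLE1",
    "https://uloz.to/file/EXAMPLE2",
]

# Assumed pattern for uloz.to file links; the real URL format may differ
LINK_PATTERN = re.compile(r"https://uloz\.to/file/[\w-]+")


def collect_links(page_urls):
    """Fetch each page and collect any uloz.to file links found in its HTML."""
    found = set()
    for url in page_urls:
        try:
            html = urlopen(url, timeout=30).read().decode("utf-8", errors="replace")
        except OSError as exc:
            print(f"Could not fetch {url}: {exc}")
            continue
        found.update(LINK_PATTERN.findall(html))
    return sorted(found)


def save_to_csv(links, path="ulozto_links.csv"):
    """Write the collected links to a CSV file that Excel can open."""
    with open(path, "w", newline="", encoding="utf-8") as fh:
        writer = csv.writer(fh)
        writer.writerow(["link"])
        for link in links:
            writer.writerow([link])


if __name__ == "__main__":
    links = collect_links(PAGE_URLS)
    save_to_csv(links)
    print(f"Saved {len(links)} links to ulozto_links.csv")
```

Is this a reasonable approach, or is there a better tool for crawling and archiving the links in bulk?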
Thank you for your help.