building a more resilient https downloader? - Printable Version

QB64 Phoenix Edition (https://qb64phoenix.com/forum)
Forum: QB64 Rising (https://qb64phoenix.com/forum/forumdisplay.php?fid=1)
Forum: Code and Stuff (https://qb64phoenix.com/forum/forumdisplay.php?fid=3)
Forum: Help Me! (https://qb64phoenix.com/forum/forumdisplay.php?fid=10)
Thread: building a more resilient https downloader? (/showthread.php?tid=3340)
building a more resilient https downloader? - madscijr - 01-03-2025

When I try to download certain large files from certain sites using a web browser, the download times out (it just says "Failed - network error" or "Couldn't download - network issue"). I can usually click Resume and it will pick up where it left off and finish the download, but sometimes it fails more than once, and it's damn annoying having to restart it manually. I'm wondering how one might build a basic HTTPS browser and file downloader that, when it detects the download has timed out, keeps trying to reconnect and automatically resumes the download?

RE: building a more resilient https downloader? - SpriggsySpriggs - 01-03-2025

Oh, I ain't even going to attempt to make a web browser. Not in v4.0, that's for sure. A downloader? Sure. I think I got rid of my code, though. I've got several different HTTPS downloaders. Even one that only uses QB64 functions.

RE: building a more resilient https downloader? - madscijr - 01-03-2025

(01-03-2025, 12:14 PM)SpriggsySpriggs Wrote: Oh, I ain't even going to attempt to make a web browser. Not in v4.0, that's for sure. A downloader? Sure. I think I got rid of my code, though. I've got several different HTTPS downloaders. Even one that only uses QB64 functions.

Cool! If you come across it, and are feeling generous, that would be great. I agree that making a web browser is a Pandora's box nobody wants to casually open - even Google and Microsoft have to work hard to keep theirs up to date and bug-free! The reason I included a web browser in the functionality is that some sites display a file link in your browser, but if you try the link in a different browser it doesn't work, so I'm guessing the link is specific to the ID of the browser? Maybe a cookie or something? No idea. Maybe if the downloader can somehow identify itself as an existing web browser, that might work? (BTW, I am so far removed from web and networking and how that all works, I could be talking out my ass and wouldn't know it, LoL!)

RE: building a more resilient https downloader? - Pete - 01-03-2025

I remember talking out of my ass once, many years ago... Haven't been able to navigate back since.

I did some of this stuff maybe 15 years back with wget and curl. I don't know if wget had it back then, but StackOverflow has a thread about this, with this quote:

Quote: Wget has been designed for robustness over slow or unstable network connections;

https://stackoverflow.com/questions/19728930/how-to-resume-interrupted-download-automatically-in-curl

It might be worth a look, since wget is very easy to shell from QB64.

Pete

RE: building a more resilient https downloader? - madscijr - 01-03-2025

(01-03-2025, 05:45 PM)Pete Wrote: I remember talking out of my ass once, many years ago... Haven't been able to navigate back since.

Very interesting... thanks!
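For anyone landing on this thread: the resume behaviour described above works through the HTTP Range request header - the client asks the server for "bytes=N-", where N is the size of the partial file already on disk, and a server that supports ranges answers 206 Partial Content with just the remaining bytes. Tools like curl and wget handle that negotiation for you. Below is a minimal, untested sketch of Pete's suggestion of shelling out from QB64; the URL, output filename, retry count, and delay are placeholder assumptions. curl's -C - flag asks it to resume from whatever partial file already exists, --fail makes HTTP errors show up in the exit code, and -A sends a browser-style User-Agent string, which touches on madscijr's question about links that only work for a recognised browser.

Code:
' Hypothetical resumable-download wrapper: shells out to curl and retries
' until it exits cleanly. All names and values here are placeholders.
DIM url AS STRING, outFile AS STRING, cmd AS STRING
DIM attempt AS INTEGER, exitCode AS INTEGER
DIM q AS STRING: q = CHR$(34) ' a literal double-quote character

url = "https://example.com/bigfile.zip" ' placeholder URL
outFile = "bigfile.zip"                 ' placeholder output name

FOR attempt = 1 TO 20 ' give up after 20 attempts
    ' -L follows redirects, --fail returns a nonzero exit code on HTTP errors,
    ' -C - resumes from the existing partial file (if any),
    ' -A sends a browser-style User-Agent string.
    cmd = "curl -L --fail -C - -A " + q + "Mozilla/5.0" + q
    cmd = cmd + " -o " + q + outFile + q + " " + q + url + q
    exitCode = _SHELLHIDE(cmd) ' run the command without popping up a console
    IF exitCode = 0 THEN PRINT "Download complete.": EXIT FOR
    PRINT "curl exited with code"; exitCode; "- waiting, then retrying..."
    _DELAY 5 ' pause a few seconds before the next attempt
NEXT attempt

The wget equivalent Pete links to would be along the lines of "wget -c --tries=0 <url>", where -c continues a partial file and --tries=0 keeps retrying indefinitely - either tool is far less work than re-implementing range requests and TLS by hand in QB64.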