03-17-2025, 02:56 AM
(This post was last modified: 03-17-2025, 02:57 AM by PhilOfPerth.)
(03-17-2025, 01:56 AM)SMcNeill Wrote: Most of that time is your system doing system stuff with the downloads. To explain the length of time, for most folks the process is basically like this:
Download a 2 KB file. Takes as long as you'd expect a 2 KB download to take. Negligible time involved.
Start anti-virus.
Scan file.
Declare it safe.
Allow file to be written to disk.
Close anti-virus.
Download the next 1 KB file.
Start anti-virus.
Scan file.
Declare it safe.
Allow file to be written to disk.
Close anti-virus.
Rinse and repeat for several thousand files...
Can you guess where most of the time goes when downloading all the contents of the wiki?
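As a rough illustration, here's a back-of-the-envelope sketch in Python of why the per-file loop above dominates. Every number in it (file count, transfer time, scan overhead) is an assumption for illustration, not a measurement:

```python
# Back-of-the-envelope comparison: per-file antivirus overhead vs. one
# archive. All constants below are illustrative assumptions.
N_FILES = 3000           # "several thousand files"
TRANSFER_S = 0.01        # assumed time to transfer one ~2 KB file
SCAN_S = 0.25            # assumed AV start/scan/close cost per file write

individual = N_FILES * (TRANSFER_S + SCAN_S)   # scan fires once per file
single_zip = N_FILES * TRANSFER_S + SCAN_S     # scan fires once, total

print(f"{N_FILES} individual downloads: ~{individual:,.0f} s")
print(f"one zipped download:       ~{single_zip:,.0f} s")
```

Under those assumed numbers, the scan overhead, not the transfer itself, accounts for nearly all of the wait.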
What might be nice would be if we could sort out some way to have the wiki zipped up into a single archive. You could download a 2 MB zip file in no time and extract it. I'm just not certain what process we'd have to sort out to automate those changes, as the wiki isn't directly connected to the GitHub repo and such. One can be altered and changed without affecting the other, so it'd be a little tricky to sort out the syncing issues between the two, I think.
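A minimal sketch of that idea using Python's standard zipfile module; the wiki_pages/ directory and the .txt page files are hypothetical stand-ins for wherever the page dumps actually live:

```python
# Bundle every page file into one archive (paths are hypothetical).
import zipfile
from pathlib import Path

SRC = Path("wiki_pages")
with zipfile.ZipFile("wiki.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    for page in SRC.rglob("*.txt"):
        zf.write(page, arcname=page.relative_to(SRC))

# On the user's side: one download, one AV scan, one extract.
with zipfile.ZipFile("wiki.zip") as zf:
    zf.extractall("wiki_pages_local")
```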
Could the GitHub repo download all of the wiki pages, maybe daily, and zip them into an update package to be downloaded when requested by members?
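One way that could work, sketched in Python: a scheduled job (a daily cron task or a GitHub Actions schedule) walks the wiki's page list and writes everything into a single update package. This assumes the wiki is a standard MediaWiki install exposing api.php and the raw-page action; the endpoint URL below is a guess:

```python
# Daily job sketch: fetch every wiki page and pack it into one zip.
# Assumes a standard MediaWiki API; the endpoint URL is an assumption.
import json, zipfile, urllib.parse, urllib.request

API = "https://qb64phoenix.com/qb64wiki/api.php"  # assumed endpoint

def all_page_titles():
    cont = {}
    while True:
        params = {"action": "query", "list": "allpages",
                  "aplimit": "500", "format": "json", **cont}
        with urllib.request.urlopen(API + "?" + urllib.parse.urlencode(params)) as r:
            data = json.load(r)
        for p in data["query"]["allpages"]:
            yield p["title"]
        cont = data.get("continue")  # MediaWiki pagination token
        if not cont:
            break

with zipfile.ZipFile("wiki-update.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    for title in all_page_titles():
        # action=raw returns the page's wikitext source.
        raw_url = API.replace("api.php", "index.php") + "?" + \
            urllib.parse.urlencode({"title": title, "action": "raw"})
        with urllib.request.urlopen(raw_url) as r:
            zf.writestr(title.replace("/", "_") + ".txt", r.read())
```

The resulting wiki-update.zip could then be committed or published as a release asset for members to grab on demand.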
Of all the places on Earth, and all the planets in the Universe, I'd rather live here (Perth, W.A.) 
Please visit my Website at: http://oldendayskids.blogspot.com/