This is one for the network/server/hosting experts, I think - but do have your say.
I am still tossing ideas around about the murga-linux server outage and its effect on the Puppy community.
To my mind we haven't really discussed a strategy for the future, so...
My idea is based on Distributed Servers/Filestores/Repos, each holding the same, or nearly the same, data.
Now, thinking about the Puppy Package Manager - and there may be room for improvement here - the user has to choose which repo to access, but there is a script which runs, tests the server and file availability, and then downloads the file.
My idea is to have something similar, but automated - hence a possible PPM improvement.
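As a rough sketch of the automated part - the mirror URLs below are invented placeholders, and a real version would read the list from a file the servers keep up to date - something like this could walk the list and take the first mirror that actually has the requested file:

[code]#!/bin/sh
# pick-mirror.sh - sketch only: try each mirror in turn and download
# the requested file from the first one that actually has it.
# The mirror URLs are made-up examples, not real hosts.
MIRRORS="http://mirror1.example.org/puppy
http://mirror2.example.org/puppy
http://mirror3.example.org/puppy"

FILE="$1"    # e.g. pet_packages/somepackage.pet

for M in $MIRRORS; do
    # --spider checks the file exists without downloading it
    if wget -q --spider --timeout=10 "$M/$FILE"; then
        wget "$M/$FILE" && exit 0
    fi
done
echo "No mirror has $FILE" >&2
exit 1[/code]

Crude, but it removes the "pick a repo" step from the user entirely.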
I envisage a server list distributed amongst the servers, with dynamic file monitoring, so each server knows which other sites hold the requested file - a sort of distributed database with dynamic failover. A dynamic DNS or IP redirector would sit in front of the data, so a user requesting a URL from any one site would be redirected to whichever server is nearest or fastest, or actually has the file. Or a combination of servers could each supply part of the file; if multi-part/multi-path downloading (with wget or similar) were included, all the better.
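To show the multi-part bit isn't magic, here is a sketch that pulls the two halves of a file from two different mirrors at once and stitches them together. It assumes both mirrors hold identical copies and honour HTTP range requests; the hostnames and filename are invented:

[code]#!/bin/sh
# split-fetch.sh - sketch: two-mirror segmented download.
# Assumes identical copies on both mirrors and Range support.
M1="http://mirror1.example.org/puppy"
M2="http://mirror2.example.org/puppy"
FILE="puppy.iso"    # placeholder filename

# Ask one mirror for the file size via a HEAD request
SIZE=$(curl -sI "$M1/$FILE" | tr -d '\r' \
       | awk 'tolower($1)=="content-length:" {print $2; exit}')
HALF=$((SIZE / 2))

# Fetch each half from a different mirror, in parallel
curl -s -r 0-$((HALF - 1)) "$M1/$FILE" -o part1 &
curl -s -r $HALF-          "$M2/$FILE" -o part2 &
wait

cat part1 part2 > "$FILE" && rm -f part1 part2
# A checksum against a signed list should then verify the result[/code]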
Can this be done? It's only code, after all, so why should we restrict ourselves to reliance on a single source?
I believe this could also be used for forums such as this one, using rsync to keep the data in balance between servers - so perhaps John M might consider it too?
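On the rsync point, keeping secondary servers level with a primary could be as simple as a cron job on each secondary; a sketch, with made-up host and path names:

[code]# /etc/crontab entry on each secondary server (host/paths are examples)
# Pull the repo tree from the primary every hour, deleting stale files
0 * * * *  root  rsync -az --delete rsync://primary.example.org/puppy-repo/ /srv/puppy-repo/[/code]

A forum database would need more care than flat files, but the principle is the same.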
I don't know if this is only applicable to servers, or if the vast number of Puppy users' PCs could become virtual hosts by running a script or virtual server in the background.
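For what it's worth, every Puppy already ships busybox, whose built-in httpd could in principle serve a local copy of the repo tree - the port and path here are arbitrary examples:

[code]# Serve a local mirror of the repo tree on port 8080
# (busybox httpd is already present in Puppy)
busybox httpd -f -p 8080 -h /mnt/home/puppy-repo[/code]

Getting such volunteer hosts into the server list, and trusting what they serve, would be the harder part - checksums signed by the distro would be essential.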
Note: This is a PUPPY ISO/software access/safeguard idea, as I believe we have the innovation to be the first small Linux distribution userbase to discuss this.
Ideas/suggestions/shoot-me-down-in-flames...? Yes, costs/fundraising may need discussing, so marketing ideas can be added too.