Hi,
While building a gallery with a few high-res pictures, I almost immediately ran into disk usage problems, which led me to the following idea.
Basically, it would reuse the current upload system that accepts web URLs to pictures: download the source picture, produce the thumbnail and intermediate picture from it, then delete the downloaded source and store only the provided URL as the location of the large version of that picture.
Navigation would behave just as if the picture were actually on the server, using the local thumbnails and intermediate pictures, but clicking an intermediate picture would open the source picture via the web URL stored at upload time.
That way only the small pictures are actually stored on the server running the gallery, so you don't need huge disk space to host a very large gallery.
One detail to settle is what to do when the source picture is no bigger than the intermediate picture settings; in that case I guess it would be fine to download the remote file and just keep it locally.
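To make the fallback concrete, here is a minimal sketch of the decision logic in Python (the gallery itself is presumably PHP, so this is only illustrative; the function name, the `INTERMEDIATE_MAX` setting, and the paths are all assumptions, not anything from the actual codebase):

```python
# Assumed intermediate-picture size setting (width, height).
INTERMEDIATE_MAX = (800, 600)

def full_size_location(source_url, width, height, local_path,
                       max_size=INTERMEDIATE_MAX):
    """Decide where the 'large version' of a picture should live.

    If the downloaded source is no bigger than the intermediate
    settings, keep the local copy; otherwise discard it after
    generating thumbnails and store only the remote URL.
    """
    max_w, max_h = max_size
    if width <= max_w and height <= max_h:
        # Source is small: keeping it locally costs nothing extra.
        return local_path
    # Source is big: link out to it and save the disk space.
    return source_url
```

The upload script would call something like this after generating the thumbnail and intermediate picture, and record the result as the large version's location.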
I don't believe it would be very hard to do: almost nothing but the upload script would need modifying, and the only other script requiring changes would be the one used to display the large version of a picture.
So the gallery could be used to centralise and archive pictures actually stored in various servers/locations, including local storage of course.
Hope this hasn't been requested too many times before.
.flux