After discussing this with Getiria, I think what is needed here is a way to use FTP to upload compressed histories from the import screen, as is already available from the upload data tool. The problem we're currently experiencing is that importing a compressed history fails whenever the compressed file is larger than 2 GB. This is preventing us from having users effectively archive data out of Galaxy to reduce our disk usage, which is a very pressing issue for our production Galaxy server at the Minnesota Supercomputing Institute.
Additionally, it would be helpful if someone could explain how exported histories are treated on the Galaxy server, e.g. how long do the compressed history archives stick around on disk? We would like users to archive their data on their own systems and then permanently delete their Galaxy histories once an analysis is complete, but if the compressed archive persists on the Galaxy server, the disk savings from doing so are somewhat reduced.
Thanks!