downloading huge data sets from history using wget
Dear Galaxy team, I am doing some standard clipping and trimming using Galaxy and would be happy to download the generated files using a Unix terminal and wget. Is there a way to figure out the exact link to a data file so I can wget it easily? I am talking about file sizes over 5 GB, which are really time consuming and error prone to download through a web browser. Thank you very much for developing this excellent tool. Peter
Hi Peter, It's pretty easy. In the Galaxy interface, use the "save" icon, shown as a floppy disk. Instead of clicking on the icon, copy the URL it points to (in Firefox: right click > Copy Link Location). Then paste this URL into your terminal for wget to use. Florent
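For reference, a minimal sketch of the wget step, assuming the link copied from the save icon looks like the hypothetical URL below (the exact hostname and dataset ID will differ for your history):

    # Hypothetical URL copied from the floppy-disk (save) icon in the history panel;
    # replace it with the link you actually copied.
    DATASET_URL='https://main.g2.bx.psu.edu/datasets/<dataset-id>/display?to_ext=fastqsanger'

    # -c resumes a partially downloaded file, which helps with transfers over 5 GB;
    # -O names the output file instead of deriving it from the URL.
    wget -c -O trimmed_reads.fastq "$DATASET_URL"

Re-running wget -c with the same output file picks up where an interrupted transfer left off, so a dropped connection does not mean restarting a multi-gigabyte download from scratch.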
participants (2)
- Florent Angly
- pis@duke.edu