Hi Derrick,

If you're using Postgres, it's now possible to clean these up using galaxy-dist/scripts/cleanup_datasets/pgcleanup.py.
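Roughly, an invocation might look like the sketch below; the action names and the --older-than flag are assumptions on my part, so check the script's --help output for what your release actually supports:

    # List the cleanup actions and options this copy of the script supports.
    python ./scripts/cleanup_datasets/pgcleanup.py --help

    # Hypothetical run: 'delete_datasets'/'purge_datasets' and --older-than are
    # assumptions; confirm them against the --help output first.
    python ./scripts/cleanup_datasets/pgcleanup.py --older-than 30 delete_datasets purge_datasets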
--nate

On Sep 13, 2012, at 11:55 PM, Derrick Lin wrote:

Thanks Jeremy,
I am sure it's trivial for us to do a manual clean-up.
Cheers, D
On Fri, Sep 14, 2012 at 1:05 PM, Jeremy Goecks <jeremy.goecks@emory.edu> wrote:

Cleaning up the datasets manually is the best suggestion for now. We're planning to enhance the cleanup scripts to automatically delete history export files soon.
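If you want to do it by hand in the meantime, something like the sketch below might work; the dataset id is a placeholder, and the column names and on-disk layout are assumptions about a default Galaxy setup, so back up your database first:

    # Hypothetical manual cleanup of one export archive. Dataset id 12345 is a
    # placeholder; the deleted/purged columns are assumptions about the schema.
    psql -d galaxy -c "UPDATE dataset SET deleted = true, purged = true WHERE id = 12345;"

    # Remove the file itself; this path assumes Galaxy's default file_path
    # layout (database/files/<id / 1000, zero-padded>/dataset_<id>.dat).
    rm /path/to/galaxy-dist/database/files/012/dataset_12345.dat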
Best, J.
On Sep 13, 2012, at 10:37 PM, Derrick Lin wrote:
hi guys,
I have been trying to clean up some old histories and datasets on our local Galaxy. I spotted one very old, very large dataset file that didn't get removed no matter what I did.

I went to check the reports webapp: that dataset isn't on the list of largest unpurged data files (even though it is bigger than the files that are listed).

Then I moved on to checking the database: no history or library is associated with that dataset. Finally, I found its trace in job_to_input_dataset, so I could identify the job id and discovered that the file was created by a history "Export to File" job.

I also found the relevant entry in job_export_history_archive. Since such datasets are not associated with any history or library, the documented dataset cleanup method does not work on them.
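For reference, the lookup I did was along the lines of the sketch below (the exact table and column names are my reading of the schema, so treat them as assumptions):

    # Hypothetical lookup of unpurged history-export archives; table and column
    # names are assumptions drawn from the Galaxy schema of this era.
    psql -d galaxy -c "
      SELECT d.id, d.file_size, jeha.job_id, jeha.history_id
      FROM dataset d
      JOIN job_export_history_archive jeha ON jeha.dataset_id = d.id
      WHERE NOT d.purged;"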
Any suggestions?
Regards, Derrick