Hi Erick,
The process we've been using on our Galaxy instances has been to set up cron
to execute the shell scripts on a daily basis, something like this:
# clean up datasets
0 7 * * * cd /var/opt/home/g2test/server-home/test ; sh ./scripts/cleanup_datasets/delete_userless_histories.sh ; sh ./scripts/cleanup_datasets/purge_histories.sh ; sh ./scripts/cleanup_datasets/purge_datasets.sh
This works quite well, and the log files for each of these processes can be
viewed to see what occurred.
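For example, after a nightly run you can review the output with something like
this (the log file names below are a guess based on the wrapper script names,
so adjust them to match your setup):

cd /var/opt/home/g2test/server-home/test
# List the most recently written cleanup logs, then inspect the latest purge run.
ls -lt scripts/cleanup_datasets/*.log
tail -n 50 scripts/cleanup_datasets/purge_datasets.log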
We'll definitely consider adding a link to this script in the Admin view as
well.
Greg Von Kuster
Galaxy Development Team
Erick Antezana wrote:
Hi Greg,
thanks for the heads-up.
The scripts work fine; however, it would be great to have such
functionality from within the Admin interface (Data section?). Are there any
plans to implement such a request?
cheers,
Erick
Hi Erick,
Never delete these files manually, as doing so will cause problems in your
Galaxy instance. Use the
~/scripts/cleanup_datasets/cleanup_datasets.py script in the
Galaxy distribution. Refer to the wiki for
details about executing the script.
The wiki is a bit outdated - we have a bitbucket ticket open (#99) to correct this.
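In case it helps in the meantime, a manual run from the Galaxy root directory
looks roughly like this (the install path is a placeholder, and the wrapper
scripts shown are assumed to call cleanup_datasets.py with preset options -
check the wiki for the exact options the script accepts):

cd /path/to/galaxy_dist
# Each wrapper script is expected to drive cleanup_datasets.py for one
# cleanup step and write a log you can review afterwards.
sh ./scripts/cleanup_datasets/delete_userless_histories.sh
sh ./scripts/cleanup_datasets/purge_histories.sh
sh ./scripts/cleanup_datasets/purge_datasets.sh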
Greg Von Kuster
Galaxy Development Team
Erick Antezana wrote:
Hi,
How can I purge my deleted datasets that were imported, so
that I can free up some disk space? I saw they are under
database/files/000/*
thanks,
Erick