Problem with datasets not being cleaned
17 May 2017, 4:05 p.m.
Hi all,

I have a problem: there are a lot of files in the database/files directory that have not been deleted. Among them are failed uploads and duplicated library files; the latter do not correspond to new versions of existing datasets. I have run the cleanup_datasets.py script, but I still have files dating back a year that are in this condition, and I have no clue as to why or how this could have happened.

The Galaxy instance is installed on a cluster with the filesystem on a NAS. I have been running 16.01 on this instance (I will update as soon as I can stop the service).

Could anyone suggest a way to identify the files that need to stay in database/files, so that I can delete all the rest?

Thanks in advance!
Cristian
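Since Galaxy's default disk object store writes each dataset to database/files/XXX/dataset_<id>.dat (where XXX is the dataset id divided by 1000, zero-padded to three digits), one way to find candidates for deletion is to compare the files on disk against the dataset table. The following is a minimal sketch, assuming a PostgreSQL Galaxy database, the default object store layout, and the standard schema in which the dataset table carries a purged flag; the DSN and directory path are placeholders to adjust for your instance. It only reports candidate orphans and does not delete anything.

#!/usr/bin/env python
"""Sketch: report files under database/files that no longer
correspond to a non-purged dataset row in the Galaxy database.
Assumptions: PostgreSQL backend, default disk object store layout
(database/files/XXX/dataset_<id>.dat), standard Galaxy schema.
Verify against your own setup before deleting anything."""
import os
import re
import psycopg2

FILES_DIR = "database/files"          # adjust to your instance
DB_DSN = "dbname=galaxy user=galaxy"  # adjust to your instance

# Collect ids of datasets whose files should still be on disk.
conn = psycopg2.connect(DB_DSN)
cur = conn.cursor()
cur.execute("SELECT id FROM dataset WHERE NOT purged")
live_ids = {row[0] for row in cur.fetchall()}
conn.close()

# Walk database/files and flag dataset files with no matching row.
pattern = re.compile(r"^dataset_(\d+)\.dat$")
for root, _dirs, files in os.walk(FILES_DIR):
    for name in files:
        m = pattern.match(name)
        if m and int(m.group(1)) not in live_ids:
            # Candidate orphan: report only; do not delete blindly.
            print(os.path.join(root, name))

Note that the query keys on purged rather than deleted, since deleted-but-not-purged datasets are still supposed to have their files on disk; spot-check a sample of the reported paths by hand before removing anything.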