Hi all,


I have a problem with a lot of files in the database/files directory that are never deleted.

Among these files are failed uploads and duplicated library files; the latter do not correspond to new versions of existing datasets.


I have run the cleanup_datasets.py script, but I still have files dating back a year that are in this state.


I have no idea why or how this could have happened.


The Galaxy instance is installed on a cluster, with the filesystem on a NAS.

I have been running Galaxy 16.01 on this instance (I will update as soon as I can stop the service).



Could anyone suggest a way to identify the files that have to be kept in database/files, so that I can safely delete the rest?
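
My rough idea so far is the sketch below: export the ids of the datasets that are not purged from the Galaxy database, then walk database/files and print anything whose id is not in that list. The SQL, the table/column names and the default dataset_<id>.dat layout are assumptions on my part, and the script only reports candidates, it does not delete anything:

#!/usr/bin/env python
# Sketch: list candidates for deletion under database/files.
# Assumptions on my side (please correct me if wrong):
#   * live_ids.txt contains the ids of datasets that are not purged,
#     exported e.g. with:  psql galaxy -At -c "SELECT id FROM dataset WHERE NOT purged"
#   * the default disk layout is used, i.e. database/files/000/dataset_<id>.dat,
#     plus optional dataset_<id>_files/ directories for extra files.
# It only prints paths; nothing is removed.
import os
import re
import sys

files_dir = sys.argv[1] if len(sys.argv) > 1 else "database/files"
ids_file = sys.argv[2] if len(sys.argv) > 2 else "live_ids.txt"

pattern = re.compile(r"^dataset_(\d+)(\.dat|_files)$")

with open(ids_file) as fh:
    live_ids = set(line.strip() for line in fh if line.strip())

for root, dirs, names in os.walk(files_dir):
    for name in names + dirs:
        match = pattern.match(name)
        if match and match.group(1) not in live_ids:
            print(os.path.join(root, name))

Does something like this make sense, or is there a supported script that already does this (and also takes things like metadata files into account)?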


Thanks in advance!

Cristian