On Dec 31, 2012, at 8:18 PM, Kyle Ellrott wrote:
I'm currently adding a large number of files to my Galaxy instance's dataset library. During the import, some of the files (a small percentage) failed with:
/inside/depot4/galaxy/set_metadata.sh: line 4: 14790 Segmentation fault (core dumped) python ./scripts/set_metadata.py $@
I think it's probably standard cluster shenanigans, and it may work just fine if run again. But there doesn't seem to be a way to retry. Is there a way to deal with this that is easier than manually deleting and re-uploading the offending files?
Hi Kyle,

Unfortunately, there's not going to be a way to do this entirely in the UI. Your best shot is to change the state of the datasets in the database from 'error' to 'ok' and then try using the metadata auto-detect button in the UI.

--nate
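For reference, a database update along these lines could do it. This is an untested sketch: the table and column names (dataset, library_dataset_dataset_association, state, dataset_id) and the example ids are assumptions based on Galaxy's data model, so verify them against your own schema and back up the database before changing anything.

    -- Hypothetical sketch: list library datasets stuck in the error state.
    SELECT d.id, d.state
    FROM dataset d
    JOIN library_dataset_dataset_association ldda ON ldda.dataset_id = d.id
    WHERE d.state = 'error';

    -- Flip the ones you want to recover back to 'ok', then use the
    -- metadata auto-detect button on each dataset in the UI.
    UPDATE dataset SET state = 'ok' WHERE id IN (123, 456);  -- hypothetical ids from the query above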