Hi again,
So, we just found out that one of our automatically launched scripts had a bug and was
creating library datasets over and over again, ending up with 16000+ entries of the same
library dataset before it started to fail.
Finding the bug doesn't solve the original problem, but it brings me to another question :)
Because we cannot use the web interface anymore to delete them, is there a way to remove
all these library datasets directly from the database?
Would it work to flag them as deleted and then use the purge scripts to locate and remove
the associated files? And after that, which database tables do we have to clean in order
to remove the library datasets completely from the database?
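To make the question more concrete, something along these lines is what I had in mind (just a sketch, nothing tested; the table/column names, the folder id, and the connection string are my guesses about the schema):

# Sketch only (not tested): flag the duplicated library datasets as deleted
# directly in the database so the purge scripts can later remove the files.
# Guesses: the table names library_dataset / library_dataset_dataset_association,
# a boolean 'deleted' column on both, the folder id 1234, and the DB URL.
from sqlalchemy import create_engine, text

engine = create_engine("postgresql://galaxy:galaxy@localhost/galaxy")

with engine.begin() as conn:
    # flag the library dataset entries in the spammed folder
    conn.execute(
        text("UPDATE library_dataset SET deleted = true WHERE folder_id = :fid"),
        {"fid": 1234},
    )
    # flag the dataset associations hanging off those entries
    conn.execute(
        text(
            "UPDATE library_dataset_dataset_association SET deleted = true "
            "WHERE library_dataset_id IN "
            "(SELECT id FROM library_dataset WHERE folder_id = :fid)"
        ),
        {"fid": 1234},
    )

Would that be enough for the cleanup scripts under scripts/cleanup_datasets/ to then pick up the associated files, or do other tables (dataset, library_folder, ...) need to be touched as well?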
Thanks a lot,
Sajoscha
On Jan 29, 2013, at 3:47 PM, Sajoscha Sauer wrote:
Dear all,
We have a really annoying problem with one of our Libraries: we can't access it
anymore. We think the problem might be that the library contains too many folders/files
(the error message is pasted at the end of the mail).
Is there any limit on library content? Note that we can still browse smaller libraries,
so I thought we should simply split the library into smaller ones, but since we can't
even browse it... Any suggestions on how to approach this problem?
We are running a Galaxy instance used by different research groups, and our current setup
is the following:
- we have one Library per research group
- we create a folder per project under the relevant library and add all files to these
folders. Note that we add symbolic links to the real file locations (this way the large
fastq files remain on the owners' file servers)
- the files are transferred automatically to Galaxy using custom code that makes use of
the Galaxy API (a rough sketch of what it does is below, after this list).
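Roughly, the transfer code does something like this (heavily simplified; the URL, API key, ids, names, and paths are placeholders, and the payload keys are written from memory, so they may not match our Galaxy version exactly):

# Simplified sketch of our transfer code (URL, key, ids, and paths are placeholders;
# payload keys are from memory and may not match the Galaxy version exactly).
import requests

GALAXY_URL = "http://our-galaxy-server"
API_KEY = "admin-api-key"
LIBRARY_ID = "f2db41e1fa331b3e"   # encoded id of the group's library

# create a per-project folder under the library's root folder
requests.post(
    GALAXY_URL + "/api/libraries/" + LIBRARY_ID + "/contents",
    data={
        "key": API_KEY,
        "create_type": "folder",
        "folder_id": "F_root_folder_id",   # placeholder parent folder id
        "name": "project_42",
    },
)

# add a file to the new folder as a symlink to its real location on the file server
requests.post(
    GALAXY_URL + "/api/libraries/" + LIBRARY_ID + "/contents",
    data={
        "key": API_KEY,
        "create_type": "file",
        "folder_id": "F_project_folder_id",   # placeholder folder id
        "upload_option": "upload_paths",
        "filesystem_paths": "/data/lab_x/project_42/sample1.fastq",
        "link_data_only": "link_to_files",
    },
)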
And of course, this automated transfer now crashes because our custom code tries to read
the library content through an API call (to ask the user in which folder the new files
should be added).
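That call is essentially just the contents listing, something like this (again simplified, same placeholders as above):

# The library-contents listing that now fails on the big library
# (simplified; URL, key, and library id are placeholders).
import requests

GALAXY_URL = "http://our-galaxy-server"
API_KEY = "admin-api-key"
LIBRARY_ID = "f2db41e1fa331b3e"

resp = requests.get(
    GALAXY_URL + "/api/libraries/" + LIBRARY_ID + "/contents",
    params={"key": API_KEY},
)
resp.raise_for_status()   # with 16000+ entries this request now errors out or times out
folders = [item for item in resp.json() if item.get("type") == "folder"]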
PS: browsing libraries in general is really slow and gives users a bad 'user
experience' (not to mention all the extra clicks they issue because they think Galaxy
did not get the request...). I am not sure why this action is so slow (browsing a small
library is also slow). I think I saw emails about this issue, but I am not sure what
the status is. Is there anything we could do in the code to speed this up? I am not sure
this problem is related to the current post, so sorry for mixing things up.
Thanks for your help!
Cheers
Sajoscha
=== error on Galaxy page (nothing in the Galaxy log file)
<error.tiff>