Greg;
[large data library performance]
Security checks are performed on every active (undeleted) dataset
within a data library as its contents are rendered upon opening. For
the current user, every dataset is checked to determine if the user
can access it, and if so, further checks are made to see if the user
has permission to perform operations on the dataset in the add /
modify / manage permission areas. These checks incur db hits for each
dataset, so if your data library includes many datasets (several
hundred or more), then it will take a bit of time to render upon
opening. The issue here is not the size of the dataset files, but the
number of datasets within the data library.
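To make the cost concrete, here is a minimal sketch of the per-dataset
check pattern Greg describes. The class and method names are illustrative
stand-ins, not Galaxy's actual API; the point is just that each dataset
costs several database round trips, so render time grows linearly with
the number of datasets:

```python
class FakeDB:
    """Stands in for the database; counts simulated round trips."""
    def __init__(self, allowed):
        self.allowed = allowed        # set of (user, op) pairs
        self.queries = 0              # number of simulated db hits

    def check_permission(self, user, dataset, op):
        self.queries += 1             # one db hit per check
        return (user, op) in self.allowed


def render_library(datasets, user, db):
    """Mimic opening a library: check access, then add/modify/manage."""
    visible = []
    for ds in datasets:               # every active (undeleted) dataset
        if not db.check_permission(user, ds, "access"):
            continue
        perms = {op: db.check_permission(user, ds, op)
                 for op in ("add", "modify", "manage")}
        visible.append((ds, perms))
    return visible


db = FakeDB(allowed={("brad", "access"), ("brad", "modify")})
rendered = render_library([f"ds{i}" for i in range(500)], "brad", db)
# 500 datasets -> 1 access check + 3 operation checks each
print(len(rendered), db.queries)  # 500 2000
```

With 500 accessible datasets this toy model issues 2000 "db hits" just to
open the library, which matches the slowdown described above.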
We are running into this limit quite a bit in practice as our data
libraries grow. Splitting them does provide a quick workaround. What
would you think about loading folder data on demand via Ajax? Our
data is stored in folders and sub-folders within the library, so this
would let us scale up library items without having to arbitrarily
split the data libraries.
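A rough sketch of the on-demand idea: the server exposes an endpoint
that returns a single folder's contents as JSON, and the client fetches
it only when that folder is expanded, so permission checks are paid per
folder rather than for the whole library up front. The endpoint name,
data layout, and JSON shape below are made up for illustration and do
not reflect Galaxy's controllers:

```python
import json

# Hypothetical library layout, keyed by folder id.
FOLDERS = {
    "root":    {"subfolders": ["raw", "aligned"], "datasets": []},
    "raw":     {"subfolders": [], "datasets": ["sample1.fastq", "sample2.fastq"]},
    "aligned": {"subfolders": [], "datasets": ["sample1.bam"]},
}

def folder_contents(folder_id):
    """Handler for a hypothetical GET /library/folder_contents?id=...

    Only the requested folder is listed (and would be permission-checked),
    instead of walking every dataset in the library when it opens."""
    folder = FOLDERS[folder_id]
    return json.dumps({
        "id": folder_id,
        "subfolders": folder["subfolders"],   # rendered as collapsed nodes
        "datasets": folder["datasets"],
    })

print(folder_contents("raw"))
```

The client-side tree would then request `folder_contents` for each node
as the user drills down, keeping the initial page load constant-size no
matter how many datasets the library holds.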
I took a quick look at the code with this in mind and it looked,
well, hard. But that could be entirely due to my ignorance of the
implementation. What do you think?
Brad
___________________________________________________________
Please keep all replies on the list by using "reply all"
in your mail client. To manage your subscriptions to this
and other Galaxy lists, please use the interface at:
http://lists.bx.psu.edu/