I haven't played around with the limits too much. I have it up and running on a 2x quad-core Xserve with 16 GB RAM and a few TB of disk space, but the machine is also used heavily for other tasks.

Almost all of your resources (disk, RAM, CPU) will be consumed by the tools you run. The Galaxy interface itself is mostly lightweight and Apache is really efficient, so it will depend on how many users you have, what kind of data they work with, and which analyses they run.

There is a lot of disk access involved in this kind of data processing, and disks are slow (relative to everything else), so you should probably use a RAID where each file is spread across multiple disks; that will speed up reads and writes. Also, if you have the choice and a limited budget, get more CPUs/cores rather than faster ones.

In terms of RAM, most tools operate on text tables, and many of the ones that process raw data do so serially (without putting the whole file in RAM), so you won't need obscene amounts. Maybe 2 GB for the OS and servers, and 1 GB for each active user, would be OK; a rough sizing sketch is below.

Good luck,
-j
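Just to make that last rule of thumb concrete, here is a minimal back-of-envelope sketch (in Python, since that is what Galaxy itself is written in). The 2 GB base and 1 GB-per-active-user figures are the estimates above; the function name and the example user counts are made up for this message and are not part of Galaxy.

# Rough RAM sizing for a local Galaxy server, using the rule of thumb
# above: ~2 GB for the OS and server processes plus ~1 GB per
# concurrently active user. Ballpark numbers only, not guarantees.

def estimate_ram_gb(active_users, base_gb=2, per_user_gb=1):
    """Return a rough RAM estimate (in GB) for a small Galaxy install."""
    return base_gb + active_users * per_user_gb

if __name__ == "__main__":
    for users in (5, 10, 25):
        print("%2d active users -> ~%d GB RAM" % (users, estimate_ram_gb(users)))

Tools that load whole datasets into memory (some mappers, assemblers, etc.) will blow past any per-user estimate like this, so size for the heaviest tool you expect to run rather than the average.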
On Nov 3, 2010, at 8:58 AM, Andreu Alibés <aalibes@gmail.com> wrote:

Hi,
This one is for the people who have a Galaxy instance running at their institutes/companies, but probably not for the people running the main Galaxy website: what would you say the requirements of Galaxy are in terms of disk space, CPU, memory, etc.?
Thanks,
Andreu
--
--------------------------------------------
Andreu Alibés, PhD
Bioinformatics Core & EMBL-CRG Systems Biology Unit
Center for Genomic Regulation
C/ Dr. Aiguader 88, 08003 Barcelona, Spain
Phone: +34 93 316 0202
http://sites.google.com/site/aalibes
_______________________________________________
galaxy-dev mailing list
galaxy-dev@lists.bx.psu.edu
http://lists.bx.psu.edu/listinfo/galaxy-dev