On Fri, Oct 28, 2011 at 5:50 PM, Cittaro Davide <cittaro.davide@hsr.it> wrote:

On Oct 28, 2011, at 5:35 PM, James Taylor wrote:

It currently uses a tiny amount of S3 storage just to save configuration
information about your instance.


Ok... I've never actually used AWS; I didn't know S3 held that information. I guess I will have to read some how-tos.

This configuration is something CloudMan handles behind the scenes, so there is nothing much to worry about there.
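If you're ever curious about what actually gets stored, here is a minimal sketch (assuming the boto3 Python library and placeholder bucket/key names; CloudMan's real bucket naming may differ) of how you could list those configuration objects yourself:

# Minimal sketch: list the small configuration objects an instance keeps in S3.
# Assumes boto3 is installed and AWS credentials are configured; the bucket
# name below is a placeholder, not CloudMan's actual naming scheme.
import boto3

s3 = boto3.client("s3")
bucket = "cm-your-cluster-bucket"  # placeholder name

response = s3.list_objects_v2(Bucket=bucket)
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"], "bytes")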


Long term though we plan to move dataset storage over to S3 as well.

Mmm... I've just had a chat with an AWS engineer; he told me that every operation on S3-stored data goes through a download/upload process... isn't that a PITA for data analysis? I guess S3 is meant for "static" data.

EBS has limits, S3 is more durable and scalable.

Which limits (in addition to the 1 TB size)? I know these are not Galaxy-related questions, but you are the best people I can ask :-)

The 1 TB size is the primary issue, especially with NGS data. That's why we're looking into S3 as a way to offload some of the data-size issues while handling it all behind the scenes. Other than that, the only other comment is that these instances are independent and self-standing, so any customizations require manual effort (but the same applies to any local instance).
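To make the download/upload point concrete, here is a rough sketch (again assuming boto3; the bucket, key, and file names are just placeholders, not anything Galaxy or CloudMan actually uses) of what moving a dataset to and from S3 looks like, as opposed to reading it in place from an EBS-backed filesystem:

# Rough sketch of the S3 transfer model: objects are uploaded and downloaded
# as whole transfers rather than read/written in place like files on an EBS
# volume. Assumes boto3 and configured AWS credentials; names are placeholders.
import boto3

s3 = boto3.client("s3")
bucket = "my-galaxy-datasets"  # placeholder bucket name

# Push a local dataset up to S3...
s3.upload_file("sample.fastq", bucket, "datasets/sample.fastq")

# ...and pull it back down before a tool can work on it locally.
s3.download_file(bucket, "datasets/sample.fastq", "/mnt/galaxy/sample.fastq")

# By contrast, a file on an EBS volume is simply read directly from disk:
with open("/mnt/galaxy/sample.fastq") as handle:
    first_line = handle.readline()

The idea would be for CloudMan to hide that transfer step, so from Galaxy's point of view the data still looks like regular files.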

Let us know if you have any more questions,
Enis
 

d

/*
Davide Cittaro, PhD

Head of Bioinformatics Core
Center for Translational Genomics and Bioinformatics
San Raffaele Scientific Institute
Via Olgettina 58
20132 Milano
Italy

Office: +39 02 26439140
Skype: daweonline
*/
