Is there a size limit on datasets for running Tophat?
Hi All,

Is there a size limit on datasets for running Tophat at Galaxy? If there is, what is the limit in number of reads?

Thanks,
Jianguang
Hi Jianguang,

The limit for Tophat will most likely not be the number of reads, but the total processing time when using the public Galaxy instance. Currently, a job has 72 hours to complete, assuming that a memory problem does not occur before that time limit is reached.

There are, however, some size limitations. The initial upload file must be 50G or less. Output files must be 200G or less, and there must be room in your history for the output, or further work will not be possible until the account is brought back under the quota (250G).

These wikis contain the same information and more:
http://wiki.galaxyproject.org/Main
which links to:
http://wiki.galaxyproject.org/Learn/Managing%20Datasets#Data_size_and_disk_Q...

For large or batch processing, the cloud option is the best recommendation, since it allows you to customize resources as needed:
http://usegalaxy.org/cloud

Thanks!
Jen, Galaxy team
--
Jennifer Hillman-Jackson
Galaxy Support and Training
http://galaxyproject.org
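As a practical aside, these limits are easy to sanity-check locally before uploading. The Python sketch below is only an illustration, not a Galaxy tool: it counts reads in a plain or gzipped FASTQ file (assuming the standard 4-lines-per-read layout) and compares the on-disk size against the 50G upload cap quoted above (treated here as 50 GiB, an assumption). The file name sample_R1.fastq.gz is a hypothetical placeholder.

import gzip
import os

# 50G initial upload limit on the public server, per the reply above
# (treated as 50 GiB; an assumption, since the exact unit is not stated).
UPLOAD_LIMIT_BYTES = 50 * 1024**3

def count_fastq_reads(path):
    """Count reads in a plain or gzipped FASTQ file (4 lines per read)."""
    opener = gzip.open if path.endswith(".gz") else open
    with opener(path, "rt") as handle:
        return sum(1 for _ in handle) // 4

def check_upload(path):
    """Report on-disk size and read count, and flag files over the cap."""
    size = os.path.getsize(path)
    reads = count_fastq_reads(path)
    print(f"{path}: {size / 1024**3:.1f} GiB, {reads:,} reads")
    if size > UPLOAD_LIMIT_BYTES:
        print("Over the 50G upload limit; consider Galaxy on the cloud.")

# Hypothetical input file name, for illustration only.
check_upload("sample_R1.fastq.gz")

Counting reads this way streams through the file once, so it is slow for very large datasets but uses almost no memory.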
participants (2)
- Du, Jianguang
- Jennifer Jackson