On Mon, Jul 21, 2014 at 6:51 PM, Eric Rasche <rasche.eric@yandex.ru> wrote:
Currently, the checkout options consist of hg clones and the archives that Mercurial produces.
Having pulled or cloned Galaxy a few times lately, I'm wondering if anyone would have a use for a once-run Galaxy instance in an archive? I.e., I'd clone, run once to grab the eggs and do the DB migration, then re-tar the result and store it online. It might cut down on build/test times for those who are using Travis or other CIs. Thoughts/opinions?
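The once-run archive Eric proposes could be produced along these lines. This is a hypothetical sketch, not an existing script; the repository URL, the sleep-based wait, and the archive name are illustrative assumptions:

```shell
#!/bin/sh
# Hypothetical sketch of building a "once-run" Galaxy archive.
# Assumes hg and tar are on PATH; URL and filenames are illustrative.
set -e

# 1. Clone Galaxy from its Mercurial repository.
hg clone https://bitbucket.org/galaxy/galaxy-dist galaxy-dist
cd galaxy-dist

# 2. Run Galaxy once so it fetches its eggs, writes the default
#    config files, and performs the SQLite database migration.
sh run.sh --daemon
sleep 120                # crude wait for the first-run setup to finish
sh run.sh --stop-daemon

# 3. Re-tar the prepared instance so CI jobs can download it
#    instead of repeating the first-run work.
cd ..
tar czf galaxy-dist-prerun.tar.gz galaxy-dist
```

A CI job would then fetch and unpack `galaxy-dist-prerun.tar.gz` rather than cloning and booting from scratch. Note the fixed `sleep` is a stand-in; a real script would poll until the server reports ready.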
Hi Eric,

Given how close you can get now for minimal effort, this seems unnecessary:

http://blastedbio.blogspot.co.uk/2013/09/using-travis-ci-for-testing-galaxy-...

My TravisCI setup fetches the latest Galaxy as a tarball (from a GitHub mirror, as that was faster than a git clone, which was faster than getting the tarball from BitBucket, which in turn was faster than an hg clone), plus a pre-migrated SQLite database (using the $GALAXY_TEST_DB_TEMPLATE functionality originally added to Galaxy to speed up running the functional tests).

Note this does not cache the eggs or the other side effects of the first run, like creating the config files, so there is room for some further speed-up.

Regards,

Peter
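For readers following along, the core of Peter's approach can be sketched roughly as below. This is an illustrative fragment in the spirit of the blog post, not a copy of it; the mirror URL and the database-template URL are assumptions, and only $GALAXY_TEST_DB_TEMPLATE itself comes from the discussion above:

```shell
#!/bin/sh
# Rough sketch of a Travis-style setup step for Galaxy tool testing.
# URLs below are placeholders, not the ones from Peter's actual setup.
set -e

# Fetch the latest Galaxy as a tarball from a GitHub mirror
# (observed to be faster than hg clone or the BitBucket tarball).
wget -nv https://github.com/galaxyproject/galaxy-central/archive/master.tar.gz
tar xzf master.tar.gz

# Point the functional tests at a pre-migrated SQLite database so
# each CI run skips the slow schema-migration step.
GALAXY_TEST_DB_TEMPLATE=https://example.org/galaxy-sqlite-template.db
export GALAXY_TEST_DB_TEMPLATE
```

As Peter notes, this still leaves the egg downloads and first-run config-file creation uncached, which is where Eric's pre-run archive idea could shave off more time.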