galaxy-commits
2 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/47171159e9c0/
Changeset: 47171159e9c0
User: jmchilton
Date: 2014-10-07 14:23:41+00:00
Summary: Python 3 fix from planemo.
Affected #: 1 file
diff -r 9cad38925c8da6221a1ce9817d382421c6627ecd -r 47171159e9c05fba08a6ed5e6f2883c3ef8e1ead lib/galaxy/util/submodules.py
--- a/lib/galaxy/util/submodules.py
+++ b/lib/galaxy/util/submodules.py
@@ -14,7 +14,7 @@
__import__(full_submodule)
submodule = getattr(module, submodule_name)
submodules.append(submodule)
- except BaseException, exception:
+ except BaseException as exception:
exception_str = str(exception)
message = "%s dynamic module could not be loaded: %s" % (full_submodule, exception_str)
log.debug(message)
https://bitbucket.org/galaxy/galaxy-central/commits/6b7782f17e84/
Changeset: 6b7782f17e84
User: jmchilton
Date: 2014-10-07 14:23:41+00:00
Summary: Bring in newer experimental components from downstream work on planemo.
This includes some utilities for working with Dockerfiles requested by Kyle Ellrott, a linting framework for Galaxy tools, and a bunch of work on brew integration - most specifically a brew dependency resolver. It is obvious why the brew dependency resolver needs to be included in galaxy core - but I am also including the linting stuff here in case we want to reuse it in the tool shed or in the tool form. Sam's new tool form, plus the automatic tool form reloading by me and Kyle, and the tool package downloader by Dave B, is an exciting confluence of features that should really speed up tool development - adding a GUI tool linting report would pair nicely with these features. (GUI + API work not included here - just the outline of the tool linter.)
The homebrew work is tracking progress on building isolated, versioned homebrew environments here - https://github.com/jmchilton/brew-tests.
Allows deployers to add homebrew elements to config/tool_dependency_resolvers_conf.xml. Optional attributes include 'cellar' - this should be the absolute path to the homebrew Cellar to target (defaults to $HOME/.linuxbrew/Cellar under Linux and to /usr/local/Cellar under that other operating system Galaxy supports).
'versionless' is another attribute supported by this tag - if set to true, it will ignore the specified package version and just resolve the latest installed homebrew version of that recipe. (If used, this should always come after a resolver that respects versions.)
This should work in some superficial way for any brew installed recipe - but it is much more robust and useful for packages installed with the `brew vinstall` external command (found in https://github.com/jmchilton/brew-tests). For `install`ed packages each dependency must be laid out as a requirement in Galaxy - so samtools 1.1 would require a second requirement tag for htslib, for instance. For `vinstall`ed packages the dependencies are recorded at install time and can be reproducibly recovered at runtime without modifying the state of the Cellar or even needing brew installed at runtime on the worker (only the Cellar directory needs to be available).
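For illustration, a minimal sketch of how those two attributes surface at resolution time - the cellar path and the samtools requirement below are assumptions, not defaults from this changeset:

    from galaxy.tools.deps.resolvers import INDETERMINATE_DEPENDENCY
    from galaxy.tools.deps.resolvers.homebrew import HomebrewDependencyResolver

    # 'cellar' and 'versionless' from the XML tag arrive as constructor keywords.
    resolver = HomebrewDependencyResolver(
        None,  # stand-in for Galaxy's dependency manager
        cellar="/home/john/.linuxbrew/Cellar",
        versionless="false",
    )
    dependency = resolver.resolve("samtools", "1.1", "package")
    if dependency is not INDETERMINATE_DEPENDENCY:
        # Emits the PATH/LD_LIBRARY_PATH exports recorded by `brew vinstall`.
        print(dependency.shell_commands(None))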
Affected #: 15 files
diff -r 47171159e9c05fba08a6ed5e6f2883c3ef8e1ead -r 6b7782f17e84b357c968c7e8e14d1f50c3668008 lib/galaxy/tools/deps/brew_exts.py
--- /dev/null
+++ b/lib/galaxy/tools/deps/brew_exts.py
@@ -0,0 +1,470 @@
+#!/usr/bin/env python
+
+# % brew vinstall samtools 1.0
+# % brew vinstall samtools 0.1.19
+# % brew vinstall samtools 1.1
+# % brew env samtools 1.1
+# PATH=/home/john/.linuxbrew/Cellar/htslib/1.1/bin:/home/john/.linuxbrew/Cellar/samtools/1.1/bin:$PATH
+# export PATH
+# LD_LIBRARY_PATH=/home/john/.linuxbrew/Cellar/htslib/1.1/lib:/home/john/.linuxbrew/Cellar/samtools/1.1/lib:$LD_LIBRARY_PATH
+# export LD_LIBRARY_PATH
+# % . <(brew env samtools 1.1)
+# % which samtools
+# /home/john/.linuxbrew/Cellar/samtools/1.1/bin/samtools
+# % . <(brew env samtools 0.1.19)
+# % which samtools
+# /home/john/.linuxbrew/Cellar/samtools/0.1.19/bin/samtools
+# % brew vuninstall samtools 1.0
+# % brew vdeps samtools 1.1
+# htslib@1.1
+# % brew vdeps samtools 0.1.19
+
+from __future__ import print_function
+
+import argparse
+import contextlib
+import json
+import glob
+import os
+import re
+import sys
+import subprocess
+
+WHITESPACE_PATTERN = re.compile("[\s]+")
+
+DESCRIPTION = "Script built on top of linuxbrew to operate on isolated, versioned brew installed environments."
+
+if sys.platform == "darwin":
+ DEFAULT_HOMEBREW_ROOT = "/usr/local"
+else:
+ DEFAULT_HOMEBREW_ROOT = os.path.join(os.path.expanduser("~"), ".linuxbrew")
+
+NO_BREW_ERROR_MESSAGE = "Could not find brew on PATH, please place on path or pass to script with --brew argument."
+CANNOT_DETERMINE_TAP_ERROR_MESSAGE = "Cannot determine tap of specified recipe - please use fully qualified recipe (e.g. homebrew/science/samtools)."
+VERBOSE = False
+RELAXED = False
+
+
+class BrewContext(object):
+
+ def __init__(self, args=None):
+ ensure_brew_on_path(args)
+ raw_config = brew_execute(["config"])
+ config_lines = [l.strip().split(":", 1) for l in raw_config.split("\n") if l]
+ config = dict([(p[0].strip(), p[1].strip()) for p in config_lines])
+ # unset if "/usr/local" -> https://github.com/Homebrew/homebrew/blob/master/Library/Homebrew/cmd/confi…
+ homebrew_prefix = config.get("HOMEBREW_PREFIX", "/usr/local")
+ homebrew_cellar = config.get("HOMEBREW_CELLAR", os.path.join(homebrew_prefix, "Cellar"))
+ self.homebrew_prefix = homebrew_prefix
+ self.homebrew_cellar = homebrew_cellar
+
+
+class RecipeContext(object):
+
+ @staticmethod
+ def from_args(args, brew_context=None):
+ return RecipeContext(args.recipe, args.version, brew_context)
+
+ def __init__(self, recipe, version, brew_context=None):
+ self.recipe = recipe
+ self.version = version
+ self.brew_context = brew_context or BrewContext()
+
+ @property
+ def cellar_path(self):
+ return recipe_cellar_path(self.brew_context.homebrew_cellar, self.recipe, self.version)
+
+ @property
+ def tap_path(self):
+ return os.path.join(self.brew_context.homebrew_prefix, "Library", "Taps", self.__tap_path(self.recipe))
+
+ def __tap_path(self, recipe):
+ parts = recipe.split("/")
+ if len(parts) == 1:
+ info = brew_info(self.recipe)
+ from_url = info["from_url"]
+ if not from_url:
+ raise Exception(CANNOT_DETERMINE_TAP_ERROR_MESSAGE)
+ from_url_parts = from_url.split("/")
+ blob_index = from_url_parts.index("blob") # comes right after username and repository
+ if blob_index < 2:
+ raise Exception(CANNOT_DETERMINE_TAP_ERROR_MESSAGE)
+ username = from_url_parts[blob_index - 2]
+ repository = from_url_parts[blob_index - 1]
+ else:
+ assert len(parts) == 3
+ parts = recipe.split("/")
+ username = parts[0]
+ repository = "homebrew-%s" % parts[1]
+
+ path = os.path.join(username, repository)
+ return path
+
+
+def main():
+ global VERBOSE
+ global RELAXED
+ parser = argparse.ArgumentParser(description=DESCRIPTION)
+ parser.add_argument("--brew", help="Path to linuxbrew 'brew' executable to target")
+ actions = ["vinstall", "vuninstall", "vdeps", "vinfo", "env"]
+ action = __action(sys)
+ if not action:
+ parser.add_argument('action', metavar='action', help="Versioned action to perform.", choices=actions)
+ parser.add_argument('recipe', metavar='recipe', help="Recipe for action - should be absolute (e.g. homebrew/science/samtools).")
+ parser.add_argument('version', metavar='version', help="Version for action (e.g. 0.1.19).")
+ parser.add_argument('--relaxed', action='store_true', help="Relaxed processing - for instance allow use of env on non-vinstall-ed recipes.")
+ parser.add_argument('--verbose', action='store_true', help="Verbose output")
+ args = parser.parse_args()
+ if args.verbose:
+ VERBOSE = True
+ if args.relaxed:
+ RELAXED = True
+ if not action:
+ action = args.action
+ brew_context = BrewContext(args)
+ recipe_context = RecipeContext.from_args(args, brew_context)
+ if action == "vinstall":
+ versioned_install(recipe_context, args.recipe, args.version)
+ elif action == "vuninstall":
+ brew_execute(["switch", args.recipe, args.version])
+ brew_execute(["uninstall", args.recipe])
+ elif action == "vdeps":
+ print_versioned_deps(recipe_context, args.recipe, args.version)
+ elif action == "env":
+ env_statements = build_env_statements_from_recipe_context(recipe_context)
+ print(env_statements)
+ elif action == "vinfo":
+ with brew_head_at_version(recipe_context, args.recipe, args.version):
+ print(brew_info(args.recipe))
+ else:
+ raise NotImplementedError()
+
+
+class CommandLineException(Exception):
+
+ def __init__(self, command, stdout, stderr):
+ self.command = command
+ self.stdout = stdout
+ self.stderr = stderr
+ self.message = ("Failed to execute command-line %s, stderr was:\n"
+ "-------->>begin stderr<<--------\n"
+ "%s\n"
+ "-------->>end stderr<<--------\n"
+ "-------->>begin stdout<<--------\n"
+ "%s\n"
+ "-------->>end stdout<<--------\n"
+ ) % (command, stderr, stdout)
+
+ def __str__(self):
+ return self.message
+
+
+def versioned_install(recipe_context, package=None, version=None):
+ if package is None:
+ package = recipe_context.recipe
+ version = recipe_context.version
+
+ attempt_unlink(package)
+ with brew_head_at_version(recipe_context, package, version):
+ deps = brew_deps(package)
+ deps_metadata = []
+ dep_to_version = {}
+ for dep in deps:
+ version_info = brew_versions_info(dep, recipe_context.tap_path)[0]
+ dep_version = version_info[0]
+ dep_to_version[dep] = dep_version
+ versioned = version_info[2]
+ if versioned:
+ dep_to_version[dep] = dep_version
+ versioned_install(recipe_context, dep, dep_version)
+ else:
+ # Install latest.
+ dep_to_version[dep] = None
+ unversioned_install(dep)
+ try:
+ for dep in deps:
+ dep_version = dep_to_version[dep]
+ if dep_version:
+ brew_execute(["switch", dep, dep_version])
+ else:
+ brew_execute(["link", dep])
+ # dep_version obtained from brew versions doesn't
+ # include revision. This linked_keg attribute does.
+ keg_verion = brew_info(dep)["linked_keg"]
+ dep_metadata = {
+ 'name': dep,
+ 'version': keg_verion,
+ 'versioned': versioned
+ }
+ deps_metadata.append(dep_metadata)
+
+ brew_execute(["install", package])
+ deps = brew_execute(["deps", package])
+ deps = [d.strip() for d in deps.split("\n") if d]
+ metadata = {
+ 'deps': deps_metadata
+ }
+ cellar_root = recipe_context.brew_context.homebrew_cellar
+ cellar_path = recipe_cellar_path( cellar_root, package, version )
+ v_metadata_path = os.path.join(cellar_path, "INSTALL_RECEIPT_VERSIONED.json")
+ with open(v_metadata_path, "w") as f:
+ json.dump(metadata, f)
+
+ finally:
+ attempt_unlink_all(package, deps)
+
+
+def commit_for_version(recipe_context, package, version):
+ tap_path = recipe_context.tap_path
+ commit = None
+ with brew_head_at_commit("master", tap_path):
+ version_to_commit = brew_versions_info(package, tap_path)
+ if version is None:
+ version = version_to_commit[0][0]
+ commit = version_to_commit[0][1]
+ else:
+ for mapping in version_to_commit:
+ if mapping[0] == version:
+ commit = mapping[1]
+ if commit is None:
+ raise Exception("Failed to find commit for version %s" % version)
+ return commit
+
+
+def print_versioned_deps(recipe_context, recipe, version):
+ deps = load_versioned_deps(recipe_context.cellar_path)
+ for dep in deps:
+ val = dep['name']
+ if dep['versioned']:
+ val += "@%s" % dep['version']
+ print(val)
+
+
+def load_versioned_deps(cellar_path, relaxed=None):
+ if relaxed is None:
+ relaxed = RELAXED
+ v_metadata_path = os.path.join(cellar_path, "INSTALL_RECEIPT_VERSIONED.json")
+ if not os.path.isfile(v_metadata_path):
+ if RELAXED:
+ return []
+ else:
+ raise IOError("Could not locate versioned receipt file: {}".format(v_metadata_path))
+ with open(v_metadata_path, "r") as f:
+ metadata = json.load(f)
+ return metadata['deps']
+
+
+def unversioned_install(package):
+ try:
+ deps = brew_deps(package)
+ for dep in deps:
+ brew_execute(["link", dep])
+ brew_execute(["install", package])
+ finally:
+ attempt_unlink_all(package, deps)
+
+
+def attempt_unlink_all(package, deps):
+ for dep in deps:
+ attempt_unlink(dep)
+ attempt_unlink(package)
+
+
+def attempt_unlink(package):
+ try:
+ brew_execute(["unlink", package])
+ except Exception:
+ # TODO: warn
+ pass
+
+
+def brew_execute(args):
+ os.environ["HOMEBREW_NO_EMOJI"] = "1" # simplify brew parsing.
+ cmds = ["brew"] + args
+ return execute(cmds)
+
+
+def build_env_statements_from_recipe_context(recipe_context, **kwds):
+ cellar_root = recipe_context.brew_context.homebrew_cellar
+ env_statements = build_env_statements(cellar_root, recipe_context.cellar_path, **kwds)
+ return env_statements
+
+
+def build_env_statements(cellar_root, cellar_path, relaxed=None):
+ deps = load_versioned_deps(cellar_path, relaxed=relaxed)
+
+ path_appends = []
+ ld_path_appends = []
+
+ def handle_keg(cellar_path):
+ bin_path = os.path.join(cellar_path, "bin")
+ if os.path.isdir(bin_path):
+ path_appends.append(bin_path)
+ lib_path = os.path.join(cellar_path, "lib")
+ if os.path.isdir(lib_path):
+ ld_path_appends.append(lib_path)
+
+ for dep in deps:
+ package = dep['name']
+ version = dep['version']
+ dep_cellar_path = recipe_cellar_path( cellar_root, package, version )
+ handle_keg( dep_cellar_path )
+
+ handle_keg( cellar_path )
+ env_statements = []
+ if path_appends:
+ env_statements.append("PATH=" + ":".join(path_appends) + ":$PATH")
+ env_statements.append("export PATH")
+ if ld_path_appends:
+ env_statements.append("LD_LIBRARY_PATH=" + ":".join(ld_path_appends) + ":$LD_LIBRARY_PATH")
+ env_statements.append("export LD_LIBRARY_PATH")
+ return "\n".join(env_statements)
+
+
+@contextlib.contextmanager
+def brew_head_at_version(recipe_context, package, version):
+ commit = commit_for_version(recipe_context, package, version)
+ tap_path = recipe_context.tap_path
+ with brew_head_at_commit(commit, tap_path):
+ yield
+
+
+@contextlib.contextmanager
+def brew_head_at_commit(commit, tap_path):
+ try:
+ os.chdir(tap_path)
+ current_commit = git_execute(["rev-parse", "HEAD"]).strip()
+ try:
+ git_execute(["checkout", commit])
+ yield
+ finally:
+ git_execute(["checkout", current_commit])
+ finally:
+ # TODO: restore chdir - or better yet just don't chdir
+ # shouldn't be needed.
+ pass
+
+
+def git_execute(args):
+ cmds = ["git"] + args
+ return execute(cmds)
+
+
+def execute(cmds):
+ p = subprocess.Popen(cmds, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
+ #log = p.stdout.read()
+ global VERBOSE
+ stdout, stderr = p.communicate()
+ if p.returncode != 0:
+ raise CommandLineException(" ".join(cmds), stdout, stderr)
+ if VERBOSE:
+ print(stdout)
+ return stdout
+
+
+def brew_deps(package):
+ stdout = brew_execute(["deps", package])
+ return [p.strip() for p in stdout.split("\n") if p]
+
+
+def brew_info(recipe):
+ info_json = brew_execute(["info", "--json=v1", recipe])
+ info = json.loads(info_json)[0]
+ info.update(extended_brew_info(recipe))
+ return info
+
+
+def extended_brew_info(recipe):
+ # Extract more info from non-json variant. JSON variant should
+ # include this in a backward compatible way (TODO: Open PR).
+ raw_info = brew_execute(["info", recipe])
+ extra_info = dict(
+ from_url=None,
+ build_dependencies=[],
+ required_dependencies=[],
+ recommended_dependencies=[],
+ optional_dependencies=[],
+ )
+
+ for line in raw_info.split("\n"):
+ if line.startswith("From: "):
+ extra_info["from_url"] = line[len("From: "):].strip()
+ for dep_type in ["Build", "Required", "Recommended", "Optional"]:
+ if line.startswith("%s: " % dep_type):
+ key = "%s_dependencies" % dep_type.lower()
+ raw_val = line[len("%s: " % dep_type):]
+ extra_info[key].extend(raw_val.split(", "))
+ return extra_info
+
+
+def brew_versions_info(package, tap_path):
+
+ def versioned(recipe_path):
+ if not os.path.isabs(recipe_path):
+ recipe_path = os.path.join(os.getcwd(), recipe_path)
+ # Dependencies in the same repository should be versioned,
+ # core dependencies (presumably in base homebrew) are not
+ # versioned.
+ return tap_path in recipe_path
+
+ # TODO: Also use tags.
+ stdout = brew_execute(["versions", package])
+ version_parts = [l for l in stdout.split("\n") if l and "git checkout" in l]
+ version_parts = map(lambda l: WHITESPACE_PATTERN.split(l), version_parts)
+ info = [(p[0], p[3], versioned(p[4])) for p in version_parts]
+ return info
+
+
+def __action(sys):
+ script_name = os.path.basename(sys.argv[0])
+ if script_name.startswith("brew-"):
+ return script_name[len("brew-"):]
+ else:
+ return None
+
+
+def recipe_cellar_path(cellar_path, recipe, version):
+ recipe_base = recipe.split("/")[-1]
+ recipe_base_path = os.path.join(cellar_path, recipe_base, version)
+ revision_paths = glob.glob(recipe_base_path + "_*")
+ if revision_paths:
+ revisions = map(lambda x: int(x.rsplit("_", 1)[-1]), revision_paths)
+ max_revision = max(revisions)
+ recipe_path = "%s_%d" % (recipe_base_path, max_revision)
+ else:
+ recipe_path = recipe_base_path
+ return recipe_path
+
+
+def ensure_brew_on_path(args):
+ brew_on_path = which("brew")
+ if brew_on_path:
+ brew_on_path = os.path.abspath(brew_on_path)
+
+ def ensure_on_path(brew):
+ if brew != brew_on_path:
+ os.environ["PATH"] = "%s:%s" % (os.path.dirname(brew), os.environ["PATH"])
+
+ default_brew_path = os.path.join(DEFAULT_HOMEBREW_ROOT, "bin", "brew")
+ if args and args.brew:
+ user_brew_path = os.path.abspath(args.brew)
+ ensure_on_path(user_brew_path)
+ elif brew_on_path:
+ return brew_on_path
+ elif os.path.exists(default_brew_path):
+ ensure_on_path(default_brew_path)
+ else:
+ raise Exception(NO_BREW_ERROR_MESSAGE)
+
+
+def which(file):
+ # http://stackoverflow.com/questions/5226958/which-equivalent-function-in-pyt…
+ for path in os.environ["PATH"].split(":"):
+ if os.path.exists(path + "/" + file):
+ return path + "/" + file
+
+ return None
+
+
+if __name__ == "__main__":
+ main()
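As a rough sketch of using the brew_exts helpers above outside of brew itself - the path and version are assumptions, and a keg created with `brew vinstall` is required:

    from galaxy.tools.deps import brew_exts

    cellar = "/home/john/.linuxbrew/Cellar"                        # assumed linuxbrew layout
    keg = brew_exts.recipe_cellar_path(cellar, "samtools", "1.1")  # picks the highest _<revision> if present
    # Prints the PATH/LD_LIBRARY_PATH export statements for the keg; relaxed=True
    # tolerates kegs that lack a versioned install receipt.
    print(brew_exts.build_env_statements(cellar, keg, relaxed=True))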
diff -r 47171159e9c05fba08a6ed5e6f2883c3ef8e1ead -r 6b7782f17e84b357c968c7e8e14d1f50c3668008 lib/galaxy/tools/deps/brew_util.py
--- /dev/null
+++ b/lib/galaxy/tools/deps/brew_util.py
@@ -0,0 +1,40 @@
+""" brew_exts defines generic extensions to Homebrew this file
+builds on those abstraction and provides Galaxy specific functionality
+not useful to the brew external commands.
+"""
+from ..deps import brew_exts
+
+DEFAULT_TAP = "homebrew/science"
+
+
+class HomebrewRecipe(object):
+
+ def __init__(self, recipe, version, tap):
+ self.recipe = recipe
+ self.version = version
+ self.tap = tap
+
+
+def requirements_to_recipes(requirements):
+ return filter(None, map(requirement_to_recipe, requirements))
+
+
+def requirement_to_recipe(requirement):
+ if requirement.type != "package":
+ return None
+ # TOOD: Allow requirements to annotate optionalbrew specific
+ # adaptions.
+ recipe_name = requirement.name
+ recipe_version = requirement.version
+ return HomebrewRecipe(recipe_name, recipe_version, tap=DEFAULT_TAP)
+
+
+def requirements_to_recipe_contexts(requirements, brew_context):
+ def to_recipe_context(homebrew_recipe):
+ return brew_exts.RecipeContext(
+ homebrew_recipe.recipe,
+ homebrew_recipe.version,
+ brew_context
+ )
+ return map(to_recipe_context, requirements_to_recipes(requirements))
+
diff -r 47171159e9c05fba08a6ed5e6f2883c3ef8e1ead -r 6b7782f17e84b357c968c7e8e14d1f50c3668008 lib/galaxy/tools/deps/commands.py
--- /dev/null
+++ b/lib/galaxy/tools/deps/commands.py
@@ -0,0 +1,54 @@
+import os
+import subprocess
+
+
+def shell(cmds, env=None):
+ popen_kwds = dict(
+ shell=True,
+ )
+ if env:
+ new_env = os.environ.copy()
+ new_env.update(env)
+ popen_kwds["env"] = new_env
+ p = subprocess.Popen(cmds, **popen_kwds)
+ return p.wait()
+
+
+def execute(cmds):
+ return __wait(cmds, shell=False)
+
+
+def which(file):
+ # http://stackoverflow.com/questions/5226958/which-equivalent-function-in-pyt…
+ for path in os.environ["PATH"].split(":"):
+ if os.path.exists(path + "/" + file):
+ return path + "/" + file
+
+ return None
+
+
+def __wait(cmds, **popen_kwds):
+ p = subprocess.Popen(cmds, **popen_kwds)
+ stdout, stderr = p.communicate()
+ if p.returncode != 0:
+ raise CommandLineException(" ".join(cmds), stdout, stderr)
+ return stdout
+
+
+class CommandLineException(Exception):
+
+ def __init__(self, command, stdout, stderr):
+ self.command = command
+ self.stdout = stdout
+ self.stderr = stderr
+ self.message = ("Failed to execute command-line %s, stderr was:\n"
+ "-------->>begin stderr<<--------\n"
+ "%s\n"
+ "-------->>end stderr<<--------\n"
+ "-------->>begin stdout<<--------\n"
+ "%s\n"
+ "-------->>end stdout<<--------\n"
+ ) % (command, stderr, stdout)
+
+ def __str__(self):
+ return self.message
diff -r 47171159e9c05fba08a6ed5e6f2883c3ef8e1ead -r 6b7782f17e84b357c968c7e8e14d1f50c3668008 lib/galaxy/tools/deps/docker_util.py
--- a/lib/galaxy/tools/deps/docker_util.py
+++ b/lib/galaxy/tools/deps/docker_util.py
@@ -1,3 +1,4 @@
+import os
DEFAULT_DOCKER_COMMAND = "docker"
DEFAULT_SUDO = True
@@ -51,6 +52,34 @@
return ":".join([self.from_path, self.to_path, self.how])
+def build_command(
+ image,
+ docker_build_path,
+ docker_cmd=DEFAULT_DOCKER_COMMAND,
+ sudo=DEFAULT_SUDO,
+ sudo_cmd=DEFAULT_SUDO_COMMAND,
+ host=DEFAULT_HOST,
+):
+ if os.path.isfile(docker_build_path):
+ docker_build_path = os.path.dirname(os.path.abspath(docker_build_path))
+ build_command_parts = __docker_prefix(docker_cmd, sudo, sudo_cmd, host)
+ build_command_parts.extend(["build", "-t", image, docker_build_path])
+ return build_command_parts
+
+
+def build_save_image_command(
+ image,
+ destination,
+ docker_cmd=DEFAULT_DOCKER_COMMAND,
+ sudo=DEFAULT_SUDO,
+ sudo_cmd=DEFAULT_SUDO_COMMAND,
+ host=DEFAULT_HOST,
+):
+ build_command_parts = __docker_prefix(docker_cmd, sudo, sudo_cmd, host)
+ build_command_parts.extend(["save", "-o", destination, image])
+ return build_command_parts
+
+
def build_docker_cache_command(
image,
docker_cmd=DEFAULT_DOCKER_COMMAND,
@@ -72,6 +101,7 @@
def build_docker_run_command(
container_command,
image,
+ interactive=False,
tag=None,
volumes=[],
volumes_from=DEFAULT_VOLUMES_FROM,
@@ -88,6 +118,8 @@
):
command_parts = __docker_prefix(docker_cmd, sudo, sudo_cmd, host)
command_parts.append("run")
+ if interactive:
+ command_parts.append("-i")
for env_directive in env_directives:
command_parts.extend(["-e", env_directive])
for volume in volumes:
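A quick sketch of driving the two new docker_util helpers - the image name, Dockerfile location, and sudo choice are assumptions:

    from galaxy.tools.deps import commands, docker_util

    # Build an image from a Dockerfile (a file path is reduced to its directory).
    build_cmd = docker_util.build_command("galaxy/seqtools:dev", "tools/seqtools/Dockerfile", sudo=False)
    commands.execute(build_cmd)

    # Save the built image to a tar archive, e.g. for an image cache.
    save_cmd = docker_util.build_save_image_command("galaxy/seqtools:dev", "image_cache/seqtools.tar", sudo=False)
    commands.execute(save_cmd)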
diff -r 47171159e9c05fba08a6ed5e6f2883c3ef8e1ead -r 6b7782f17e84b357c968c7e8e14d1f50c3668008 lib/galaxy/tools/deps/dockerfiles.py
--- /dev/null
+++ b/lib/galaxy/tools/deps/dockerfiles.py
@@ -0,0 +1,54 @@
+import os
+
+from ..deps import commands
+from ..deps import docker_util
+from ..deps.requirements import parse_requirements_from_xml
+from ...tools import loader_directory
+
+import logging
+log = logging.getLogger(__name__)
+
+
+def docker_host_args(**kwds):
+ return dict(
+ docker_cmd=kwds["docker_cmd"],
+ sudo=kwds["docker_sudo"],
+ sudo_cmd=kwds["docker_sudo_cmd"],
+ host=kwds["docker_host"]
+ )
+
+
+def dockerfile_build(path, dockerfile=None, error=log.error, **kwds):
+ expected_container_names = set()
+ for (tool_path, tool_xml) in loader_directory.load_tool_elements_from_path(path):
+ requirements, containers = parse_requirements_from_xml(tool_xml)
+ for container in containers:
+ if container.type == "docker":
+ expected_container_names.add(container.identifier)
+ break
+
+ if len(expected_container_names) == 0:
+ error("Could not find any docker identifiers to generate.")
+
+ if len(expected_container_names) > 1:
+ error("Multiple different docker identifiers found for selected tools [%s]", expected_container_names)
+
+ image_identifier = expected_container_names.pop()
+ if dockerfile is None:
+ dockerfile = "Dockerfile"
+
+ docker_command_parts = docker_util.build_command(
+ image_identifier,
+ dockerfile,
+ **docker_host_args(**kwds)
+ )
+ commands.execute(docker_command_parts)
+ docker_image_cache = kwds['docker_image_cache']
+ if docker_image_cache:
+ destination = os.path.join(docker_image_cache, image_identifier + ".tar")
+ save_image_command_parts = docker_util.build_save_image_command(
+ image_identifier,
+ destination,
+ **docker_host_args(**kwds)
+ )
+ commands.execute(save_image_command_parts)
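Calling the new helper directly might look roughly like this - the tool directory and docker options are assumptions, approximating what planemo would pass in:

    from galaxy.tools.deps.dockerfiles import dockerfile_build

    # Scans tool XML files under the path for a docker container identifier, builds that
    # image from the Dockerfile, and (optionally) saves it into an image cache directory.
    dockerfile_build(
        path="tools/seqtools",                    # hypothetical directory of tool XML files
        dockerfile="tools/seqtools/Dockerfile",
        docker_cmd="docker",
        docker_sudo=False,
        docker_sudo_cmd="sudo",
        docker_host=None,
        docker_image_cache=None,                  # set to a directory to also `docker save` the image
    )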
diff -r 47171159e9c05fba08a6ed5e6f2883c3ef8e1ead -r 6b7782f17e84b357c968c7e8e14d1f50c3668008 lib/galaxy/tools/deps/resolvers/homebrew.py
--- /dev/null
+++ b/lib/galaxy/tools/deps/resolvers/homebrew.py
@@ -0,0 +1,96 @@
+"""
+This file implements a brew resolver for Galaxy requirements. In order for Galaxy
+to pick up on recursively defined and versioned brew dependencies recipes should
+be installed using the experimental `brew-vinstall` external command.
+
+More information here:
+
+https://github.com/jmchilton/brew-tests
+https://github.com/Homebrew/homebrew-science/issues/1191
+
+This is still an experimental module and there will almost certainly be backward
+incompatible changes coming.
+"""
+
+import os
+
+from ..brew_exts import DEFAULT_HOMEBREW_ROOT, recipe_cellar_path, build_env_statements
+from ..resolvers import DependencyResolver, INDETERMINATE_DEPENDENCY, Dependency
+
+# TODO: Implement prefer version linked...
+PREFER_VERSION_LINKED = 'linked'
+PREFER_VERSION_LATEST = 'latest'
+UNKNOWN_PREFER_VERSION_MESSAGE_TEMPLATE = "HomebrewDependencyResolver prefer_version must be latest %s"
+UNKNOWN_PREFER_VERSION_MESSAGE = UNKNOWN_PREFER_VERSION_MESSAGE_TEMPLATE % (PREFER_VERSION_LATEST)
+DEFAULT_PREFER_VERSION = PREFER_VERSION_LATEST
+
+
+class HomebrewDependencyResolver(DependencyResolver):
+ resolver_type = "homebrew"
+
+ def __init__(self, dependency_manager, **kwds):
+ self.versionless = _string_as_bool(kwds.get('versionless', 'false'))
+ self.prefer_version = kwds.get('prefer_version', None)
+
+ if self.prefer_version is None:
+ self.prefer_version = DEFAULT_PREFER_VERSION
+
+ if self.versionless and self.prefer_version not in [PREFER_VERSION_LATEST]:
+ raise Exception(UNKNOWN_PREFER_VERSION_MESSAGE)
+
+ cellar_root = kwds.get('cellar', None)
+ if cellar_root is None:
+ cellar_root = os.path.join(DEFAULT_HOMEBREW_ROOT, "Cellar")
+
+ self.cellar_root = cellar_root
+
+ def resolve(self, name, version, type, **kwds):
+ if type != "package":
+ return INDETERMINATE_DEPENDENCY
+
+ if version is None or self.versionless:
+ return self._find_dep_default(name, version)
+ else:
+ return self._find_dep_versioned(name, version)
+
+ def _find_dep_versioned(self, name, version):
+ recipe_path = recipe_cellar_path(self.cellar_root, name, version)
+ if not os.path.exists(recipe_path) or not os.path.isdir(recipe_path):
+ return INDETERMINATE_DEPENDENCY
+
+ commands = build_env_statements(self.cellar_root, recipe_path, relaxed=True)
+ return HomebrewDependency(commands)
+
+ def _find_dep_default(self, name, version):
+ installed_versions = self._installed_versions(name)
+ if not installed_versions:
+ return INDETERMINATE_DEPENDENCY
+
+ # Just grab newest installed version - may make sense some day to find
+ # the linked version instead.
+ default_version = sorted(installed_versions, reverse=True)[0]
+ return self._find_dep_versioned(name, default_version)
+
+ def _installed_versions(self, recipe):
+ recipe_base_path = os.path.join(self.cellar_root, recipe)
+ if not os.path.exists(recipe_base_path):
+ return []
+
+ names = os.listdir(recipe_base_path)
+ return filter(lambda n: os.path.isdir(os.path.join(recipe_base_path, n)), names)
+
+
+class HomebrewDependency(Dependency):
+
+ def __init__(self, commands):
+ self.commands = commands
+
+ def shell_commands(self, requirement):
+ return self.commands.replace("\n", ";") + "\n"
+
+
+def _string_as_bool( value ):
+ return str( value ).lower() == "true"
+
+
+__all__ = [HomebrewDependencyResolver]
diff -r 47171159e9c05fba08a6ed5e6f2883c3ef8e1ead -r 6b7782f17e84b357c968c7e8e14d1f50c3668008 lib/galaxy/tools/lint.py
--- /dev/null
+++ b/lib/galaxy/tools/lint.py
@@ -0,0 +1,89 @@
+from __future__ import print_function
+import inspect
+from galaxy.util import submodules
+
+LEVEL_ALL = "all"
+LEVEL_WARN = "warn"
+LEVEL_ERROR = "error"
+
+
+def lint_xml(tool_xml, level=LEVEL_ALL, fail_level=LEVEL_WARN):
+ import galaxy.tools.linters
+ lint_context = LintContext(level=level)
+ linter_modules = submodules.submodules(galaxy.tools.linters)
+ for module in linter_modules:
+ for (name, value) in inspect.getmembers(module):
+ if callable(value) and name.startswith("lint_"):
+ lint_context.lint(module, name, value, tool_xml)
+ found_warns = lint_context.found_warns
+ found_errors = lint_context.found_errors
+ if level == LEVEL_WARN and (found_warns or found_errors):
+ return False
+ else:
+ return found_errors
+
+
+class LintContext(object):
+
+ def __init__(self, level):
+ self.level = level
+ self.found_errors = False
+ self.found_warns = False
+
+ def lint(self, module, name, lint_func, tool_xml):
+ self.printed_linter_info = False
+ self.valid_messages = []
+ self.info_messages = []
+ self.warn_messages = []
+ self.error_messages = []
+ lint_func(tool_xml, self)
+ # TODO: colorful emoji if in click CLI.
+ if self.error_messages:
+ status = "FAIL"
+ elif self.warn_messages:
+
+ status = "WARNING"
+ else:
+ status = "CHECK"
+
+ def print_linter_info():
+ if self.printed_linter_info:
+ return
+ self.printed_linter_info = True
+ print("Applying linter %s... %s" % (name, status))
+
+ for message in self.error_messages:
+ self.found_errors = True
+ print_linter_info()
+ print(".. ERROR: %s" % message)
+
+ if self.level != LEVEL_ERROR:
+ for message in self.warn_messages:
+ self.found_warns = True
+ print_linter_info()
+ print(".. WARNING: %s" % message)
+
+ if self.level == LEVEL_ALL:
+ for message in self.info_messages:
+ print_linter_info()
+ print(".. INFO: %s" % message)
+ for message in self.valid_messages:
+ print_linter_info()
+ print(".. CHECK: %s" % message)
+
+ def __handle_message(self, message_list, message, *args):
+ if args:
+ message = message % args
+ message_list.append(message)
+
+ def valid(self, message, *args):
+ self.__handle_message(self.valid_messages, message, *args)
+
+ def info(self, message, *args):
+ self.__handle_message(self.info_messages, message, *args)
+
+ def error(self, message, *args):
+ self.__handle_message(self.error_messages, message, *args)
+
+ def warn(self, message, *args):
+ self.__handle_message(self.warn_messages, message, *args)
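A minimal sketch of invoking the linter on a tool wrapper from disk - the file name is hypothetical, and any parsed XML tree exposing getroot()/findall() (such as an xml.etree ElementTree) should do:

    import xml.etree.ElementTree as ET
    from galaxy.tools.lint import lint_xml

    tool_xml = ET.parse("my_tool.xml")  # hypothetical tool wrapper
    # Prints per-linter CHECK/INFO/WARNING/ERROR lines to stdout; the return value
    # signals whether warnings/errors were recorded (exact semantics still experimental).
    result = lint_xml(tool_xml, level="all", fail_level="warn")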
diff -r 47171159e9c05fba08a6ed5e6f2883c3ef8e1ead -r 6b7782f17e84b357c968c7e8e14d1f50c3668008 lib/galaxy/tools/linters/__init__.py
--- /dev/null
+++ b/lib/galaxy/tools/linters/__init__.py
@@ -0,0 +1,2 @@
+""" Framework for linting tools.
+"""
diff -r 47171159e9c05fba08a6ed5e6f2883c3ef8e1ead -r 6b7782f17e84b357c968c7e8e14d1f50c3668008 lib/galaxy/tools/linters/citations.py
--- /dev/null
+++ b/lib/galaxy/tools/linters/citations.py
@@ -0,0 +1,27 @@
+
+
+def lint_citations(tool_xml, lint_ctx):
+ root = tool_xml.getroot()
+ citations = root.findall("citations")
+ if len(citations) > 1:
+ lint_ctx.error("More than one citation section found, behavior undefined.")
+ return
+
+ if len(citations) == 0:
+ lint_ctx.warn("No citations found, consider adding citations to your tool.")
+ return
+
+ valid_citations = 0
+ for citation in citations[0]:
+ if citation.tag != "citation":
+ lint_ctx.warn("Unknown tag discovered in citations block [%s], will be ignored." % citation.tag)
+ if "type" in citation.attrib:
+ citation_type = citation.attrib.get("type")
+ if citation_type not in ["doi", "bibtex"]:
+ lint_ctx.warn("Unknown citation type discovered [%s], will be ignored.", citation_type)
+ else:
+ valid_citations += 1
+
+ if valid_citations > 0:
+ lint_ctx.valid("Found %d likely valid citations.", valid_citations)
+
diff -r 47171159e9c05fba08a6ed5e6f2883c3ef8e1ead -r 6b7782f17e84b357c968c7e8e14d1f50c3668008 lib/galaxy/tools/linters/help.py
--- /dev/null
+++ b/lib/galaxy/tools/linters/help.py
@@ -0,0 +1,15 @@
+
+
+def lint_help(tool_xml, lint_ctx):
+ root = tool_xml.getroot()
+ helps = root.findall("help")
+ if len(helps) > 1:
+ lint_ctx.error("More than one help section found, behavior undefined.")
+ return
+
+ if len(helps) == 0:
+ lint_ctx.warn("No help section found, consider adding a help section to your tool.")
+ return
+
+ # TODO: validate help section RST.
+ lint_ctx.valid("Tool contains help section.")
diff -r 47171159e9c05fba08a6ed5e6f2883c3ef8e1ead -r 6b7782f17e84b357c968c7e8e14d1f50c3668008 lib/galaxy/tools/linters/inputs.py
--- /dev/null
+++ b/lib/galaxy/tools/linters/inputs.py
@@ -0,0 +1,38 @@
+
+
+def lint_inputs(tool_xml, lint_ctx):
+ inputs = tool_xml.findall("./inputs//param")
+ num_inputs = 0
+ for param in inputs:
+ num_inputs += 1
+ param_attrib = param.attrib
+ has_errors = False
+ if "type" not in param_attrib:
+ lint_ctx.error("Found param input with type specified.")
+ has_errors = True
+ if "name" not in param_attrib:
+ lint_ctx.error("Found param input with not name specified.")
+ has_errors = True
+
+ if has_errors:
+ continue
+
+ param_type = param_attrib["type"]
+ param_name = param_attrib["name"]
+ if param_type == "data_input":
+ if "format" not in param_attrib:
+ lint_ctx.warn("Found param input %s contains no format specified - 'data' format will be assumed.", param_name)
+ # TODO: Validate type, much more...
+ if num_inputs:
+ lint_ctx.info("Found %d input parameters.", num_inputs)
+ else:
+ lint_ctx.warn("Found not input parameters.")
+
+
+def lint_repeats(tool_xml, lint_ctx):
+ repeats = tool_xml.findall("./inputs//repeat")
+ for repeat in repeats:
+ if "name" not in repeat.attrib:
+ lint_ctx.error("Repeat does not specify name attribute.")
+ if "title" not in repeat.attrib:
+ lint_ctx.error("Repeat does not specify title attribute.")
diff -r 47171159e9c05fba08a6ed5e6f2883c3ef8e1ead -r 6b7782f17e84b357c968c7e8e14d1f50c3668008 lib/galaxy/tools/linters/outputs.py
--- /dev/null
+++ b/lib/galaxy/tools/linters/outputs.py
@@ -0,0 +1,25 @@
+
+
+def lint_output(tool_xml, lint_ctx):
+ outputs = tool_xml.findall("./outputs/data")
+ if not outputs:
+ lint_ctx.warn("Tool contains no outputs, most tools should produce outputs..")
+ return
+
+ num_outputs = 0
+ for output in outputs:
+ num_outputs += 1
+ output_attrib = output.attrib
+ format_set = False
+ if "format" in output_attrib:
+ format_set = True
+ format = output_attrib["format"]
+ if format == "input":
+ lint_ctx.warn("Using format='input' on output data, format_source attribute is less ambigious and should be used instead.")
+ elif "format_source" in output_attrib:
+ format_set = True
+
+ if not format_set:
+ lint_ctx.warn("Tool data output doesn't define an output format.")
+
+ lint_ctx.info("%d output datasets found.", num_outputs)
diff -r 47171159e9c05fba08a6ed5e6f2883c3ef8e1ead -r 6b7782f17e84b357c968c7e8e14d1f50c3668008 lib/galaxy/tools/linters/tests.py
--- /dev/null
+++ b/lib/galaxy/tools/linters/tests.py
@@ -0,0 +1,20 @@
+
+
+# Misspelled so as not be picked up by nosetests.
+def lint_tsts(tool_xml, lint_ctx):
+ tests = tool_xml.findall("./tests/test")
+ if not tests:
+ lint_ctx.warn("No tests found, most tools should define test cases.")
+
+ num_valid_tests = 0
+ for test in tests:
+ outputs = test.findall("output")
+ if not outputs:
+ lint_ctx.warn("No outputs defined for tests, this test is likely invalid.")
+ else:
+ num_valid_tests += 1
+
+ if num_valid_tests:
+ lint_ctx.valid("%d test(s) found.", num_valid_tests)
+ else:
+ lint_ctx.warn("No valid test(s) found.")
diff -r 47171159e9c05fba08a6ed5e6f2883c3ef8e1ead -r 6b7782f17e84b357c968c7e8e14d1f50c3668008 lib/galaxy/tools/linters/top_level.py
--- /dev/null
+++ b/lib/galaxy/tools/linters/top_level.py
@@ -0,0 +1,17 @@
+
+def lint_top_level(tree, lint_ctx):
+ root = tree.getroot()
+ if "version" not in root.attrib:
+ lint_ctx.error("Tool does not define a version attribute.")
+ else:
+ lint_ctx.valid("Tool defines a version.")
+
+ if "name" not in root.attrib:
+ lint_ctx.error("Tool does not define a name attribute.")
+ else:
+ lint_ctx.valid("Tool defines a name.")
+
+ if "id" not in root.attrib:
+ lint_ctx.error("Tool does not define an id attribute.")
+ else:
+ lint_ctx.valid("Tool defines an id name.")
diff -r 47171159e9c05fba08a6ed5e6f2883c3ef8e1ead -r 6b7782f17e84b357c968c7e8e14d1f50c3668008 lib/galaxy/tools/loader_directory.py
--- /dev/null
+++ b/lib/galaxy/tools/loader_directory.py
@@ -0,0 +1,31 @@
+import glob
+import os
+from ..tools import loader
+
+PATH_DOES_NOT_EXIST_ERROR = "Could not load tools from path [%s] - this path does not exist."
+
+
+def load_tool_elements_from_path(path):
+ tool_elements = []
+ for file in __find_tool_files(path):
+ if __looks_like_a_tool(file):
+ tool_elements.append((file, loader.load_tool(file)))
+ return tool_elements
+
+
+def __looks_like_a_tool(path):
+ with open(path) as f:
+ for i in range(10):
+ line = f.next()
+ if "<tool" in line:
+ return True
+ return False
+
+
+def __find_tool_files(path):
+ if not os.path.exists(path):
+ raise Exception(PATH_DOES_NOT_EXIST_ERROR)
+ if not os.path.isdir(path):
+ return [os.path.abspath(path)]
+ else:
+ return map(os.path.abspath, glob.glob(path + "/**.xml"))
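Rough usage of the directory loader - the directory name is hypothetical:

    from galaxy.tools import loader_directory

    # Returns (path, parsed XML tree) pairs for files under the path that look like Galaxy tools.
    for tool_path, tool_xml in loader_directory.load_tool_elements_from_path("tools/"):
        print(tool_path, tool_xml.getroot().get("id"))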
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: jmchilton: Update master API key handling in tests for d7dd1f9.
by commits-noreply@bitbucket.org 06 Oct '14
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/9cad38925c8d/
Changeset: 9cad38925c8d
User: jmchilton
Date: 2014-10-07 01:25:30+00:00
Summary: Update master API key handling in tests for d7dd1f9.
Not aware of anyone actually overriding this with GALAXY_TEST_MASTER_API_KEY, so not maintaining backward compatibility with respect to that environment variable.
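The override now works roughly like this (the key value is a placeholder):

    import os

    os.environ["GALAXY_CONFIG_MASTER_API_KEY"] = "test-master-key"  # placeholder value
    # test/base/api_util.py now checks these variables in order before falling back
    # to DEFAULT_GALAXY_MASTER_API_KEY.
    for key in ["GALAXY_CONFIG_MASTER_API_KEY", "GALAXY_CONFIG_OVERRIDE_MASTER_API_KEY"]:
        value = os.environ.get(key, None)
        if value:
            print("tests will use the master API key from %s" % key)
            break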
Affected #: 1 file
diff -r d7dd1f92d5b8e35ed9530fc5f0494e160ff8c4da -r 9cad38925c8da6221a1ce9817d382421c6627ecd test/base/api_util.py
--- a/test/base/api_util.py
+++ b/test/base/api_util.py
@@ -9,7 +9,11 @@
configured as a master API key and should be able to create additional
users and keys.
"""
- return os.environ.get( "GALAXY_TEST_MASTER_API_KEY", DEFAULT_GALAXY_MASTER_API_KEY )
+ for key in ["GALAXY_CONFIG_MASTER_API_KEY", "GALAXY_CONFIG_OVERRIDE_MASTER_API_KEY"]:
+ value = os.environ.get(key, None)
+ if value:
+ return value
+ return DEFAULT_GALAXY_MASTER_API_KEY
def get_user_api_key():
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
commit/galaxy-central: jmchilton: Make functional tests respect GALAXY_CONFIG_ environment variables.
by commits-noreply@bitbucket.org 06 Oct '14
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/d7dd1f92d5b8/
Changeset: d7dd1f92d5b8
User: jmchilton
Date: 2014-10-07 00:56:58+00:00
Summary: Make functional tests respect GALAXY_CONFIG_ environment variables.
Functional tests don't work without the tweak to imports in cloudlaunch - not sure if that has always been a problem and I am just running the tests in a different Galaxy instance or if the earlier import of util caused the problem.
Regardless, I guess we should update to a version of bioblend that doesn't require simplejson.
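As for the GALAXY_CONFIG_ handling itself, the intended effect is roughly the following - the option name and value are placeholders, assuming load_app_properties folds GALAXY_CONFIG_-prefixed variables into the app properties:

    import os
    from galaxy.util.properties import load_app_properties

    os.environ["GALAXY_CONFIG_ALLOW_USER_DELETION"] = "True"  # placeholder override
    kwargs = {"config_file": "config/galaxy.ini"}             # as assembled by functional_tests.py
    kwargs = load_app_properties(kwds=kwargs)
    # kwargs should now include allow_user_deletion="True" picked up from the environment.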
Affected #: 2 files
diff -r 594b48fe90b7b17ed22f5c597837022d8204db47 -r d7dd1f92d5b8e35ed9530fc5f0494e160ff8c4da lib/galaxy/webapps/galaxy/controllers/cloudlaunch.py
--- a/lib/galaxy/webapps/galaxy/controllers/cloudlaunch.py
+++ b/lib/galaxy/webapps/galaxy/controllers/cloudlaunch.py
@@ -17,6 +17,7 @@
eggs.require('PyYAML')
eggs.require('boto')
+eggs.require('simplejson')
eggs.require('bioblend')
from boto.exception import EC2ResponseError
diff -r 594b48fe90b7b17ed22f5c597837022d8204db47 -r d7dd1f92d5b8e35ed9530fc5f0494e160ff8c4da scripts/functional_tests.py
--- a/scripts/functional_tests.py
+++ b/scripts/functional_tests.py
@@ -16,6 +16,7 @@
from base.tool_shed_util import parse_tool_panel_config
from galaxy import eggs
+from galaxy.util.properties import load_app_properties
eggs.require( "nose" )
eggs.require( "NoseHTML" )
@@ -401,6 +402,9 @@
kwargs[ 'global_conf' ] = get_webapp_global_conf()
kwargs[ 'global_conf' ][ '__file__' ] = galaxy_config_file
kwargs[ 'config_file' ] = galaxy_config_file
+ kwargs = load_app_properties(
+ kwds=kwargs
+ )
# Build the Universe Application
app = UniverseApplication( **kwargs )
database_contexts.galaxy_context = app.model.context
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
commit/galaxy-central: natefoo: Added tag latest_2014.10.06 for changeset 2092948937ac
by commits-noreply@bitbucket.org 06 Oct '14
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/e98b33105381/
Changeset: e98b33105381
Branch: stable
User: natefoo
Date: 2014-10-06 19:31:09+00:00
Summary: Added tag latest_2014.10.06 for changeset 2092948937ac
Affected #: 1 file
diff -r 3b3cd242b4b7cd20b9c868c393c455524b31b87c -r e98b33105381ac94f590edee1859ab550d328833 .hgtags
--- a/.hgtags
+++ b/.hgtags
@@ -20,3 +20,4 @@
ca45b78adb4152fc6e7395514d46eba6b7d0b838 release_2014.08.11
548ab24667d6206780237bd807f7d857a484c461 latest_2014.08.11
2092948937ac30ef82f71463a235c66d34987088 release_2014.10.06
+2092948937ac30ef82f71463a235c66d34987088 latest_2014.10.06
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
Branch: refs/heads/master
Home: https://github.com/galaxyproject/usegalaxy-playbook
Commit: 7899e122b8cb0ab19b0de9e565822f9d6c1bd844
https://github.com/galaxyproject/usegalaxy-playbook/commit/7899e122b8cb0ab1…
Author: Nate Coraor <nate@bx.psu.edu>
Date: 2014-10-06 (Mon, 06 Oct 2014)
Changed paths:
M production/group_vars/all.yml
Log Message:
-----------
Update Main.
3 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/769e8dca49f0/
Changeset: 769e8dca49f0
Branch: next-stable
User: natefoo
Date: 2014-10-06 16:58:28+00:00
Summary: Close next-stable branch for release_2014.10.06
Affected #: 0 files
https://bitbucket.org/galaxy/galaxy-central/commits/2092948937ac/
Changeset: 2092948937ac
Branch: stable
User: natefoo
Date: 2014-10-06 16:58:53+00:00
Summary: Merge next-stable to stable for release_2014.10.06
Affected #: 951 files
diff -r 007f6a80629a74650b72d67821a9505932d284f3 -r 2092948937ac30ef82f71463a235c66d34987088 README.txt
--- a/README.txt
+++ b/README.txt
@@ -1,31 +1,34 @@
-GALAXY
-======
-http://galaxyproject.org/
-
-The latest information about Galaxy is always available via the Galaxy
-website above.
-
-HOW TO START
-============
-Galaxy requires Python 2.6 or 2.7. To check your python version, run:
-
-% python -V
-Python 2.7.3
-
-Start Galaxy:
-
-% sh run.sh
-
-Once Galaxy completes startup, you should be able to view Galaxy in your
-browser at:
-
-http://localhost:8080
-
-You may wish to make changes from the default configuration. This can be done
-in the universe_wsgi.ini file. Tools are configured in tool_conf.xml. Details
-on adding tools can be found on the Galaxy website (linked above).
-
-Not all dependencies are included for the tools provided in the sample
-tool_conf.xml. A full list of external dependencies is available at:
-
-https://wiki.galaxyproject.org/Admin/Tools/ToolDependencies
+GALAXY
+======
+http://galaxyproject.org/
+
+The latest information about Galaxy is always available via the Galaxy
+website above.
+
+HOW TO START
+============
+Galaxy requires Python 2.6 or 2.7. To check your python version, run:
+
+% python -V
+Python 2.7.3
+
+Start Galaxy:
+
+% sh run.sh
+
+Once Galaxy completes startup, you should be able to view Galaxy in your
+browser at:
+
+http://localhost:8080
+
+You may wish to make changes from the default configuration. This can be done
+in the config/galaxy.ini file. Tools can be either installed from the Tool Shed
+or added manually. For details please see the Galaxy wiki:
+
+https://wiki.galaxyproject.org/Admin/Tools/AddToolFromToolShedTutorial
+
+
+Not all dependencies are included for the tools provided in the sample
+tool_conf.xml. A full list of external dependencies is available at:
+
+https://wiki.galaxyproject.org/Admin/Tools/ToolDependencies
diff -r 007f6a80629a74650b72d67821a9505932d284f3 -r 2092948937ac30ef82f71463a235c66d34987088 buildbot_setup.sh
--- a/buildbot_setup.sh
+++ b/buildbot_setup.sh
@@ -42,25 +42,6 @@
/galaxy/software/tool-data/gatk
"
-SAMPLES="
-tool_conf.xml.sample
-datatypes_conf.xml.sample
-universe_wsgi.ini.sample
-tool_data_table_conf.xml.sample
-tool_sheds_conf.xml.sample
-shed_tool_data_table_conf.xml.sample
-migrated_tools_conf.xml.sample
-data_manager_conf.xml.sample
-shed_data_manager_conf.xml.sample
-tool-data/shared/ensembl/builds.txt.sample
-tool-data/shared/igv/igv_build_sites.txt.sample
-tool-data/shared/ncbi/builds.txt.sample
-tool-data/shared/rviewer/rviewer_build_sites.txt.sample
-tool-data/shared/ucsc/builds.txt.sample
-tool-data/shared/ucsc/publicbuilds.txt.sample
-tool-data/shared/ucsc/ucsc_build_sites.txt.sample
-"
-
DIRS="
database
database/files
@@ -108,14 +89,11 @@
;;
esac
-for sample in $SAMPLES; do
- file=${sample%.sample}
- echo "Copying $sample to $file"
- cp $sample $file
-done
+# set up configs from samples.
+./scripts/common_startup.sh
echo "Copying job_conf.xml.sample_basic to job_conf.xml"
-cp job_conf.xml.sample_basic job_conf.xml
+cp config/job_conf.xml.sample_basic config/job_conf.xml
for dir in $DIRS; do
if [ ! -d $dir ]; then
diff -r 007f6a80629a74650b72d67821a9505932d284f3 -r 2092948937ac30ef82f71463a235c66d34987088 client/GruntFile.js
--- /dev/null
+++ b/client/GruntFile.js
@@ -0,0 +1,88 @@
+module.exports = function(grunt) {
+
+ // Project configuration.
+ grunt.initConfig({
+ pkg: grunt.file.readJSON( 'package.json' ),
+
+ // default task
+ // use 'grunt copy' to copy the entire <galaxy>/client/galaxy/scripts dir into <galaxy>/static/scripts
+ copy: {
+ main: {
+ files: [
+ {
+ expand : true,
+ cwd : 'galaxy/scripts/',
+ src : '**',
+ dest : '../static/scripts'
+ }
+ ]
+ }
+ },
+
+ // use 'grunt pack' to call pack_scripts.py to pack all or selected files in static/scripts
+ exec: {
+ packScripts: {
+ cwd: '../static/scripts',
+ target: [],
+ cmd: function(){
+ var targets = grunt.config( 'exec.packScripts.target' );
+ // if nothing was passed in pack all scripts
+ if( !targets.length ){
+ return './pack_scripts.py';
+ }
+
+ grunt.log.write( 'packing: ' + targets + '\n' );
+ return targets.map( function( target ){
+ return './pack_scripts.py ' + target;
+ }).join( '; ' );
+ }
+ }
+ },
+
+ // use 'grunt watch' (from a new tab in your terminal) to have grunt re-copy changed files automatically
+ watch: {
+ // watch for changes in the src dir
+ files: [ 'galaxy/scripts/**' ],
+ tasks: [ 'copy', 'pack' ],
+ options: {
+ spawn: false
+ }
+ }
+ });
+
+ grunt.loadNpmTasks( 'grunt-contrib-watch' );
+ grunt.loadNpmTasks( 'grunt-contrib-copy');
+ grunt.loadNpmTasks('grunt-exec');
+
+ grunt.registerTask( 'pack', [ 'exec' ] );
+ grunt.registerTask( 'default', [ 'copy', 'pack' ] );
+
+ // -------------------------------------------------------------------------- copy,pack only those changed
+ // adapted from grunt-contrib-watch jslint example
+ //TODO: a bit hacky and there's prob. a better way
+ //NOTE: copy will fail silently if a file isn't found
+
+ // outer scope variable for the event handler and onChange fn - begin with empty hash
+ var changedFiles = Object.create(null);
+
+ // when files are changed, set the copy src and packScripts target to the filenames of the updated files
+ var onChange = grunt.util._.debounce(function() {
+ grunt.config( 'copy.main.files', [{
+ expand: true,
+ cwd: 'galaxy/scripts',
+ src: Object.keys( changedFiles ),
+ dest: '../static/scripts/'
+ }]);
+ grunt.config( 'exec.packScripts.target', Object.keys( changedFiles ) );
+ changedFiles = Object.create(null);
+ }, 200);
+
+ grunt.event.on('watch', function(action, filepath) {
+ // store each filepath in a Files obj, the debounced fn above will use it as an aggregate list for copying
+ // we need to take galaxy/scripts out of the filepath or it will be copied to the wrong loc
+ filepath = filepath.replace( /galaxy\/scripts\//, '' );
+ changedFiles[filepath] = action;
+ onChange();
+ });
+
+};
diff -r 007f6a80629a74650b72d67821a9505932d284f3 -r 2092948937ac30ef82f71463a235c66d34987088 client/README.txt
--- /dev/null
+++ b/client/README.txt
@@ -0,0 +1,42 @@
+Client Build System
+===================
+
+Builds and moves the client-side scripts necessary for running the Galaxy webapps. There's no need to use this system
+unless you are modifying or developing client-side scripts.
+
+You'll need Node and the Node Package Manager (npm): nodejs.org.
+
+Once npm is installed, install the grunt task manager and it's command line into your global scope:
+
+ npm install -g grunt grunt-cli
+
+Next, from within this directory, install the local build dependencies:
+
+ cd client
+ npm install
+
+You're now ready to re-build the client scripts after modifying them.
+
+
+Rebuilding
+==========
+
+There are two methods for rebuilding: a complete rebuild and automatic, partial rebuilds while you develop.
+
+A complete rebuild can be done with the following (from the `client` directory):
+
+ grunt
+
+This will copy any files in `client/galaxy/scripts` to `static/scripts` and run `static/scripts/pack_scripts.py` on all.
+
+Grunt can also do an automatic, partial rebuild of any files you change *as you develop* by:
+
+ 1. opening a new terminal session
+ 2. `cd client`
+ 3. `grunt watch`
+
+This starts a new grunt watch process that will monitor the files in `client/galaxy/scripts` for changes and copy and
+pack them when they change.
+
+You can stop the `grunt watch` task by pressing `Ctrl+C`. Note: you should also be able to background that task if you
+prefer.
diff -r 007f6a80629a74650b72d67821a9505932d284f3 -r 2092948937ac30ef82f71463a235c66d34987088 client/galaxy/scripts/base.js
--- /dev/null
+++ b/client/galaxy/scripts/base.js
@@ -0,0 +1,15 @@
+define( ["libs/backbone/backbone"], function( Backbone ) {
+
+ var Base = function() {
+ if( this.initialize ) {
+ this.initialize.apply(this, arguments);
+ }
+ };
+ Base.extend = Backbone.Model.extend;
+
+ return {
+ Base: Base,
+ Backbone: Backbone
+ };
+
+});
\ No newline at end of file
diff -r 007f6a80629a74650b72d67821a9505932d284f3 -r 2092948937ac30ef82f71463a235c66d34987088 client/galaxy/scripts/galaxy-app-base.js
--- /dev/null
+++ b/client/galaxy/scripts/galaxy-app-base.js
@@ -0,0 +1,160 @@
+define([
+ 'mvc/user/user-model',
+ 'utils/metrics-logger',
+ 'utils/add-logging',
+ 'utils/localization',
+ 'bootstrapped-data'
+], function( userModel, metricsLogger, addLogging, localize, bootstrapped ){
+// ============================================================================
+/** Base galaxy client-side application.
+ * Iniitializes:
+ * logger : the logger/metrics-logger
+ * localize : the string localizer
+ * config : the current configuration (any k/v in
+ * galaxy.ini available from the configuration API)
+ * user : the current user (as a mvc/user/user-model)
+ */
+function GalaxyApp( options ){
+ var self = this;
+ return self._init( options || {} );
+}
+
+// add logging shortcuts for this object
+addLogging( GalaxyApp, 'GalaxyApp' );
+
+/** default options */
+GalaxyApp.prototype.defaultOptions = {
+ /** monkey patch attributes from existing window.Galaxy object? */
+ patchExisting : true,
+ /** root url of this app */
+ // move to self.root?
+ root : '/'
+};
+
+/** initalize options and sub-components */
+GalaxyApp.prototype._init = function init( options ){
+ var self = this;
+ _.extend( self, Backbone.Events );
+
+ self._processOptions( options );
+ self.debug( 'GalaxyApp.options: ', self.options );
+
+ self._patchGalaxy( window.Galaxy );
+
+ self._initLogger( options.loggerOptions || {} );
+ self.debug( 'GalaxyApp.logger: ', self.logger );
+
+ self._initLocale();
+ self.debug( 'GalaxyApp.localize: ', self.localize );
+
+ self.config = options.config || bootstrapped.config || {};
+ self.debug( 'GalaxyApp.config: ', self.config );
+
+ self._initUser( options.user || bootstrapped.user || {} );
+ self.debug( 'GalaxyApp.user: ', self.user );
+
+ //TODO: temp
+ self.trigger( 'ready', self );
+ //if( typeof options.onload === 'function' ){
+ // options.onload();
+ //}
+
+ self._setUpListeners();
+
+ return self;
+};
+
+/** add an option from options if the key matches an option in defaultOptions */
+GalaxyApp.prototype._processOptions = function _processOptions( options ){
+ var self = this,
+ defaults = self.defaultOptions;
+ self.debug( '_processOptions: ', options );
+
+ self.options = {};
+ for( var k in defaults ){
+ if( defaults.hasOwnProperty( k ) ){
+ self.options[ k ] = ( options.hasOwnProperty( k ) )?( options[ k ] ):( defaults[ k ] );
+ }
+ }
+ return self;
+};
+
+/** add an option from options if the key matches an option in defaultOptions */
+GalaxyApp.prototype._patchGalaxy = function _processOptions( patchWith ){
+ var self = this;
+ // in case req or plain script tag order has created a prev. version of the Galaxy obj...
+ if( self.options.patchExisting && patchWith ){
+ self.debug( 'found existing Galaxy object:', patchWith );
+ // ...(for now) monkey patch any added attributes that the previous Galaxy may have had
+ //TODO: move those attributes to more formal assignment in GalaxyApp
+ for( var k in patchWith ){
+ if( patchWith.hasOwnProperty( k ) ){
+ self.debug( '\t patching in ' + k + ' to Galaxy' );
+ self[ k ] = patchWith[ k ];
+ }
+ }
+ }
+};
+
+/** set up the metrics logger (utils/metrics-logger) and pass loggerOptions */
+GalaxyApp.prototype._initLogger = function _initLogger( loggerOptions ){
+ var self = this;
+ self.debug( '_initLogger:', loggerOptions );
+ self.logger = new metricsLogger.MetricsLogger( loggerOptions );
+ return self;
+};
+
+/** add the localize fn to this object and the window namespace (as '_l') */
+GalaxyApp.prototype._initLocale = function _initLocale( options ){
+ var self = this;
+ self.debug( '_initLocale:', options );
+ self.localize = localize;
+ // add to window as global shortened alias
+ window._l = self.localize;
+ return self;
+};
+
+/** set up the current user as a Backbone model (mvc/user/user-model) */
+GalaxyApp.prototype._initUser = function _initUser( userJSON ){
+ var self = this;
+ self.debug( '_initUser:', userJSON );
+ self.user = new userModel.User( userJSON );
+ //TODO: temp - old alias
+ self.currUser = self.user;
+ return self;
+};
+
+/** Set up DOM/jQuery/Backbone event listeners enabled for all pages */
+GalaxyApp.prototype._setUpListeners = function _setUpListeners(){
+ var self = this;
+
+ // hook to jq beforeSend to record the most recent ajax call and cache some data about it
+ /** cached info about the last ajax call made through jQuery */
+ self.lastAjax = {};
+ $( document ).bind( 'ajaxSend', function( ev, xhr, options ){
+ var data = options.data;
+ try {
+ data = JSON.parse( data );
+ } catch( err ){}
+
+ self.lastAjax = {
+ url : location.href.slice( 0, -1 ) + options.url,
+ data : data
+ };
+ //TODO:?? we might somehow manage to *retry* ajax using either this hook or Backbone.sync
+ });
+
+};
+
+/** string rep */
+GalaxyApp.prototype.toString = function toString(){
+ var userEmail = this.user.get( 'email' ) || '(anonymous)';
+ return 'GalaxyApp(' + userEmail + ')';
+};
+
+
+// ============================================================================
+ return {
+ GalaxyApp : GalaxyApp
+ };
+});
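For orientation, a minimal usage sketch of the GalaxyApp object defined above (illustrative only, not part of the changeset): the AMD module id 'galaxy-app-base' and all option values are assumptions, and jQuery/Backbone/underscore are assumed to be loaded as elsewhere in the client.

require([ 'galaxy-app-base' ], function( appBase ){
    var app = new appBase.GalaxyApp({
        // only keys that exist in defaultOptions survive _processOptions
        root          : '/galaxy/',
        patchExisting : false,
        // config and user are read directly from the options passed to _init
        config        : { debug: true },
        user          : { email: 'user@example.org' },
        loggerOptions : {}
    });
    console.log( app.toString() );   // -> "GalaxyApp(user@example.org)"
});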
diff -r 007f6a80629a74650b72d67821a9505932d284f3 -r 2092948937ac30ef82f71463a235c66d34987088 client/galaxy/scripts/galaxy.autocom_tagging.js
--- /dev/null
+++ b/client/galaxy/scripts/galaxy.autocom_tagging.js
@@ -0,0 +1,368 @@
+/**
+* JQuery extension for tagging with autocomplete.
+* @author: Jeremy Goecks
+* @require: jquery.autocomplete plugin
+*/
+//
+// Initialize "tag click functions" for tags.
+//
+function init_tag_click_function(tag_elt, click_func) {
+ $(tag_elt).find('.tag-name').each( function() {
+ $(this).click( function() {
+ var tag_str = $(this).text();
+ var tag_name_and_value = tag_str.split(":");
+ click_func(tag_name_and_value[0], tag_name_and_value[1]);
+ return true;
+ });
+ });
+}
+
+jQuery.fn.autocomplete_tagging = function(options) {
+
+ var defaults = {
+ get_toggle_link_text_fn: function(tags) {
+ var text = "";
+ var num_tags = obj_length(tags);
+ if (num_tags > 0) {
+ text = num_tags + (num_tags > 1 ? " Tags" : " Tag");
+ } else {
+ text = "Add tags";
+ }
+ return text;
+ },
+ tag_click_fn : function (name, value) {},
+ editable: true,
+ input_size: 20,
+ in_form: false,
+ tags : {},
+ use_toggle_link: true,
+ item_id: "",
+ add_tag_img: "",
+ add_tag_img_rollover: "",
+ delete_tag_img: "",
+ ajax_autocomplete_tag_url: "",
+ ajax_retag_url: "",
+ ajax_delete_tag_url: "",
+ ajax_add_tag_url: ""
+ };
+
+ var settings = jQuery.extend(defaults, options);
+
+ //
+    // Initialize object's elements.
+ //
+
+ // Get elements for this object. For this_obj, assume the last element with the id is the "this"; this is somewhat of a hack to address the problem
+ // that there may be two tagging elements for a single item if there are both community and individual tags for an element.
+ var this_obj = $(this);
+ var tag_area = this_obj.find('.tag-area');
+ var toggle_link = this_obj.find('.toggle-link');
+ var tag_input_field = this_obj.find('.tag-input');
+ var add_tag_button = this_obj.find('.add-tag-button');
+
+ // Initialize toggle link.
+ toggle_link.click( function() {
+ // Take special actions depending on whether toggle is showing or hiding link.
+ var after_toggle_fn;
+ if (tag_area.is(":hidden")) {
+ after_toggle_fn = function() {
+ // If there are no tags, go right to editing mode by generating a click on the area.
+ var num_tags = $(this).find('.tag-button').length;
+ if (num_tags === 0) {
+ tag_area.click();
+ }
+ };
+ } else {
+ after_toggle_fn = function() {
+ tag_area.blur();
+ };
+ }
+ tag_area.slideToggle("fast", after_toggle_fn);
+ return $(this);
+ });
+
+ // Initialize tag input field.
+ if (settings.editable) {
+ tag_input_field.hide();
+ }
+ tag_input_field.keyup( function(e) {
+ if ( e.keyCode === 27 ) {
+ // Escape key
+ $(this).trigger( "blur" );
+ } else if (
+ ( e.keyCode === 13 ) || // Return Key
+ ( e.keyCode === 188 ) || // Comma
+ ( e.keyCode === 32 ) // Space
+ ) {
+ //
+ // Check input.
+ //
+
+ var new_value = this.value;
+
+ // Suppress space after a ":"
+ if ( new_value.indexOf(": ", new_value.length - 2) !== -1) {
+ this.value = new_value.substring(0, new_value.length-1);
+ return false;
+ }
+
+ // Remove trigger keys from input.
+ if ( (e.keyCode === 188) || (e.keyCode === 32) ) {
+ new_value = new_value.substring( 0 , new_value.length - 1 );
+ }
+
+ // Trim whitespace.
+ new_value = $.trim(new_value);
+
+ // Too short?
+ if (new_value.length < 2) {
+ return false;
+ }
+
+ //
+ // New tag OK - apply it.
+ //
+
+ this.value = ""; // Reset text field now that tag is being added
+
+ // Add button for tag after all other tag buttons.
+ var new_tag_button = build_tag_button(new_value);
+ var tag_buttons = tag_area.children(".tag-button");
+ if (tag_buttons.length !== 0) {
+ var last_tag_button = tag_buttons.slice(tag_buttons.length-1);
+ last_tag_button.after(new_tag_button);
+ } else {
+ tag_area.prepend(new_tag_button);
+ }
+
+ // Add tag to internal list.
+ var tag_name_and_value = new_value.split(":");
+ settings.tags[tag_name_and_value[0]] = tag_name_and_value[1];
+
+ // Update toggle link text.
+ var new_text = settings.get_toggle_link_text_fn(settings.tags);
+ toggle_link.text(new_text);
+
+ // Commit tag to server.
+ var zz = $(this);
+ $.ajax({
+ url: settings.ajax_add_tag_url,
+ data: { new_tag: new_value },
+ error: function() {
+ // Failed. Roll back changes and show alert.
+ new_tag_button.remove();
+ delete settings.tags[tag_name_and_value[0]];
+ var new_text = settings.get_toggle_link_text_fn(settings.tags);
+ toggle_link.text(new_text);
+ alert( "Add tag failed" );
+ },
+ success: function() {
+                    // Flush autocomplete cache because it's now out of date.
+ // TODO: in the future, we could remove the particular item
+ // that was chosen from the cache rather than flush it.
+ zz.data('autocompleter').cacheFlush();
+ }
+ });
+
+ return false;
+ }
+ });
+
+ // Add autocomplete to input.
+ var format_item_func = function(key, row_position, num_rows, value, search_term) {
+ var tag_name_and_value = value.split(":");
+ return (tag_name_and_value.length === 1 ? tag_name_and_value[0] : tag_name_and_value[1]);
+ };
+ var autocomplete_options = { selectFirst: false, formatItem: format_item_func,
+ autoFill: false, highlight: false };
+ tag_input_field.autocomplete(settings.ajax_autocomplete_tag_url, autocomplete_options);
+
+
+ // Initialize delete tag images for current tags.
+ this_obj.find('.delete-tag-img').each(function() {
+ init_delete_tag_image( $(this) );
+ });
+
+
+ // Initialize tag click function.
+ init_tag_click_function($(this), settings.tag_click_fn);
+
+ // Initialize "add tag" button.
+ add_tag_button.click( function() {
+ $(this).hide();
+
+ // Clicking on button is the same as clicking on the tag area.
+ tag_area.click();
+ return false;
+ });
+
+ //
+ // Set up tag area interactions; these are needed only if tags are editable.
+ //
+ if (settings.editable) {
+ // When the tag area blurs, go to "view tag" mode.
+ tag_area.bind("blur", function(e) {
+ if (obj_length(settings.tags) > 0) {
+ add_tag_button.show();
+ tag_input_field.hide();
+ tag_area.removeClass("active-tag-area");
+ // tag_area.addClass("tooltip");
+ } else {
+ // No tags, so do nothing to ensure that input is still visible.
+ }
+ });
+
+ // On click, enable user to add tags.
+ tag_area.click( function(e) {
+ var is_active = $(this).hasClass("active-tag-area");
+
+ // If a "delete image" object was pressed and area is inactive, do nothing.
+ if ($(e.target).hasClass("delete-tag-img") && !is_active) {
+ return false;
+ }
+
+ // If a "tag name" object was pressed and area is inactive, do nothing.
+ if ($(e.target).hasClass("tag-name") && !is_active) {
+ return false;
+ }
+
+ // Remove tooltip.
+ // $(this).removeClass("tooltip");
+
+ // Hide add tag button, show tag_input field. Change background to show
+ // area is active.
+ $(this).addClass("active-tag-area");
+ add_tag_button.hide();
+ tag_input_field.show();
+ tag_input_field.focus();
+
+ // Add handler to document that will call blur when the tag area is blurred;
+ // a tag area is blurred when a user clicks on an element outside the area.
+ var handle_document_click = function(e) {
+ var check_click = function(tag_area, target) {
+ var tag_area_id = tag_area.attr("id");
+ // Blur the tag area if the element clicked on is not in the tag area.
+ if (target !== tag_area) {
+ tag_area.blur();
+ $(window).unbind("click.tagging_blur");
+ $(this).addClass("tooltip");
+ }
+ };
+ check_click(tag_area, $(e.target));
+ };
+ // TODO: we should attach the click handler to all frames in order to capture
+ // clicks outside the frame that this element is in.
+ //window.parent.document.onclick = handle_document_click;
+ //var temp = $(window.parent.document.body).contents().find("iframe").html();
+ //alert(temp);
+ //$(document).parent().click(handle_document_click);
+ $(window).bind("click.tagging_blur", handle_document_click);
+
+ return false;
+ });
+ }
+
+ // If using toggle link, hide the tag area. Otherwise, show the tag area.
+ if (settings.use_toggle_link) {
+ tag_area.hide();
+ }
+
+ //
+ // Helper functions.
+ //
+
+ //
+ // Collapse tag name + value into a single string.
+ //
+ function build_tag_str(tag_name, tag_value) {
+ return tag_name + ( tag_value ? ":" + tag_value : "");
+ }
+
+
+ // Initialize a "delete tag image": when click, delete tag from UI and send delete request to server.
+ function init_delete_tag_image(delete_img) {
+ $(delete_img).mouseenter( function () {
+ $(this).attr("src", settings.delete_tag_img_rollover);
+ });
+ $(delete_img).mouseleave( function () {
+ $(this).attr("src", settings.delete_tag_img);
+ });
+ $(delete_img).click( function () {
+ // Tag button is image's parent.
+ var tag_button = $(this).parent();
+
+ // Get tag name, value.
+ var tag_name_elt = tag_button.find(".tag-name").eq(0);
+ var tag_str = tag_name_elt.text();
+ var tag_name_and_value = tag_str.split(":");
+ var tag_name = tag_name_and_value[0];
+ var tag_value = tag_name_and_value[1];
+
+ var prev_button = tag_button.prev();
+ tag_button.remove();
+
+ // Remove tag from local list for consistency.
+ delete settings.tags[tag_name];
+
+ // Update toggle link text.
+ var new_text = settings.get_toggle_link_text_fn(settings.tags);
+ toggle_link.text(new_text);
+
+ // Delete tag.
+ $.ajax({
+ url: settings.ajax_delete_tag_url,
+ data: { tag_name: tag_name },
+ error: function() {
+ // Failed. Roll back changes and show alert.
+ settings.tags[tag_name] = tag_value;
+ if (prev_button.hasClass("tag-button")) {
+ prev_button.after(tag_button);
+ } else {
+ tag_area.prepend(tag_button);
+ }
+ alert( "Remove tag failed" );
+
+ toggle_link.text(settings.get_toggle_link_text_fn(settings.tags));
+
+ // TODO: no idea why it's necessary to set this up again.
+ delete_img.mouseenter( function () {
+ $(this).attr("src", settings.delete_tag_img_rollover);
+ });
+ delete_img.mouseleave( function () {
+ $(this).attr("src", settings.delete_tag_img);
+ });
+ },
+ success: function() {}
+ });
+
+ return true;
+ });
+ }
+
+ //
+ // Function that builds a tag button.
+ //
+ function build_tag_button(tag_str) {
+ // Build "delete tag" image.
+ var delete_img = $("<img/>").attr("src", settings.delete_tag_img).addClass("delete-tag-img");
+ init_delete_tag_image(delete_img);
+
+ // Build tag button.
+ var tag_name_elt = $("<span>").text(tag_str).addClass("tag-name");
+ tag_name_elt.click( function() {
+ var tag_name_and_value = tag_str.split(":");
+ settings.tag_click_fn(tag_name_and_value[0], tag_name_and_value[1]);
+ return true;
+ });
+
+ var tag_button = $("<span></span>").addClass("tag-button");
+ tag_button.append(tag_name_elt);
+ // Allow delete only if element is editable.
+ if (settings.editable) {
+ tag_button.append(delete_img);
+ }
+
+ return tag_button;
+ }
+
+};
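A usage sketch for the plugin above (illustrative only, not part of the changeset). It assumes the target element already contains the markup the plugin queries for (.tag-area, .toggle-link, .tag-input, .add-tag-button); the element id, image paths and AJAX URLs are placeholders.

$( '#history-tags' ).autocomplete_tagging({
    item_id : 'abc123',                                 // placeholder encoded item id
    tags    : { 'rna-seq' : null, 'lab' : 'smith' },    // existing tags as name: value pairs
    ajax_autocomplete_tag_url : '/tag/tag_autocomplete_data',
    ajax_add_tag_url          : '/tag/add_tag_async',
    ajax_delete_tag_url       : '/tag/remove_tag_async',
    delete_tag_img            : '/static/images/delete_tag_icon_gray.png',
    delete_tag_img_rollover   : '/static/images/delete_tag_icon_white.png',
    tag_click_fn : function( name, value ){
        console.log( 'tag clicked:', name, value );
    }
});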
diff -r 007f6a80629a74650b72d67821a9505932d284f3 -r 2092948937ac30ef82f71463a235c66d34987088 client/galaxy/scripts/galaxy.base.js
--- /dev/null
+++ b/client/galaxy/scripts/galaxy.base.js
@@ -0,0 +1,658 @@
+// requestAnimationFrame polyfill
+(function() {
+ var lastTime = 0;
+ var vendors = ['ms', 'moz', 'webkit', 'o'];
+ for(var x = 0; x < vendors.length && !window.requestAnimationFrame; ++x) {
+ window.requestAnimationFrame = window[vendors[x]+'RequestAnimationFrame'];
+ window.cancelRequestAnimationFrame = window[vendors[x]+
+ 'CancelRequestAnimationFrame'];
+ }
+
+ if (!window.requestAnimationFrame)
+ window.requestAnimationFrame = function(callback, element) {
+ var currTime = new Date().getTime();
+ var timeToCall = Math.max(0, 16 - (currTime - lastTime));
+ var id = window.setTimeout(function() { callback(currTime + timeToCall); },
+ timeToCall);
+ lastTime = currTime + timeToCall;
+ return id;
+ };
+
+ if (!window.cancelAnimationFrame)
+ window.cancelAnimationFrame = function(id) {
+ clearTimeout(id);
+ };
+}());
+
+// IE doesn't implement Array.indexOf
+if (!Array.indexOf) {
+ Array.prototype.indexOf = function(obj) {
+ for (var i = 0, len = this.length; i < len; i++) {
+ if (this[i] == obj) {
+ return i;
+ }
+ }
+ return -1;
+ };
+}
+
+// Returns the number of keys (elements) in an array/dictionary.
+function obj_length(obj) {
+ if (obj.length !== undefined) {
+ return obj.length;
+ }
+
+ var count = 0;
+ for (var element in obj) {
+ count++;
+ }
+ return count;
+}
+
+$.fn.makeAbsolute = function(rebase) {
+ return this.each(function() {
+ var el = $(this);
+ var pos = el.position();
+ el.css({
+ position: "absolute",
+ marginLeft: 0, marginTop: 0,
+ top: pos.top, left: pos.left,
+ right: $(window).width() - ( pos.left + el.width() )
+ });
+ if (rebase) {
+ el.remove().appendTo("body");
+ }
+ });
+};
+
+/**
+ * Sets up popupmenu rendering and binds options functions to the appropriate links.
+ * initial_options is a dict with text describing the option pointing to either (a) a
+ * function to perform; or (b) another dict with two required keys, 'url' and 'action' (the
+ * function to perform). (b) is useful for exposing the underlying URL of the option.
+ */
+function make_popupmenu(button_element, initial_options) {
+ /* Use the $.data feature to store options with the link element.
+ This allows options to be changed at a later time
+ */
+ var element_menu_exists = (button_element.data("menu_options"));
+ button_element.data("menu_options", initial_options);
+
+ // If element already has menu, nothing else to do since HTML and actions are already set.
+ if (element_menu_exists) { return; }
+
+ button_element.bind("click.show_popup", function(e) {
+ // Close existing visible menus
+ $(".popmenu-wrapper").remove();
+
+ // Need setTimeouts so clicks don't interfere with each other
+ setTimeout( function() {
+ // Dynamically generate the wrapper holding all the selectable options of the menu.
+ var menu_element = $( "<ul class='dropdown-menu' id='" + button_element.attr('id') + "-menu'></ul>" );
+ var options = button_element.data("menu_options");
+ if (obj_length(options) <= 0) {
+ $("<li>No Options.</li>").appendTo(menu_element);
+ }
+ $.each( options, function( k, v ) {
+ if (v) {
+                // Action can be either an anonymous function or a mapped dict.
+ var action = v.action || v;
+ menu_element.append( $("<li></li>").append( $("<a>").attr("href", v.url).html(k).click(action) ) );
+ } else {
+ menu_element.append( $("<li></li>").addClass( "head" ).append( $("<a href='#'></a>").html(k) ) );
+ }
+ });
+ var wrapper = $( "<div class='popmenu-wrapper' style='position: absolute;left: 0; top: -1000;'></div>" )
+ .append( menu_element ).appendTo( "body" );
+
+ var x = e.pageX - wrapper.width() / 2 ;
+ x = Math.min( x, $(document).scrollLeft() + $(window).width() - $(wrapper).width() - 5 );
+ x = Math.max( x, $(document).scrollLeft() + 5 );
+
+ wrapper.css({
+ top: e.pageY,
+ left: x
+ });
+ }, 10);
+
+ setTimeout( function() {
+ // Bind click event to current window and all frames to remove any visible menus
+ // Bind to document object instead of window object for IE compat
+ var close_popup = function(el) {
+ $(el).bind("click.close_popup", function() {
+ $(".popmenu-wrapper").remove();
+ el.unbind("click.close_popup");
+ });
+ };
+ close_popup( $(window.document) ); // Current frame
+ close_popup( $(window.top.document) ); // Parent frame
+ for (var frame_id = window.top.frames.length; frame_id--;) { // Sibling frames
+ var frame = $(window.top.frames[frame_id].document);
+ close_popup(frame);
+ }
+ }, 50);
+
+ return false;
+ });
+
+}
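As a usage sketch (illustrative only, not part of the changeset), the options dict described above can mix plain functions, url/action entries, and null section headers; the element id and URLs are placeholders.

make_popupmenu( $( '#options-button' ), {
    'Dataset actions'    : null,                        // falsy value -> non-clickable header
    'Rename'             : function(){ console.log( 'rename selected' ); },
    'View in new window' : {
        url    : '/datasets/abc123/display',            // exposed as the anchor's href
        action : function(){ window.open( '/datasets/abc123/display' ); }
    }
});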
+
+/**
+ * Convert two separate (often adjacent) divs into a galaxy popupmenu
+ * - div 1 contains a number of anchors which become the menu options
+ * - div 1 should have a 'popupmenu' attribute
+ * - this popupmenu attribute contains the id of div 2
+ * - div 2 becomes the 'face' of the popupmenu
+ *
+ * NOTE: make_popup_menus finds and operates on all divs with a popupmenu attr (no need to point it at something)
+ * but (since that selector searches the dom on the page), you can send a parent in
+ * NOTE: make_popup_menus, and make_popupmenu are horrible names
+ */
+function make_popup_menus( parent ) {
+ // find all popupmenu menu divs (divs that contains anchors to be converted to menu options)
+ // either in the parent or the document if no parent passed
+ parent = parent || document;
+ $( parent ).find( "div[popupmenu]" ).each( function() {
+ var options = {};
+ var menu = $(this);
+
+ // find each anchor in the menu, convert them into an options map: { a.text : click_function }
+ menu.find( "a" ).each( function() {
+ var link = $(this),
+ link_dom = link.get(0),
+ confirmtext = link_dom.getAttribute( "confirm" ),
+ href = link_dom.getAttribute( "href" ),
+ target = link_dom.getAttribute( "target" );
+
+ // no href - no function (gen. a label)
+ if (!href) {
+ options[ link.text() ] = null;
+
+ } else {
+ options[ link.text() ] = {
+ url: href,
+ action: function() {
+
+                        // if there's confirm text, show the confirm dialog
+ if ( !confirmtext || confirm( confirmtext ) ) {
+ // link.click() doesn't use target for some reason,
+ // so manually do it here.
+ if (target) {
+ window.open(href, target);
+ return false;
+ }
+ // For all other links, do the default action.
+ else {
+ link.click();
+ }
+ }
+ }
+ };
+ }
+ });
+ // locate the element with the id corresponding to the menu's popupmenu attr
+ var box = $( parent ).find( "#" + menu.attr( 'popupmenu' ) );
+
+ // For menus with clickable link text, make clicking on the link go through instead
+ // of activating the popup menu
+ box.find("a").bind("click", function(e) {
+ e.stopPropagation(); // Stop bubbling so clicking on the link goes through
+ return true;
+ });
+
+ // attach the click events and menu box building to the box element
+ make_popupmenu(box, options);
+ box.addClass("popup");
+ menu.remove();
+ });
+}
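A sketch of the two-div convention described above (illustrative only, not part of the changeset; ids, hrefs and labels are placeholders).

// Given markup like:
//   <div popupmenu="dataset-123-popup">
//       <a href="/datasets/123/edit">Edit attributes</a>
//       <a href="/datasets/123/delete" confirm="Really delete this dataset?">Delete</a>
//   </div>
//   <div id="dataset-123-popup">Dataset 123 options</div>
// this call turns the second div into the clickable face of the menu:
make_popup_menus( $( '#dataset-123-container' ) );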
+
+// Alphanumeric/natural sort fn
+function naturalSort(a, b) {
+    // set up temp-scope variables for comparison evaluation
+ var re = /(-?[0-9\.]+)/g,
+ x = a.toString().toLowerCase() || '',
+ y = b.toString().toLowerCase() || '',
+ nC = String.fromCharCode(0),
+ xN = x.replace( re, nC + '$1' + nC ).split(nC),
+ yN = y.replace( re, nC + '$1' + nC ).split(nC),
+ xD = (new Date(x)).getTime(),
+ yD = xD ? (new Date(y)).getTime() : null;
+ // natural sorting of dates
+ if ( yD ) {
+ if ( xD < yD ) { return -1; }
+ else if ( xD > yD ) { return 1; }
+ }
+ // natural sorting through split numeric strings and default strings
+ var oFxNcL, oFyNcL;
+ for ( var cLoc = 0, numS = Math.max(xN.length, yN.length); cLoc < numS; cLoc++ ) {
+ oFxNcL = parseFloat(xN[cLoc]) || xN[cLoc];
+ oFyNcL = parseFloat(yN[cLoc]) || yN[cLoc];
+ if (oFxNcL < oFyNcL) { return -1; }
+ else if (oFxNcL > oFyNcL) { return 1; }
+ }
+ return 0;
+}
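For example (illustrative only, not part of the changeset), naturalSort is meant to be used as an Array.prototype.sort comparator:

[ 'chr10', 'chr2', 'chr1' ].sort( naturalSort );   // -> [ 'chr1', 'chr2', 'chr10' ]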
+
+$.fn.refresh_select2 = function() {
+ var select_elt = $(this);
+ var options = { placeholder:'Click to select',
+ closeOnSelect: !select_elt.is("[MULTIPLE]"),
+ dropdownAutoWidth : true,
+ containerCssClass: 'select2-minwidth'
+ };
+ return select_elt.select2( options );
+}
+
+// Replace select box with a text input box + autocomplete.
+function replace_big_select_inputs(min_length, max_length, select_elts) {
+    // To do the replacement, the select2 plugin must be loaded.
+
+ if (!jQuery.fn.select2) {
+ return;
+ }
+
+ // Set default for min_length and max_length
+ if (min_length === undefined) {
+ min_length = 20;
+ }
+ if (max_length === undefined) {
+ max_length = 3000;
+ }
+
+ select_elts = select_elts || $('select');
+
+ select_elts.each( function() {
+ var select_elt = $(this).not('[multiple]');
+        // Make sure that the number of options is within range.
+ var num_options = select_elt.find('option').length;
+ if ( (num_options < min_length) || (num_options > max_length) ) {
+ return;
+ }
+
+ if (select_elt.hasClass("no-autocomplete")) {
+ return;
+ }
+
+ /* Replaced jQuery.autocomplete with select2, notes:
+ * - multiple selects are supported
+ * - the original element is updated with the value, convert_to_values should not be needed
+ * - events are fired when updating the original element, so refresh_on_change should just work
+ *
+ * - should we still sort dbkey fields here?
+ */
+ select_elt.refresh_select2();
+ });
+}
+
+/**
+ * Make an element with text editable: (a) when user clicks on text, a textbox/area
+ * is provided for editing; (b) when enter key pressed, element's text is set and on_finish
+ * is called.
+ */
+// TODO: use this function to implement async_save_text (implemented below).
+$.fn.make_text_editable = function(config_dict) {
+ // Get config options.
+ var num_cols = ("num_cols" in config_dict ? config_dict.num_cols : 30),
+ num_rows = ("num_rows" in config_dict ? config_dict.num_rows : 4),
+ use_textarea = ("use_textarea" in config_dict ? config_dict.use_textarea : false),
+ on_finish = ("on_finish" in config_dict ? config_dict.on_finish : null),
+ help_text = ("help_text" in config_dict ? config_dict.help_text : null);
+
+ // Add element behavior.
+ var container = $(this);
+ container.addClass("editable-text").click(function(e) {
+ // If there's already an input element, editing is active, so do nothing.
+ if ($(this).children(":input").length > 0) {
+ return;
+ }
+
+ container.removeClass("editable-text");
+
+ // Handler for setting element text.
+ var set_text = function(new_text) {
+ container.find(":input").remove();
+
+ if (new_text !== "") {
+ container.text(new_text);
+ }
+ else {
+ // No text; need a line so that there is a click target.
+ container.html("<br>");
+ }
+ container.addClass("editable-text");
+
+ if (on_finish) {
+ on_finish(new_text);
+ }
+ };
+
+ // Create input element(s) for editing.
+ var cur_text = ("cur_text" in config_dict ? config_dict.cur_text : container.text() ),
+ input_elt, button_elt;
+
+ if (use_textarea) {
+ input_elt = $("<textarea/>")
+ .attr({ rows: num_rows, cols: num_cols }).text($.trim(cur_text))
+ .keyup(function(e) {
+ if (e.keyCode === 27) {
+ // Escape key.
+ set_text(cur_text);
+ }
+ });
+ button_elt = $("<button/>").text("Done").click(function() {
+ set_text(input_elt.val());
+                // Return false so that click does not propagate to container.
+ return false;
+ });
+ }
+ else {
+ input_elt = $("<input type='text'/>").attr({ value: $.trim(cur_text), size: num_cols })
+ .blur(function() {
+ set_text(cur_text);
+ }).keyup(function(e) {
+ if (e.keyCode === 27) {
+ // Escape key.
+ $(this).trigger("blur");
+ } else if (e.keyCode === 13) {
+ // Enter key.
+ set_text($(this).val());
+ }
+
+                    // Do not propagate event to avoid unwanted side effects.
+ e.stopPropagation();
+ });
+ }
+
+ // Replace text with input object(s) and focus & select.
+ container.text("");
+ container.append(input_elt);
+ if (button_elt) {
+ container.append(button_elt);
+ }
+ input_elt.focus();
+ input_elt.select();
+
+        // Do not propagate to elements below b/c that blurs input and prevents it from being used.
+ e.stopPropagation();
+ });
+
+    // Add help text if there is some.
+ if (help_text) {
+ container.attr("title", help_text).tooltip();
+ }
+
+ return container;
+};
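A usage sketch for the plugin above (illustrative only, not part of the changeset; the element id and the save logic in on_finish are placeholders).

$( '#annotation' ).make_text_editable({
    use_textarea : true,
    num_rows     : 3,
    num_cols     : 50,
    help_text    : 'Click to edit the annotation',
    on_finish    : function( new_text ){
        console.log( 'persist this text to the server:', new_text );
    }
});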
+
+/**
+ * Edit and save text asynchronously.
+ */
+function async_save_text( click_to_edit_elt, text_elt_id, save_url,
+ text_parm_name, num_cols, use_textarea, num_rows, on_start, on_finish ) {
+ // Set defaults if necessary.
+ if (num_cols === undefined) {
+ num_cols = 30;
+ }
+ if (num_rows === undefined) {
+ num_rows = 4;
+ }
+
+ // Set up input element.
+ $("#" + click_to_edit_elt).click(function() {
+ // Check if this is already active
+ if ( $("#renaming-active").length > 0) {
+ return;
+ }
+ var text_elt = $("#" + text_elt_id),
+ old_text = text_elt.text(),
+ t;
+
+ if (use_textarea) {
+ t = $("<textarea></textarea>").attr({ rows: num_rows, cols: num_cols }).text( $.trim(old_text) );
+ } else {
+ t = $("<input type='text'></input>").attr({ value: $.trim(old_text), size: num_cols });
+ }
+ t.attr("id", "renaming-active");
+ t.blur( function() {
+ $(this).remove();
+ text_elt.show();
+ if (on_finish) {
+ on_finish(t);
+ }
+ });
+ t.keyup( function( e ) {
+ if ( e.keyCode === 27 ) {
+ // Escape key
+ $(this).trigger( "blur" );
+ } else if ( e.keyCode === 13 ) {
+ // Enter key submits
+ var ajax_data = {};
+ ajax_data[text_parm_name] = $(this).val();
+ $(this).trigger( "blur" );
+ $.ajax({
+ url: save_url,
+ data: ajax_data,
+ error: function() {
+ alert( "Text editing for elt " + text_elt_id + " failed" );
+ // TODO: call finish or no? For now, let's not because error occurred.
+ },
+ success: function(processed_text) {
+ // Set new text and call finish method.
+ if (processed_text !== "") {
+ text_elt.text(processed_text);
+ } else {
+ text_elt.html("<em>None</em>");
+ }
+ if (on_finish) {
+ on_finish(t);
+ }
+ }
+ });
+ }
+ });
+
+ if (on_start) {
+ on_start(t);
+ }
+ // Replace text with input object and focus & select.
+ text_elt.hide();
+ t.insertAfter(text_elt);
+ t.focus();
+ t.select();
+
+ return;
+ });
+}
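A usage sketch (illustrative only, not part of the changeset; element ids, URL and parameter name are placeholders): clicking #rename-link swaps #history-name for a text input, and pressing enter posts the value as new_name to the URL, writing the server's response back into #history-name.

async_save_text( 'rename-link', 'history-name', '/history/rename/async', 'new_name', 40 );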
+
+function commatize( number ) {
+ number += ''; // Convert to string
+ var rgx = /(\d+)(\d{3})/;
+ while (rgx.test(number)) {
+ number = number.replace(rgx, '$1' + ',' + '$2');
+ }
+ return number;
+}
+
+// Reset tool search to start state.
+function reset_tool_search( initValue ) {
+ // Function may be called in top frame or in tool_menu_frame;
+ // in either case, get the tool menu frame.
+ var tool_menu_frame = $("#galaxy_tools").contents();
+ if (tool_menu_frame.length === 0) {
+ tool_menu_frame = $(document);
+ }
+
+ // Remove classes that indicate searching is active.
+ $(this).removeClass("search_active");
+ tool_menu_frame.find(".toolTitle").removeClass("search_match");
+
+ // Reset visibility of tools and labels.
+ tool_menu_frame.find(".toolSectionBody").hide();
+ tool_menu_frame.find(".toolTitle").show();
+ tool_menu_frame.find(".toolPanelLabel").show();
+ tool_menu_frame.find(".toolSectionWrapper").each( function() {
+ if ($(this).attr('id') !== 'recently_used_wrapper') {
+ // Default action.
+ $(this).show();
+ } else if ($(this).hasClass("user_pref_visible")) {
+ $(this).show();
+ }
+ });
+ tool_menu_frame.find("#search-no-results").hide();
+
+ // Reset search input.
+ tool_menu_frame.find("#search-spinner").hide();
+ if (initValue) {
+ var search_input = tool_menu_frame.find("#tool-search-query");
+ search_input.val("search tools");
+ }
+}
+
+// Create GalaxyAsync object.
+var GalaxyAsync = function(log_action) {
+ this.url_dict = {};
+ this.log_action = (log_action === undefined ? false : log_action);
+};
+
+GalaxyAsync.prototype.set_func_url = function( func_name, url ) {
+ this.url_dict[func_name] = url;
+};
+
+// Set user preference asynchronously.
+GalaxyAsync.prototype.set_user_pref = function( pref_name, pref_value ) {
+ // Get URL.
+ var url = this.url_dict[arguments.callee];
+ if (url === undefined) { return false; }
+ $.ajax({
+ url: url,
+ data: { "pref_name" : pref_name, "pref_value" : pref_value },
+ error: function() { return false; },
+ success: function() { return true; }
+ });
+};
+
+// Log user action asynchronously.
+GalaxyAsync.prototype.log_user_action = function( action, context, params ) {
+ if (!this.log_action) { return; }
+
+ // Get URL.
+ var url = this.url_dict[arguments.callee];
+ if (url === undefined) { return false; }
+ $.ajax({
+ url: url,
+ data: { "action" : action, "context" : context, "params" : params },
+ error: function() { return false; },
+ success: function() { return true; }
+ });
+};
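A usage sketch (illustrative only, not part of the changeset; the URL is a placeholder). Note that url_dict is keyed by the function object itself, which is what arguments.callee resolves to inside the method.

var galaxy_async = new GalaxyAsync();
galaxy_async.set_func_url( galaxy_async.set_user_pref, '/user/set_user_pref_async' );
galaxy_async.set_user_pref( 'show_deleted', 'False' );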
+
+// Initialize refresh events.
+function init_refresh_on_change () {
+ $("select[refresh_on_change='true']")
+ .off('change')
+ .change(function() {
+ var select_field = $(this),
+ select_val = select_field.val(),
+ refresh = false,
+ ref_on_change_vals = select_field.attr("refresh_on_change_values");
+ if (ref_on_change_vals) {
+ ref_on_change_vals = ref_on_change_vals.split(',');
+ var last_selected_value = select_field.attr("last_selected_value");
+ if ($.inArray(select_val, ref_on_change_vals) === -1 && $.inArray(last_selected_value, ref_on_change_vals) === -1) {
+ return;
+ }
+ }
+ $(window).trigger("refresh_on_change");
+ $(document).trigger("convert_to_values"); // Convert autocomplete text to values
+ select_field.get(0).form.submit();
+ });
+
+ // checkboxes refresh on change
+ $(":checkbox[refresh_on_change='true']")
+ .off('click')
+ .click( function() {
+ var select_field = $(this),
+ select_val = select_field.val(),
+ refresh = false,
+ ref_on_change_vals = select_field.attr("refresh_on_change_values");
+ if (ref_on_change_vals) {
+ ref_on_change_vals = ref_on_change_vals.split(',');
+ var last_selected_value = select_field.attr("last_selected_value");
+ if ($.inArray(select_val, ref_on_change_vals) === -1 && $.inArray(last_selected_value, ref_on_change_vals) === -1) {
+ return;
+ }
+ }
+ $(window).trigger("refresh_on_change");
+ select_field.get(0).form.submit();
+ });
+
+ // Links with confirmation
+ $( "a[confirm]" )
+ .off('click')
+ .click( function() {
+ return confirm( $(this).attr("confirm") );
+ });
+};
+
+
+// jQuery plugin to prevent double submission of forms
+// Ref: http://stackoverflow.com/questions/2830542/prevent-double-submission-of-for…
+jQuery.fn.preventDoubleSubmission = function() {
+ $(this).on('submit',function(e){
+ var $form = $(this);
+
+ if ($form.data('submitted') === true) {
+ // Previously submitted - don't submit again
+ e.preventDefault();
+ } else {
+ // Mark it so that the next submit can be ignored
+ $form.data('submitted', true);
+ }
+ });
+
+ // Keep chainability
+ return this;
+};
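Usage is a one-liner (illustrative only, not part of the changeset; the selector is a placeholder).

$( 'form#tool-form' ).preventDoubleSubmission();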
+
+$(document).ready( function() {
+
+ // Refresh events for form fields.
+ init_refresh_on_change();
+
+ // Tooltips
+ if ( $.fn.tooltip ) {
+ // Put tooltips below items in panel header so that they do not overlap masthead.
+ $(".unified-panel-header [title]").tooltip( { placement: 'bottom' } );
+
+ // default tooltip initialization, it will follow the data-placement tag for tooltip location
+ // and fallback to 'top' if not present
+ $("[title]").tooltip();
+ }
+ // Make popup menus.
+ make_popup_menus();
+
+ // Replace big selects.
+ replace_big_select_inputs(20, 1500);
+
+ // If galaxy_main frame does not exist and link targets galaxy_main,
+ // add use_panels=True and set target to self.
+ $("a").click( function() {
+ var anchor = $(this);
+ var galaxy_main_exists = (parent.frames && parent.frames.galaxy_main);
+ if ( ( anchor.attr( "target" ) == "galaxy_main" ) && ( !galaxy_main_exists ) ) {
+ var href = anchor.attr("href");
+ if (href.indexOf("?") == -1) {
+ href += "?";
+ }
+ else {
+ href += "&";
+ }
+ href += "use_panels=True";
+ anchor.attr("href", href);
+ anchor.attr("target", "_self");
+ }
+ return anchor;
+ });
+
+});
diff -r 007f6a80629a74650b72d67821a9505932d284f3 -r 2092948937ac30ef82f71463a235c66d34987088 client/galaxy/scripts/galaxy.frame.js
--- /dev/null
+++ b/client/galaxy/scripts/galaxy.frame.js
@@ -0,0 +1,224 @@
+// dependencies
+define(["galaxy.masthead", "mvc/ui/ui-frames"], function(mod_masthead, Frames) {
+
+// frame manager
+var GalaxyFrame = Backbone.View.extend(
+{
+ // base element
+ el_main: 'body',
+
+ // frame active/disabled
+ active: false,
+
+ // button active
+ button_active: null,
+
+ // button load
+ button_load : null,
+
+ // initialize
+ initialize : function(options)
+ {
+ // add to masthead menu
+ var self = this;
+
+ // create frames
+ this.frames = new Frames.View({
+ visible: false,
+ });
+
+ // add activate icon
+ this.button_active = new mod_masthead.GalaxyMastheadIcon (
+ {
+ icon : 'fa-th',
+ tooltip : 'Enable/Disable Scratchbook',
+ onclick : function() { self._activate(); },
+ onunload : function() {
+ if (self.frames.length() > 0) {
+ return "You opened " + self.frames.length() + " frame(s) which will be lost.";
+ }
+ }
+ });
+
+ // add to masthead
+ Galaxy.masthead.append(this.button_active);
+
+ // add load icon
+ this.button_load = new mod_masthead.GalaxyMastheadIcon (
+ {
+ icon : 'fa-eye',
+ tooltip : 'Show/Hide Scratchbook',
+ onclick : function(e) {
+ if (self.frames.visible) {
+ self.frames.hide();
+ } else {
+ self.frames.show();
+ }
+ },
+ with_number : true
+ });
+
+ // add to masthead
+ Galaxy.masthead.append(this.button_load);
+
+ // create
+ this.setElement(this.frames.$el);
+
+ // append to main
+ $(this.el_main).append(this.$el);
+
+ // refresh menu
+ this.frames.setOnChange(function() {
+ self._refresh();
+ });
+ this._refresh();
+ },
+
+ /**
+ * Add a dataset to the frames.
+ */
+ add_dataset: function(dataset_id) {
+ var self = this;
+ require(['mvc/data'], function(DATA) {
+ var dataset = new DATA.Dataset({ id: dataset_id });
+ $.when( dataset.fetch() ).then( function() {
+ // Construct frame config based on dataset's type.
+ var frame_config = {
+ title: dataset.get('name')
+ },
+ // HACK: For now, assume 'tabular' and 'interval' are the only
+ // modules that contain tabular files. This needs to be replaced
+                // with an is_datatype() function.
+ is_tabular = _.find(['tabular', 'interval'], function(data_type) {
+ return dataset.get('data_type').indexOf(data_type) !== -1;
+ });
+
+ // Use tabular chunked display if dataset is tabular; otherwise load via URL.
+ if (is_tabular) {
+ var tabular_dataset = new DATA.TabularDataset(dataset.toJSON());
+ _.extend(frame_config, {
+ type: 'other',
+ content: function( parent_elt ) {
+ DATA.createTabularDatasetChunkedView({
+ model: tabular_dataset,
+ parent_elt: parent_elt,
+ embedded: true,
+ height: '100%'
+ });
+ }
+ });
+ }
+ else {
+ _.extend(frame_config, {
+ type: 'url',
+ content: galaxy_config.root + 'datasets/' +
+ dataset.id + '/display/?preview=True'
+ });
+ }
+
+ self.add(frame_config);
+
+ });
+ });
+
+ },
+
+ /**
+ * Add and display a new frame/window based on options.
+ */
+ add: function(options)
+ {
+ // open new tab
+ if (options.target == '_blank')
+ {
+ window.open(options.content);
+ return;
+ }
+
+ // reload entire window
+ if (options.target == '_top' || options.target == '_parent' || options.target == '_self')
+ {
+ window.location = options.content;
+ return;
+ }
+
+ // validate
+ if (!this.active)
+ {
+ // fix url if main frame is unavailable
+ var $galaxy_main = $(window.parent.document).find('#galaxy_main');
+ if (options.target == 'galaxy_main' || options.target == 'center')
+ {
+ if ($galaxy_main.length === 0)
+ {
+ var href = options.content;
+ if (href.indexOf('?') == -1)
+ href += '?';
+ else
+ href += '&';
+ href += 'use_panels=True';
+ window.location = href;
+ } else {
+ $galaxy_main.attr('src', options.content);
+ }
+ } else
+ window.location = options.content;
+
+ // stop
+ return;
+ }
+
+ // add to frames view
+ this.frames.add(options);
+ },
+
+ // activate/disable panel
+ _activate: function ()
+ {
+ // check
+ if (this.active)
+ {
+ // disable
+ this.active = false;
+
+ // toggle
+ this.button_active.untoggle();
+
+ // hide panel
+ this.frames.hide();
+ } else {
+ // activate
+ this.active = true;
+
+ // untoggle
+ this.button_active.toggle();
+ }
+ },
+
+ // update frame counter
+ _refresh: function()
+ {
+ // update on screen counter
+ this.button_load.number(this.frames.length());
+
+ // check
+ if(this.frames.length() === 0)
+ this.button_load.hide();
+ else
+ this.button_load.show();
+
+ // check
+ if (this.frames.visible) {
+ this.button_load.toggle();
+ } else {
+ this.button_load.untoggle();
+ }
+ }
+});
+
+// return
+return {
+ GalaxyFrame: GalaxyFrame
+};
+
+});
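A usage sketch (illustrative only, not part of the changeset). The module id is an assumption, GalaxyFrame expects Galaxy.masthead to exist when constructed, and if the scratchbook has not been enabled via its masthead icon add() falls back to a normal page load; the dataset id and URL path are placeholders.

require([ 'galaxy.frame' ], function( mod_frame ){
    Galaxy.frames = new mod_frame.GalaxyFrame();
    // open a dataset in a scratchbook frame (tabular data gets the chunked tabular view)
    Galaxy.frames.add_dataset( 'abc123' );
    // or open an arbitrary page by url
    Galaxy.frames.add({
        title   : 'Data libraries',
        type    : 'url',
        content : galaxy_config.root + 'library/index'
    });
});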
diff -r 007f6a80629a74650b72d67821a9505932d284f3 -r 2092948937ac30ef82f71463a235c66d34987088 client/galaxy/scripts/galaxy.library.js
--- /dev/null
+++ b/client/galaxy/scripts/galaxy.library.js
@@ -0,0 +1,169 @@
+// MMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMM
+// === MAIN GALAXY LIBRARY MODULE ====
+// MMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMM
+
+define([
+ "galaxy.masthead",
+ "utils/utils",
+ "libs/toastr",
+ "mvc/base-mvc",
+ "mvc/library/library-model",
+ "mvc/library/library-folderlist-view",
+ "mvc/library/library-librarylist-view",
+ "mvc/library/library-librarytoolbar-view",
+ "mvc/library/library-foldertoolbar-view",
+ "mvc/library/library-dataset-view",
+ "mvc/library/library-library-view",
+ "mvc/library/library-folder-view"
+ ],
+function(mod_masthead,
+ mod_utils,
+ mod_toastr,
+ mod_baseMVC,
+ mod_library_model,
+ mod_folderlist_view,
+ mod_librarylist_view,
+ mod_librarytoolbar_view,
+ mod_foldertoolbar_view,
+ mod_library_dataset_view,
+ mod_library_library_view,
+ mod_library_folder_view
+ ) {
+
+// ============================================================================
+// ROUTER
+var LibraryRouter = Backbone.Router.extend({
+ initialize: function() {
+ this.routesHit = 0;
+ //keep count of number of routes handled by the application
+ Backbone.history.on('route', function() { this.routesHit++; }, this);
+ },
+
+ routes: {
+ "" : "libraries",
+ "library/:library_id/permissions" : "library_permissions",
+ "folders/:folder_id/permissions" : "folder_permissions",
+ "folders/:id" : "folder_content",
+ "folders/:folder_id/datasets/:dataset_id" : "dataset_detail",
+ "folders/:folder_id/datasets/:dataset_id/permissions" : "dataset_permissions",
+ "folders/:folder_id/datasets/:dataset_id/versions/:ldda_id" : "dataset_version",
+ "folders/:folder_id/download/:format" : "download",
+ "folders/:folder_id/import/:source" : "import_datasets"
+ },
+
+ back: function() {
+ if(this.routesHit > 1) {
+            //more than one route hit -> user did not land on the current page directly
+ window.history.back();
+ } else {
+ //otherwise go to the home page. Use replaceState if available so
+ //the navigation doesn't create an extra history entry
+ this.navigate('#', {trigger:true, replace:true});
+ }
+ }
+});
+
+// ============================================================================
+/** session storage for library preferences */
+var LibraryPrefs = mod_baseMVC.SessionStorageModel.extend({
+ defaults : {
+ with_deleted : false,
+ sort_order : 'asc',
+ sort_by : 'name'
+ }
+});
+
+// ============================================================================
+// Main controller of Galaxy Library
+var GalaxyLibrary = Backbone.View.extend({
+
+ libraryToolbarView: null,
+ libraryListView: null,
+ library_router: null,
+ libraryView: null,
+ folderToolbarView: null,
+ folderListView: null,
+ datasetView: null,
+
+ initialize : function(){
+ Galaxy.libraries = this;
+
+ this.preferences = new LibraryPrefs( {id: 'global-lib-prefs'} );
+
+ this.library_router = new LibraryRouter();
+
+ this.library_router.on('route:libraries', function() {
+ Galaxy.libraries.libraryToolbarView = new mod_librarytoolbar_view.LibraryToolbarView();
+ Galaxy.libraries.libraryListView = new mod_librarylist_view.LibraryListView();
+ });
+
+ this.library_router.on('route:folder_content', function(id) {
+ if (Galaxy.libraries.folderToolbarView){
+ Galaxy.libraries.folderToolbarView.$el.unbind('click');
+ }
+ Galaxy.libraries.folderToolbarView = new mod_foldertoolbar_view.FolderToolbarView({id: id});
+ Galaxy.libraries.folderListView = new mod_folderlist_view.FolderListView({id: id});
+ });
+
+ this.library_router.on('route:download', function(folder_id, format) {
+ if ($('#folder_list_body').find(':checked').length === 0) {
+ mod_toastr.info( 'You must select at least one dataset to download' );
+ Galaxy.libraries.library_router.navigate('folders/' + folder_id, {trigger: true, replace: true});
+ } else {
+ Galaxy.libraries.folderToolbarView.download(folder_id, format);
+ Galaxy.libraries.library_router.navigate('folders/' + folder_id, {trigger: false, replace: true});
+ }
+ });
+
+ this.library_router.on('route:dataset_detail', function(folder_id, dataset_id){
+ if (Galaxy.libraries.datasetView){
+ Galaxy.libraries.datasetView.$el.unbind('click');
+ }
+ Galaxy.libraries.datasetView = new mod_library_dataset_view.LibraryDatasetView({id: dataset_id});
+ });
+ this.library_router.on('route:dataset_version', function(folder_id, dataset_id, ldda_id){
+ if (Galaxy.libraries.datasetView){
+ Galaxy.libraries.datasetView.$el.unbind('click');
+ }
+ Galaxy.libraries.datasetView = new mod_library_dataset_view.LibraryDatasetView({id: dataset_id, ldda_id: ldda_id, show_version: true});
+ });
+
+ this.library_router.on('route:dataset_permissions', function(folder_id, dataset_id){
+ if (Galaxy.libraries.datasetView){
+ Galaxy.libraries.datasetView.$el.unbind('click');
+ }
+ Galaxy.libraries.datasetView = new mod_library_dataset_view.LibraryDatasetView({id: dataset_id, show_permissions: true});
+ });
+
+ this.library_router.on('route:library_permissions', function(library_id){
+ if (Galaxy.libraries.libraryView){
+ Galaxy.libraries.libraryView.$el.unbind('click');
+ }
+ Galaxy.libraries.libraryView = new mod_library_library_view.LibraryView({id: library_id, show_permissions: true});
+ });
+
+ this.library_router.on('route:folder_permissions', function(folder_id){
+ if (Galaxy.libraries.folderView){
+ Galaxy.libraries.folderView.$el.unbind('click');
+ }
+ Galaxy.libraries.folderView = new mod_library_folder_view.FolderView({id: folder_id, show_permissions: true});
+ });
+ this.library_router.on('route:import_datasets', function(folder_id, source){
+ if (Galaxy.libraries.folderToolbarView && Galaxy.libraries.folderListView){
+ Galaxy.libraries.folderToolbarView.showImportModal({source:source});
+ } else {
+ Galaxy.libraries.folderToolbarView = new mod_foldertoolbar_view.FolderToolbarView({id: folder_id});
+ Galaxy.libraries.folderListView = new mod_folderlist_view.FolderListView({id: folder_id});
+ Galaxy.libraries.folderToolbarView.showImportModal({source: source});
+ }
+ });
+
+ Backbone.history.start({pushState: false});
+ }
+});
+
+return {
+ GalaxyApp: GalaxyLibrary
+};
+
+});
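Once GalaxyLibrary has been initialized and Backbone.history started as above, views can also be driven programmatically through the router (illustrative only, not part of the changeset; the folder and dataset ids are placeholders).

Galaxy.libraries.library_router.navigate( 'folders/F123', { trigger: true } );                    // folder contents
Galaxy.libraries.library_router.navigate( 'folders/F123/datasets/abc456', { trigger: true } );    // dataset details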
This diff is so big that we needed to truncate the remainder.
https://bitbucket.org/galaxy/galaxy-central/commits/3b3cd242b4b7/
Changeset: 3b3cd242b4b7
Branch: stable
User: natefoo
Date: 2014-10-06 16:59:17+00:00
Summary: Added tag release_2014.10.06 for changeset 2092948937ac
Affected #: 1 file
diff -r 2092948937ac30ef82f71463a235c66d34987088 -r 3b3cd242b4b7cd20b9c868c393c455524b31b87c .hgtags
--- a/.hgtags
+++ b/.hgtags
@@ -19,3 +19,4 @@
2a756ca2cb1826db7796018e77d12e2dd7b67603 latest_2014.02.10
ca45b78adb4152fc6e7395514d46eba6b7d0b838 release_2014.08.11
548ab24667d6206780237bd807f7d857a484c461 latest_2014.08.11
+2092948937ac30ef82f71463a235c66d34987088 release_2014.10.06
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
[galaxyproject/usegalaxy-playbook] 4e4247: Add rodeo destination even though it doesn't work.
by GitHub 06 Oct '14
Branch: refs/heads/master
Home: https://github.com/galaxyproject/usegalaxy-playbook
Commit: 4e4247588aaf520f449dc923029080a9eb4f2f3c
https://github.com/galaxyproject/usegalaxy-playbook/commit/4e4247588aaf520f…
Author: Nate Coraor <nate(a)bx.psu.edu>
Date: 2014-10-06 (Mon, 06 Oct 2014)
Changed paths:
M templates/galaxy/test.galaxyproject.org/config/job_conf.xml.j2
Log Message:
-----------
Add rodeo destination even though it doesn't work.
Commit: c0f18b86fd5c6af84777421b0cc8db13fec1b7c2
https://github.com/galaxyproject/usegalaxy-playbook/commit/c0f18b86fd5c6af8…
Author: Nate Coraor <nate(a)bx.psu.edu>
Date: 2014-10-06 (Mon, 06 Oct 2014)
Changed paths:
M stage/group_vars/all.yml
Log Message:
-----------
Update Test.
Compare: https://github.com/galaxyproject/usegalaxy-playbook/compare/c3d57de7f661...…
commit/galaxy-central: dan: Remove deprecation warnings from util.json methods.
by commits-noreply@bitbucket.org 06 Oct '14
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/8f372abd1912/
Changeset: 8f372abd1912
Branch: next-stable
User: dan
Date: 2014-10-06 15:49:09+00:00
Summary: Remove deprecation warnings from util.json methods.
Affected #: 1 file
diff -r 054d8af08f098c76b9b64eb5d60835956abc8ae1 -r 8f372abd19124a95fa79f0eeb7da497dc88682f5 lib/galaxy/util/json.py
--- a/lib/galaxy/util/json.py
+++ b/lib/galaxy/util/json.py
@@ -17,12 +17,10 @@
def to_json_string(*args, **kwargs):
- log.warning("Using deprecated function to_json_string.")
return json.dumps(*args, **kwargs)
def from_json_string(*args, **kwargs):
- log.warning("Using deprecated function from_json_string.")
return json.loads(*args, **kwargs)
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: dan: Remove deprecation warnings from util.json methods.
by commits-noreply@bitbucket.org 06 Oct '14
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/594b48fe90b7/
Changeset: 594b48fe90b7
User: dan
Date: 2014-10-06 15:49:09+00:00
Summary: Remove deprecation warnings from util.json methods.
Affected #: 1 file
diff -r ae1a6cb7f4aae2ee7746fdee01d83d96cff5a0a2 -r 594b48fe90b7b17ed22f5c597837022d8204db47 lib/galaxy/util/json.py
--- a/lib/galaxy/util/json.py
+++ b/lib/galaxy/util/json.py
@@ -17,12 +17,10 @@
def to_json_string(*args, **kwargs):
- log.warning("Using deprecated function to_json_string.")
return json.dumps(*args, **kwargs)
def from_json_string(*args, **kwargs):
- log.warning("Using deprecated function from_json_string.")
return json.loads(*args, **kwargs)
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: jmchilton: Eliminate Galaxy dependencies from lib/galaxy/tools/loader.py.
by commits-noreply@bitbucket.org 06 Oct '14
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/ae1a6cb7f4aa/
Changeset: ae1a6cb7f4aa
User: jmchilton
Date: 2014-10-06 13:24:53+00:00
Summary: Eliminate Galaxy dependencies from lib/galaxy/tools/loader.py.
Should allow for reuse in command-line tools that operate on Galaxy tools. Updates for targeting Python 2.6+.
Affected #: 1 file
diff -r 8f746975db66785dcec2ef16f943052d767b01d4 -r ae1a6cb7f4aae2ee7746fdee01d83d96cff5a0a2 lib/galaxy/tools/loader.py
--- a/lib/galaxy/tools/loader.py
+++ b/lib/galaxy/tools/loader.py
@@ -1,10 +1,8 @@
-from __future__ import with_statement
+from xml.etree import ElementTree, ElementInclude
from copy import deepcopy
import os
-from galaxy.util import parse_xml
-
def load_tool(path):
"""
@@ -16,7 +14,7 @@
_import_macros(root, path)
# Expand xml macros
- macro_dict = _macros_of_type(root, 'xml', lambda el: list(el.getchildren()))
+ macro_dict = _macros_of_type(root, 'xml', lambda el: list(el))
_expand_macros([root], macro_dict)
# Expand tokens
@@ -42,7 +40,7 @@
""" Load raw (no macro expansion) tree representation of tool represented
at the specified path.
"""
- tree = parse_xml(path)
+ tree = _parse_xml(path)
return tree
@@ -54,7 +52,7 @@
def _import_macros(root, path):
tool_dir = os.path.dirname(path)
macros_el = _macros_el(root)
- if macros_el:
+ if macros_el is not None:
macro_els = _load_macros(macros_el, tool_dir)
_xml_set_children(macros_el, macro_els)
@@ -66,7 +64,7 @@
def _macros_of_type(root, type, el_func):
macros_el = root.find('macros')
macro_dict = {}
- if macros_el:
+ if macros_el is not None:
macro_els = macros_el.findall('macro')
macro_dict = dict([(macro_el.get("name"), el_func(macro_el)) \
for macro_el in macro_els \
@@ -88,7 +86,7 @@
new_value = _expand_tokens_str(value, tokens)
if not (new_value is value):
element.attrib[key] = new_value
- _expand_tokens(list(element.getchildren()), tokens)
+ _expand_tokens(list(element), tokens)
def _expand_tokens_str(str, tokens):
@@ -129,7 +127,7 @@
def _expand_yield_statements(macro_def, expand_el):
yield_els = [yield_el for macro_def_el in macro_def for yield_el in macro_def_el.findall('.//yield')]
- expand_el_children = expand_el.getchildren()
+ expand_el_children = list(expand_el)
macro_def_parent_map = \
dict((c, p) for macro_def_el in macro_def for p in macro_def_el.getiterator() for c in p)
@@ -151,7 +149,7 @@
macro_els = []
# attribute typed macro
- if macros_el:
+ if macros_el is not None:
macro_els = macros_el.findall("macro")
for macro in macro_els:
if 'type' not in macro.attrib:
@@ -163,7 +161,7 @@
typed_tag = ['template', 'xml', 'token']
for tag in typed_tag:
macro_els = []
- if macros_el:
+ if macros_el is not None:
macro_els = macros_el.findall(tag)
for macro_el in macro_els:
macro_el.attrib['type'] = tag
@@ -188,7 +186,7 @@
def _imported_macro_paths_from_el(macros_el):
imported_macro_paths = []
macro_import_els = []
- if macros_el:
+ if macros_el is not None:
macro_import_els = macros_el.findall("import")
for macro_import_el in macro_import_els:
raw_import_path = macro_import_el.text
@@ -199,13 +197,13 @@
def _load_macro_file(path, tool_dir):
- tree = parse_xml(path)
+ tree = _parse_xml(path)
root = tree.getroot()
return _load_macros(root, tool_dir)
def _xml_set_children(element, new_children):
- for old_child in element.getchildren():
+ for old_child in element:
element.remove(old_child)
for i, new_child in enumerate(new_children):
element.insert(i, new_child)
@@ -216,7 +214,7 @@
parent_el = parent_map[query]
matching_index = -1
#for index, el in enumerate(parent_el.iter('.')): ## Something like this for newer implementation
- for index, el in enumerate(parent_el.getchildren()):
+ for index, el in enumerate(list(parent_el)):
if el == query:
matching_index = index
break
@@ -226,3 +224,10 @@
current_index += 1
parent_el.insert(current_index, deepcopy(target))
parent_el.remove(query)
+
+
+def _parse_xml(fname):
+ tree = ElementTree.parse(fname)
+ root = tree.getroot()
+ ElementInclude.include(root)
+ return tree
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.