galaxy-commits
August 2013
2 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/3615504d0a43/
Changeset: 3615504d0a43
Branch: next-stable
User: jgoecks
Date: 2013-08-07 00:06:45
Summary: Update citation for Tophat2.
Affected #: 1 file
diff -r 9e8c70746ddb12df2c355e09db023dcbea0397b3 -r 3615504d0a43b511c44a4aa6a8fa520260373694 tools/ngs_rna/tophat2_wrapper.xml
--- a/tools/ngs_rna/tophat2_wrapper.xml
+++ b/tools/ngs_rna/tophat2_wrapper.xml
@@ -450,7 +450,8 @@
<help>
**Tophat Overview**
-TopHat_ is a fast splice junction mapper for RNA-Seq reads. It aligns RNA-Seq reads to mammalian-sized genomes using the ultra high-throughput short read aligner Bowtie, and then analyzes the mapping results to identify splice junctions between exons. Please cite: Trapnell, C., Pachter, L. and Salzberg, S.L. TopHat: discovering splice junctions with RNA-Seq. Bioinformatics 25, 1105-1111 (2009).
+TopHat_ is a fast splice junction mapper for RNA-Seq reads. It aligns RNA-Seq reads to mammalian-sized genomes using the ultra high-throughput short read aligner Bowtie(2), and then analyzes the mapping results to identify splice junctions between exons. Please cite: Kim D, Pertea G, Trapnell C, Pimentel H, Kelley R, and Salzberg SL. TopHat2: accurate alignment
+of transcriptomes in the presence of insertions, deletions and gene fusions. Genome Biol 14:R36, 2013.
.. _Tophat: http://tophat.cbcb.umd.edu/
https://bitbucket.org/galaxy/galaxy-central/commits/773941fd26a4/
Changeset: 773941fd26a4
User: jgoecks
Date: 2013-08-07 00:07:15
Summary: Automated merge of next-stable branch
Affected #: 1 file
diff -r 48220b09b62c17422bf81e1b623b90db41b12c72 -r 773941fd26a43c8522c3ff9977cfda5968f65505 tools/ngs_rna/tophat2_wrapper.xml
--- a/tools/ngs_rna/tophat2_wrapper.xml
+++ b/tools/ngs_rna/tophat2_wrapper.xml
@@ -452,7 +452,8 @@
<help>
**Tophat Overview**
-TopHat_ is a fast splice junction mapper for RNA-Seq reads. It aligns RNA-Seq reads to mammalian-sized genomes using the ultra high-throughput short read aligner Bowtie, and then analyzes the mapping results to identify splice junctions between exons. Please cite: Trapnell, C., Pachter, L. and Salzberg, S.L. TopHat: discovering splice junctions with RNA-Seq. Bioinformatics 25, 1105-1111 (2009).
+TopHat_ is a fast splice junction mapper for RNA-Seq reads. It aligns RNA-Seq reads to mammalian-sized genomes using the ultra high-throughput short read aligner Bowtie(2), and then analyzes the mapping results to identify splice junctions between exons. Please cite: Kim D, Pertea G, Trapnell C, Pimentel H, Kelley R, and Salzberg SL. TopHat2: accurate alignment
+of transcriptomes in the presence of insertions, deletions and gene fusions. Genome Biol 14:R36, 2013.
.. _Tophat: http://tophat.cbcb.umd.edu/
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: carlfeberhard: Visualizations framework: fix to_param optionality bug, add has_dataprovider data source test
by commits-noreply@bitbucket.org 06 Aug '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/48220b09b62c/
Changeset: 48220b09b62c
User: carlfeberhard
Date: 2013-08-07 00:00:00
Summary: Visualizations framework: fix to_param optionality bug, add has_dataprovider data source test
Affected #: 1 file
diff -r 439fecf2f288e776501807c229f54332e3b0fbdf -r 48220b09b62c17422bf81e1b623b90db41b12c72 lib/galaxy/visualization/registry.py
--- a/lib/galaxy/visualization/registry.py
+++ b/lib/galaxy/visualization/registry.py
@@ -449,6 +449,7 @@
returned[ 'tests' ] = tests
# to_params (optional, 0 or more) - tells the registry to set certain params based on the model_clas, tests
+ returned[ 'to_params' ] = {}
to_params = self.parse_to_params( xml_tree.findall( 'to_param' ) )
if to_params:
returned[ 'to_params' ] = to_params
@@ -514,6 +515,7 @@
# test_attr can be a dot separated chain of object attributes (e.g. dataset.datatype) - convert to list
#TODO: too dangerous - constrain these to some allowed list
+ #TODO: does this err if no test_attr - it should...
test_attr = test_elem.get( 'test_attr' )
test_attr = test_attr.split( self.ATTRIBUTE_SPLIT_CHAR ) if isinstance( test_attr, str ) else []
# build a lambda function that gets the desired attribute to test
@@ -523,14 +525,16 @@
test_result_type = test_elem.get( 'result_type' ) or 'string'
# test functions should be sent an object to test, and the parsed result expected from the test
- #TODO: currently, isinstance and string equivalance are the only test types supported
+ # is test_attr attribute an instance of result
if test_type == 'isinstance':
#TODO: wish we could take this further but it would mean passing in the datatypes_registry
test_fn = lambda o, result: isinstance( getter( o ), result )
+ #TODO: needs cleanup - robustiosity-nessness
+ # does the object itself have a datatype attr and does that datatype have the given dataprovider
elif test_type == 'has_dataprovider':
test_fn = lambda o, result: ( hasattr( o, 'datatype' )
- and o.datatype.has_dataprovier( result ) )
+ and o.datatype.has_dataprovider( result ) )
# default to simple (string) equilavance (coercing the test_attr to a string)
else:
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
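For readers following the data-source test changes above, the parsing in registry.py reduces to a small dispatch on test_type. A minimal standalone sketch of that dispatch (build_test_fn and its arguments are illustrative names, not the registry's actual API):

# Minimal sketch of the test_type dispatch shown in the commit above.
# Names here are illustrative, not Galaxy's real API.
def build_test_fn(test_type, getter):
    if test_type == 'isinstance':
        # is the fetched attribute an instance of the parsed result class?
        return lambda obj, result: isinstance(getter(obj), result)
    if test_type == 'has_dataprovider':
        # does the object's datatype expose the named dataprovider?
        return lambda obj, result: (hasattr(obj, 'datatype')
                                    and obj.datatype.has_dataprovider(result))
    # default: simple string equivalence, coercing the attribute to a string
    return lambda obj, result: str(getter(obj)) == result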
commit/galaxy-central: carlfeberhard: Visualization framework: add convenience mako base template
by commits-noreply@bitbucket.org 06 Aug '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/439fecf2f288/
Changeset: 439fecf2f288
User: carlfeberhard
Date: 2013-08-06 23:30:05
Summary: Visualization framework: add convenience mako base template
Affected #: 2 files
diff -r e1b3dbeadb5051227f3ea216ea560f9b10bb29ed -r 439fecf2f288e776501807c229f54332e3b0fbdf config/plugins/visualizations/visualization_base.mako
--- /dev/null
+++ b/config/plugins/visualizations/visualization_base.mako
@@ -0,0 +1,67 @@
+# -*- coding: utf-8 -*-
+<% _=n_ %>
+
+%if embedded:
+ ${self.as_embedded()}
+%else:
+ ${self.as_page()}
+%endif
+
+## render this inside another page or via ajax
+<%def name="as_embedded()">
+ ${self.stylesheets()}
+ ${self.javascripts()}
+ ${self.get_body()}
+</%def>
+
+## render this as it's own page
+<%def name="as_page()">
+<!DOCTYPE HTML>
+<html>
+ <head>
+ <title>${self.title()}</title>
+ <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
+ ${self.metas()}
+ ${self.stylesheets()}
+ ${self.javascripts()}
+ </head>
+ <body>
+ ${self.get_body()}
+ </body>
+</html>
+</%def>
+##TODO: late_javascripts
+
+## Default body
+<%def name="get_body()"></%def>
+
+## Default title
+<%def name="title()">${visualization_name}</%def>
+
+## Additional metas can be defined by templates inheriting from this one.
+<%def name="metas()"></%def>
+
+## Default stylesheets
+<%def name="stylesheets()">
+${h.css('base')}
+</%def>
+
+## Default javascripts
+<%def name="javascripts()">
+${h.js(
+ "libs/jquery/jquery",
+ "libs/jquery/jquery.migrate"
+)}
+
+<script type="text/javascript">
+ // console protection
+ window.console = window.console || {
+ log : function(){},
+ debug : function(){},
+ info : function(){},
+ warn : function(){},
+ error : function(){},
+ assert : function(){}
+ };
+</script>
+</%def>
diff -r e1b3dbeadb5051227f3ea216ea560f9b10bb29ed -r 439fecf2f288e776501807c229f54332e3b0fbdf lib/galaxy/visualization/registry.py
--- a/lib/galaxy/visualization/registry.py
+++ b/lib/galaxy/visualization/registry.py
@@ -524,10 +524,14 @@
# test functions should be sent an object to test, and the parsed result expected from the test
#TODO: currently, isinstance and string equivalance are the only test types supported
- if test_type == 'isinstance':
+ if test_type == 'isinstance':
#TODO: wish we could take this further but it would mean passing in the datatypes_registry
test_fn = lambda o, result: isinstance( getter( o ), result )
+ elif test_type == 'has_dataprovider':
+ test_fn = lambda o, result: ( hasattr( o, 'datatype' )
+ and o.datatype.has_dataprovier( result ) )
+
# default to simple (string) equilavance (coercing the test_attr to a string)
else:
test_fn = lambda o, result: str( getter( o ) ) == result
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
commit/galaxy-central: carlfeberhard: Visualization framework: update universe_wsgi vis plugin entry with absolute path explanation
by commits-noreply@bitbucket.org 06 Aug '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/e1b3dbeadb50/
Changeset: e1b3dbeadb50
User: carlfeberhard
Date: 2013-08-06 23:15:17
Summary: Visualization framework: update universe_wsgi vis plugin entry with absolute path explanation
Affected #: 1 file
diff -r 26467a120cb1d3b8868138d5233602c831db6e79 -r e1b3dbeadb5051227f3ea216ea560f9b10bb29ed universe_wsgi.ini.sample
--- a/universe_wsgi.ini.sample
+++ b/universe_wsgi.ini.sample
@@ -175,6 +175,8 @@
#datatypes_config_file = datatypes_conf.xml
# Visualizations config directory: where to look for individual visualization plugins.
+# The path is relative to the Galaxy root dir. To use an absolute path begin the path
+# with '/'.
#visualizations_plugins_directory = config/plugins/visualizations
# Each job is given a unique empty directory as its current working directory.
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
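The sample config comment above states the path rule only in prose. A minimal sketch of the same rule (the helper name is hypothetical, not Galaxy code): a value beginning with '/' is used as-is, anything else is resolved against the Galaxy root directory.

import os

# Sketch of the visualizations_plugins_directory path rule described above.
# resolve_plugins_directory is an illustrative helper, not Galaxy's API.
def resolve_plugins_directory(galaxy_root, configured_path):
    if configured_path.startswith('/'):
        # absolute path: use it unchanged
        return configured_path
    # relative path: resolve against the Galaxy root dir
    return os.path.join(galaxy_root, configured_path)

# resolve_plugins_directory('/srv/galaxy', 'config/plugins/visualizations')
#   -> '/srv/galaxy/config/plugins/visualizations'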
2 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/9e8c70746ddb/
Changeset: 9e8c70746ddb
Branch: next-stable
User: jgoecks
Date: 2013-08-06 22:34:20
Summary: Add Bowtie2 wrapper to main's tool conf.
Affected #: 1 file
diff -r e3f6a95109789272a3c4f0663dbcdab908aa8f26 -r 9e8c70746ddb12df2c355e09db023dcbea0397b3 tool_conf.xml.main
--- a/tool_conf.xml.main
+++ b/tool_conf.xml.main
@@ -260,6 +260,7 @@
</section><section name="NGS: Mapping" id="ngs_mapping"><label text="Illumina" id="illumina"/>
+ <tool file="sr_mapping/bowtie2_wrapper.xml" /><label text="Roche-454" id="roche_454"/><tool file="metag_tools/megablast_wrapper.xml" /><tool file="metag_tools/megablast_xml_parser.xml" />
https://bitbucket.org/galaxy/galaxy-central/commits/26467a120cb1/
Changeset: 26467a120cb1
User: jgoecks
Date: 2013-08-06 22:34:50
Summary: Automated merge of next-stable branch
Affected #: 1 file
diff -r 12075315e61b01f4b163a1074781eabadc47da00 -r 26467a120cb1d3b8868138d5233602c831db6e79 tool_conf.xml.main
--- a/tool_conf.xml.main
+++ b/tool_conf.xml.main
@@ -260,6 +260,7 @@
</section><section name="NGS: Mapping" id="ngs_mapping"><label text="Illumina" id="illumina"/>
+ <tool file="sr_mapping/bowtie2_wrapper.xml" /><label text="Roche-454" id="roche_454"/><tool file="metag_tools/megablast_wrapper.xml" /><tool file="metag_tools/megablast_xml_parser.xml" />
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
commit/galaxy-central: dannon: Fix get_hda_list 'get_accessible' variable naming -- previously would blow up as undefined.
by commits-noreply@bitbucket.org 06 Aug '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/12075315e61b/
Changeset: 12075315e61b
User: dannon
Date: 2013-08-06 22:05:09
Summary: Fix get_hda_list 'get_accessible' variable naming -- previously would blow up as undefined.
Affected #: 1 file
diff -r e08148b86a59421f6faa616529ae262a40138485 -r 12075315e61b01f4b163a1074781eabadc47da00 lib/galaxy/web/base/controller.py
--- a/lib/galaxy/web/base/controller.py
+++ b/lib/galaxy/web/base/controller.py
@@ -537,7 +537,9 @@
hda = None
try:
hda = self.get_dataset( trans, id,
- check_ownership=check_ownership, check_accesible=check_accesible, check_state=check_state )
+ check_ownership=check_ownership,
+ check_accessible=check_accessible,
+ check_state=check_state )
except Exception, exception:
pass
hdas.append( hda )
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
commit/galaxy-central: dannon: Cleanup imports in extended_metadata API controller.
by commits-noreply@bitbucket.org 06 Aug '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/e08148b86a59/
Changeset: e08148b86a59
User: dannon
Date: 2013-08-06 22:02:41
Summary: Cleanup imports in extended_metadata API controller.
Affected #: 1 file
diff -r 42ef1d78a2f8e2466a0698ed4b5343689da4cd6f -r e08148b86a59421f6faa616529ae262a40138485 lib/galaxy/webapps/galaxy/api/extended_metadata.py
--- a/lib/galaxy/webapps/galaxy/api/extended_metadata.py
+++ b/lib/galaxy/webapps/galaxy/api/extended_metadata.py
@@ -1,17 +1,9 @@
"""
API operations on annotations.
"""
-import logging, os, string, shutil, urllib, re, socket
-from cgi import escape, FieldStorage
-from galaxy import util, datatypes, jobs, web, util
-from galaxy.web.base.controller import BaseAPIController, UsesHistoryMixin, UsesLibraryMixinItems, UsesHistoryDatasetAssociationMixin, UsesStoredWorkflowMixin, UsesExtendedMetadataMixin
-from galaxy.util.sanitize_html import sanitize_html
-import galaxy.datatypes
-from galaxy.util.bunch import Bunch
-
-import pkg_resources
-pkg_resources.require( "Routes" )
-import routes
+import logging
+from galaxy import web
+from galaxy.web.base.controller import BaseAPIController, UsesHistoryMixin, UsesLibraryMixinItems, UsesHistoryDatasetAssociationMixin, UsesStoredWorkflowMixin, UsesExtendedMetadataMixin, HTTPNotImplemented
log = logging.getLogger( __name__ )
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
2 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/42ef1d78a2f8/
Changeset: 42ef1d78a2f8
User: dannon
Date: 2013-08-06 21:57:27
Summary: Remove relpath added in pr#149 -- not needed w/ python 2.6+
Affected #: 1 file
diff -r 65c255d9c0a56099261a23c7ef1eb5185481ece0 -r 42ef1d78a2f8e2466a0698ed4b5343689da4cd6f lib/galaxy/util/__init__.py
--- a/lib/galaxy/util/__init__.py
+++ b/lib/galaxy/util/__init__.py
@@ -665,41 +665,6 @@
print "ERROR: Unable to read builds for site file %s" %filename
return build_sites
-def relpath( path, start = None ):
- """Return a relative version of a path"""
- #modified from python 2.6.1 source code
-
- #version 2.6+ has it built in, we'll use the 'official' copy
- if sys.version_info[:2] >= ( 2, 6 ):
- if start is not None:
- return os.path.relpath( path, start )
- return os.path.relpath( path )
-
- #we need to initialize some local parameters
- curdir = os.curdir
- pardir = os.pardir
- sep = os.sep
- commonprefix = os.path.commonprefix
- join = os.path.join
- if start is None:
- start = curdir
-
- #below is the unedited (but formated) relpath() from posixpath.py of 2.6.1
- #this will likely not function properly on non-posix systems, i.e. windows
- if not path:
- raise ValueError( "no path specified" )
-
- start_list = os.path.abspath( start ).split( sep )
- path_list = os.path.abspath( path ).split( sep )
-
- # Work out how much of the filepath is shared by start and path.
- i = len( commonprefix( [ start_list, path_list ] ) )
-
- rel_list = [ pardir ] * ( len( start_list )- i ) + path_list[ i: ]
- if not rel_list:
- return curdir
- return join( *rel_list )
-
def relativize_symlinks( path, start=None, followlinks=False):
for root, dirs, files in os.walk( path, followlinks=followlinks ):
rel_start = None
https://bitbucket.org/galaxy/galaxy-central/commits/a849924d6f9d/
Changeset: a849924d6f9d
Branch: extended_metadata
User: dannon
Date: 2013-08-06 21:58:00
Summary: Close extended_metadata branch
Affected #: 0 files
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
5 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/970d44713796/
Changeset: 970d44713796
User: kellrott
Date: 2012-12-28 01:38:46
Summary: Adding API for accessing extended metadata
Affected #: 3 files
diff -r 7073d786cad0f0d251d4ea9b2c42171f698cf953 -r 970d44713796453e3356c6273213bccf3eec7f38 lib/galaxy/model/item_attrs.py
--- a/lib/galaxy/model/item_attrs.py
+++ b/lib/galaxy/model/item_attrs.py
@@ -152,6 +152,83 @@
class_name = '%sAnnotationAssociation' % item.__class__.__name__
return getattr( galaxy.model, class_name, None )
+
+class UsesExtendedMetadata:
+ """ Mixin for getting and setting item extended metadata. """
+
+ def get_item_extended_metadata_obj( self, db_session, user, item ):
+ """
+ Given an item object (such as a LibraryDatasetDatasetAssociation), find the object
+ of the associated extended metadata
+ """
+
+ extended_metadata = db_session.query( galaxy.model.ExtendedMetadata )
+
+ if item.__class__ == galaxy.model.LibraryDatasetDatasetAssociation:
+ extended_metadata = extended_metadata.join( galaxy.model.LibraryDatasetDatasetAssociation ).filter(
+ galaxy.model.LibraryDatasetDatasetAssociation.id == item.id )
+
+ return extended_metadata.first()
+
+
+ def set_item_extended_metadata_obj( self, db_session, user, item, extmeta_obj):
+ if item.__class__ == galaxy.model.LibraryDatasetDatasetAssociation:
+ item.extended_metadata = extmeta_obj
+ db_session.flush()
+
+ def unlink_item_extended_metadata_obj( self, db_session, user, item):
+ if item.__class__ == galaxy.model.LibraryDatasetDatasetAssociation:
+ item.extended_metadata = None
+ db_session.flush()
+
+ def create_extended_metadata(self, db_session, user, extmeta):
+ """
+ Create/index an extended metadata object. The returned object is
+ not associated with any items
+ """
+ ex_meta = galaxy.model.ExtendedMetadata(extmeta)
+ db_session.add( ex_meta )
+ db_session.flush()
+ for path, value in self._scan_json_block(extmeta):
+ meta_i = galaxy.model.ExtendedMetadataIndex(ex_meta, path, value)
+ db_session.add(meta_i)
+ db_session.flush()
+ return ex_meta
+
+ def delete_extended_metadata( self, db_session, user, item):
+ if item.__class__ == galaxy.model.ExtendedMetadata:
+ db_session.delete( item )
+ db_session.flush()
+
+ def _scan_json_block(self, meta, prefix=""):
+ """
+ Scan a json style data structure, and emit all fields and their values.
+ Example paths
+
+ Data
+ { "data" : [ 1, 2, 3 ] }
+
+ Path:
+ /data == [1,2,3]
+
+ /data/[0] == 1
+
+ """
+ if isinstance(meta, dict):
+ for a in meta:
+ for path, value in self._scan_json_block(meta[a], prefix + "/" + a):
+ yield path, value
+ elif isinstance(meta, list):
+ for i, a in enumerate(meta):
+ for path, value in self._scan_json_block(a, prefix + "[%d]" % (i)):
+ yield path, value
+ else:
+ #BUG: Everything is cast to string, which can lead to false positives
+ #for cross type comparisions, ie "True" == True
+ yield prefix, ("%s" % (meta)).encode("utf8", errors='replace')
+
+
+
class APIItem:
""" Mixin for api representation. """
#api_collection_visible_keys = ( 'id' )
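The _scan_json_block docstring above gives the path convention only by example. A self-contained restatement of the same flattening, as the commit's code actually behaves (illustrative, written outside the mixin class):

# Standalone restatement of _scan_json_block's flattening, for illustration.
def scan_json_block(meta, prefix=""):
    if isinstance(meta, dict):
        for key in meta:
            for path, value in scan_json_block(meta[key], prefix + "/" + key):
                yield path, value
    elif isinstance(meta, list):
        for i, item in enumerate(meta):
            for path, value in scan_json_block(item, prefix + "[%d]" % i):
                yield path, value
    else:
        # as in the commit, leaf values are coerced to strings
        yield prefix, str(meta)

# e.g. list(scan_json_block({"data": [1, 2, 3]})) ==
#   [("/data[0]", "1"), ("/data[1]", "2"), ("/data[2]", "3")]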
diff -r 7073d786cad0f0d251d4ea9b2c42171f698cf953 -r 970d44713796453e3356c6273213bccf3eec7f38 lib/galaxy/webapps/galaxy/api/extended_metadata.py
--- /dev/null
+++ b/lib/galaxy/webapps/galaxy/api/extended_metadata.py
@@ -0,0 +1,63 @@
+"""
+API operations on annotations.
+"""
+import logging, os, string, shutil, urllib, re, socket
+from cgi import escape, FieldStorage
+from galaxy import util, datatypes, jobs, web, util
+from galaxy.web.base.controller import *
+from galaxy.model.item_attrs import *
+from galaxy.util.sanitize_html import sanitize_html
+from galaxy.model.orm import *
+import galaxy.datatypes
+from galaxy.util.bunch import Bunch
+
+import pkg_resources
+pkg_resources.require( "Routes" )
+import routes
+
+log = logging.getLogger( __name__ )
+
+class BaseExtendedMetadataController( BaseAPIController, UsesExtendedMetadata, UsesHistoryMixin, UsesLibraryMixinItems, UsesHistoryDatasetAssociationMixin, UsesStoredWorkflowMixin ):
+
+ @web.expose_api
+ def index( self, trans, **kwd ):
+ idnum = kwd[self.exmeta_item_id]
+ item = self._get_item_from_id(trans, idnum)
+ if item is not None:
+ ex_meta = self.get_item_extended_metadata_obj( trans.sa_session, trans.get_user(), item )
+ if ex_meta is not None:
+ return ex_meta.data
+
+ @web.expose_api
+ def create( self, trans, payload, **kwd ):
+ idnum = kwd[self.exmeta_item_id]
+ item = self._get_item_from_id(trans, idnum)
+ if item is not None:
+ ex_obj = self.get_item_extended_metadata_obj(trans.sa_session, trans.get_user(), item)
+ if ex_obj is not None:
+ self.unlink_item_extended_metadata_obj(trans.sa_session, trans.get_user(), item)
+ self.delete_extended_metadata(trans.sa_session, trans.get_user(), ex_obj)
+ ex_obj = self.create_extended_metadata(trans.sa_session, trans.get_user(), payload)
+ self.set_item_extended_metadata_obj(trans.sa_session, trans.get_user(), item, ex_obj)
+
+ @web.expose_api
+ def delete( self, trans, **kwd ):
+ idnum = kwd[self.tagged_item_id]
+ item = self._get_item_from_id(trans, idnum)
+ if item is not None:
+ ex_obj = self.get_item_extended_metadata_obj(trans.sa_session, trans.get_user(), item)
+ if ex_obj is not None:
+ self.unlink_item_extended_metadata_obj(trans.sa_session, trans.get_user(), item)
+ self.delete_extended_metadata(trans.sa_session, trans.get_user(), ex_obj)
+
+ @web.expose_api
+ def undelete( self, trans, **kwd ):
+ raise HTTPNotImplemented()
+
+class LibraryDatasetExtendMetadataController(BaseExtendedMetadataController):
+ controller_name = "library_dataset_extended_metadata"
+ exmeta_item_id = "library_content_id"
+ def _get_item_from_id(self, trans, idstr):
+ hist = self.get_library_dataset_dataset_association( trans, idstr )
+ return hist
+
diff -r 7073d786cad0f0d251d4ea9b2c42171f698cf953 -r 970d44713796453e3356c6273213bccf3eec7f38 lib/galaxy/webapps/galaxy/buildapp.py
--- a/lib/galaxy/webapps/galaxy/buildapp.py
+++ b/lib/galaxy/webapps/galaxy/buildapp.py
@@ -107,6 +107,11 @@
_add_item_tags_controller( webapp,
name_prefix="workflow_",
path_prefix='/api/workflows/:workflow_id' )
+
+ _add_item_extended_metadata_controller( webapp,
+ name_prefix="library_dataset_",
+ path_prefix='/api/libraries/:library_id/contents/:library_content_id' )
+
webapp.api_mapper.resource( 'dataset', 'datasets', path_prefix='/api' )
webapp.api_mapper.resource_with_deleted( 'library', 'libraries', path_prefix='/api' )
webapp.api_mapper.resource( 'sample', 'samples', path_prefix='/api' )
@@ -187,6 +192,12 @@
conditions=dict(method=["GET"]))
+def _add_item_extended_metadata_controller( webapp, name_prefix, path_prefix, **kwd ):
+ controller = "%sextended_metadata" % name_prefix
+ name = "%sextended_metadata" % name_prefix
+ webapp.api_mapper.resource(name, "extended_metadata", path_prefix=path_prefix, controller=controller)
+
+
def wrap_in_middleware( app, global_conf, **local_conf ):
"""
Based on the configuration wrap `app` in a set of common and useful
https://bitbucket.org/galaxy/galaxy-central/commits/7b916e959c09/
Changeset: 7b916e959c09
User: kellrott
Date: 2013-01-18 17:46:16
Summary: Removing the '*' imports from the extended_metadata API module
Affected #: 1 file
diff -r 970d44713796453e3356c6273213bccf3eec7f38 -r 7b916e959c09083e541d1daaf573b1af995ed152 lib/galaxy/webapps/galaxy/api/extended_metadata.py
--- a/lib/galaxy/webapps/galaxy/api/extended_metadata.py
+++ b/lib/galaxy/webapps/galaxy/api/extended_metadata.py
@@ -4,10 +4,9 @@
import logging, os, string, shutil, urllib, re, socket
from cgi import escape, FieldStorage
from galaxy import util, datatypes, jobs, web, util
-from galaxy.web.base.controller import *
-from galaxy.model.item_attrs import *
+from galaxy.web.base.controller import BaseAPIController, UsesHistoryMixin, UsesLibraryMixinItems, UsesHistoryDatasetAssociationMixin, UsesStoredWorkflowMixin
+from galaxy.model.item_attrs import UsesExtendedMetadata
from galaxy.util.sanitize_html import sanitize_html
-from galaxy.model.orm import *
import galaxy.datatypes
from galaxy.util.bunch import Bunch
https://bitbucket.org/galaxy/galaxy-central/commits/d603df6f2e79/
Changeset: d603df6f2e79
User: kellrott
Date: 2013-01-23 20:29:30
Summary: Central Merge
Affected #: 248 files
diff -r 7b916e959c09083e541d1daaf573b1af995ed152 -r d603df6f2e7978f9c75f64190bfba97bbf59e851 .hgignore
--- a/.hgignore
+++ b/.hgignore
@@ -16,6 +16,7 @@
database/community_files
database/compiled_templates
database/files
+database/job_working_directory
database/pbs
database/tmp
database/*.sqlite
@@ -23,6 +24,11 @@
# Python bytecode
*.pyc
+# Tool Shed Runtime Files
+community_webapp.log
+community_webapp.pid
+hgweb.config*
+
# Config files
universe_wsgi.ini
reports_wsgi.ini
diff -r 7b916e959c09083e541d1daaf573b1af995ed152 -r d603df6f2e7978f9c75f64190bfba97bbf59e851 LICENSE.txt
--- a/LICENSE.txt
+++ b/LICENSE.txt
@@ -1,25 +1,186 @@
-Copyright (c) 2005 Pennsylvania State University
+Copyright (c) 2005-2013 Pennsylvania State University
-Permission is hereby granted, free of charge, to any person obtaining
-a copy of this software and associated documentation files (the
-"Software"), to deal in the Software without restriction, including
-without limitation the rights to use, copy, modify, merge, publish,
-distribute, sublicense, and/or sell copies of the Software, and to
-permit persons to whom the Software is furnished to do so, subject to
-the following conditions:
+Licensed under the Academic Free License version 3.0
-The above copyright notice and this permission notice shall be
-included in all copies or substantial portions of the Software.
+ 1) Grant of Copyright License. Licensor grants You a worldwide, royalty-free,
+ non-exclusive, sublicensable license, for the duration of the copyright, to
+ do the following:
-THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
-EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
-MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
-IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
-CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
-TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
-SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
+ a) to reproduce the Original Work in copies, either alone or as part of a
+ collective work;
+
+ b) to translate, adapt, alter, transform, modify, or arrange the Original
+ Work, thereby creating derivative works ("Derivative Works") based upon
+ the Original Work;
+
+ c) to distribute or communicate copies of the Original Work and Derivative
+ Works to the public, under any license of your choice that does not
+ contradict the terms and conditions, including Licensor's reserved
+ rights and remedies, in this Academic Free License;
+
+ d) to perform the Original Work publicly; and
+
+ e) to display the Original Work publicly.
+
+ 2) Grant of Patent License. Licensor grants You a worldwide, royalty-free,
+ non-exclusive, sublicensable license, under patent claims owned or
+ controlled by the Licensor that are embodied in the Original Work as
+ furnished by the Licensor, for the duration of the patents, to make, use,
+ sell, offer for sale, have made, and import the Original Work and
+ Derivative Works.
+
+ 3) Grant of Source Code License. The term "Source Code" means the preferred
+ form of the Original Work for making modifications to it and all available
+ documentation describing how to modify the Original Work. Licensor agrees
+ to provide a machine-readable copy of the Source Code of the Original Work
+ along with each copy of the Original Work that Licensor distributes.
+ Licensor reserves the right to satisfy this obligation by placing a
+ machine-readable copy of the Source Code in an information repository
+ reasonably calculated to permit inexpensive and convenient access by You
+ for as long as Licensor continues to distribute the Original Work.
+
+ 4) Exclusions From License Grant. Neither the names of Licensor, nor the
+ names of any contributors to the Original Work, nor any of their
+ trademarks or service marks, may be used to endorse or promote products
+ derived from this Original Work without express prior permission of the
+ Licensor. Except as expressly stated herein, nothing in this License
+ grants any license to Licensor's trademarks, copyrights, patents, trade
+ secrets or any other intellectual property. No patent license is granted
+ to make, use, sell, offer for sale, have made, or import embodiments of
+ any patent claims other than the licensed claims defined in Section 2.
+ No license is granted to the trademarks of Licensor even if such marks
+ are included in the Original Work. Nothing in this License shall be
+ interpreted to prohibit Licensor from licensing under terms different
+ from this License any Original Work that Licensor otherwise would have a
+ right to license.
+
+ 5) External Deployment. The term "External Deployment" means the use,
+ distribution, or communication of the Original Work or Derivative Works
+ in any way such that the Original Work or Derivative Works may be used by
+ anyone other than You, whether those works are distributed or
+ communicated to those persons or made available as an application
+ intended for use over a network. As an express condition for the grants
+ of license hereunder, You must treat any External Deployment by You of
+ the Original Work or a Derivative Work as a distribution under
+ section 1(c).
+
+ 6) Attribution Rights. You must retain, in the Source Code of any Derivative
+ Works that You create, all copyright, patent, or trademark notices from
+ the Source Code of the Original Work, as well as any notices of licensing
+ and any descriptive text identified therein as an "Attribution Notice."
+ You must cause the Source Code for any Derivative Works that You create
+ to carry a prominent Attribution Notice reasonably calculated to inform
+ recipients that You have modified the Original Work.
+
+ 7) Warranty of Provenance and Disclaimer of Warranty. Licensor warrants that
+ the copyright in and to the Original Work and the patent rights granted
+ herein by Licensor are owned by the Licensor or are sublicensed to You
+ under the terms of this License with the permission of the contributor(s)
+ of those copyrights and patent rights. Except as expressly stated in the
+ immediately preceding sentence, the Original Work is provided under this
+ License on an "AS IS" BASIS and WITHOUT WARRANTY, either express or
+ implied, including, without limitation, the warranties of
+ non-infringement, merchantability or fitness for a particular purpose.
+ THE ENTIRE RISK AS TO THE QUALITY OF THE ORIGINAL WORK IS WITH YOU. This
+ DISCLAIMER OF WARRANTY constitutes an essential part of this License.
+ No license to the Original Work is granted by this License except under
+ this disclaimer.
+
+ 8) Limitation of Liability. Under no circumstances and under no legal
+ theory, whether in tort (including negligence), contract, or otherwise,
+ shall the Licensor be liable to anyone for any indirect, special,
+ incidental, or consequential damages of any character arising as a result
+ of this License or the use of the Original Work including, without
+ limitation, damages for loss of goodwill, work stoppage, computer failure
+ or malfunction, or any and all other commercial damages or losses. This
+ limitation of liability shall not apply to the extent applicable law
+ prohibits such limitation.
+
+ 9) Acceptance and Termination. If, at any time, You expressly assented to
+ this License, that assent indicates your clear and irrevocable acceptance
+ of this License and all of its terms and conditions. If You distribute or
+ communicate copies of the Original Work or a Derivative Work, You must
+ make a reasonable effort under the circumstances to obtain the express
+ assent of recipients to the terms of this License. This License
+ conditions your rights to undertake the activities listed in Section 1,
+ including your right to create Derivative Works based upon the Original
+ Work, and doing so without honoring these terms and conditions is
+ prohibited by copyright law and international treaty. Nothing in this
+ License is intended to affect copyright exceptions and limitations
+ (including "fair use" or "fair dealing"). This License shall terminate
+ immediately and You may no longer exercise any of the rights granted to
+ You by this License upon your failure to honor the conditions in
+ Section 1(c).
+
+10) Termination for Patent Action. This License shall terminate
+ automatically and You may no longer exercise any of the rights granted
+ to You by this License as of the date You commence an action, including
+ a cross-claim or counterclaim, against Licensor or any licensee alleging
+ that the Original Work infringes a patent. This termination provision
+ shall not apply for an action alleging patent infringement by
+ combinations of the Original Work with other software or hardware.
+
+11) Jurisdiction, Venue and Governing Law. Any action or suit relating to
+ this License may be brought only in the courts of a jurisdiction wherein
+ the Licensor resides or in which Licensor conducts its primary business,
+ and under the laws of that jurisdiction excluding its conflict-of-law
+ provisions. The application of the United Nations Convention on
+ Contracts for the International Sale of Goods is expressly excluded. Any
+ use of the Original Work outside the scope of this License or after its
+ termination shall be subject to the requirements and penalties of
+ copyright or patent law in the appropriate jurisdiction. This section
+ shall survive the termination of this License.
+
+12) Attorneys' Fees. In any action to enforce the terms of this License or
+ seeking damages relating thereto, the prevailing party shall be entitled
+ to recover its costs and expenses, including, without limitation,
+ reasonable attorneys' fees and costs incurred in connection with such
+ action, including any appeal of such action. This section shall survive
+ the termination of this License.
+
+13) Miscellaneous. If any provision of this License is held to be
+ unenforceable, such provision shall be reformed only to the extent
+ necessary to make it enforceable.
+
+14) Definition of "You" in This License. "You" throughout this License,
+ whether in upper or lower case, means an individual or a legal entity
+ exercising rights under, and complying with all of the terms of, this
+ License. For legal entities, "You" includes any entity that controls, is
+ controlled by, or is under common control with you. For purposes of this
+ definition, "control" means (i) the power, direct or indirect, to cause
+ the direction or management of such entity, whether by contract or
+ otherwise, or (ii) ownership of fifty percent (50%) or more of the
+ outstanding shares, or (iii) beneficial ownership of such entity.
+
+15) Right to Use. You may use the Original Work in all ways not otherwise
+ restricted or conditioned by this License or by law, and Licensor
+ promises not to interfere with or be responsible for such uses by You.
+
+16) Modification of This License. This License is Copyright © 2005 Lawrence
+ Rosen. Permission is granted to copy, distribute, or communicate this
+ License without modification. Nothing in this License permits You to
+ modify this License as applied to the Original Work or to Derivative
+ Works. However, You may modify the text of this License and copy,
+ distribute or communicate your modified version (the "Modified
+ License") and apply it to other original works of authorship subject to
+ the following conditions: (i) You may not indicate in any way that your
+ Modified License is the "Academic Free License" or "AFL" and you may not
+ use those names in the name of your Modified License; (ii) You must
+ replace the notice specified in the first paragraph above with the
+ notice "Licensed under <insert your license name here>" or with a notice
+ of your own that is not confusingly similar to the notice in this
+ License; and (iii) You may not claim that your original works are open
+ source software unless your Modified License has been approved by Open
+ Source Initiative (OSI) and You comply with its license review and
+ certification process.
+
Some icons found in Galaxy are from the Silk Icons set, available under
the Creative Commons Attribution 2.5 License, from:
http://www.famfamfam.com/lab/icons/silk/
+
+
+Other images and documentation are licensed under the Creative Commons Attribution 3.0 (CC BY 3.0) License. See
+
+http://creativecommons.org/licenses/by/3.0/
diff -r 7b916e959c09083e541d1daaf573b1af995ed152 -r d603df6f2e7978f9c75f64190bfba97bbf59e851 doc/source/lib/galaxy.jobs.runners.rst
--- a/doc/source/lib/galaxy.jobs.runners.rst
+++ b/doc/source/lib/galaxy.jobs.runners.rst
@@ -72,4 +72,5 @@
galaxy.jobs.runners.cli_job
galaxy.jobs.runners.cli_shell
+ galaxy.jobs.runners.lwr_client
diff -r 7b916e959c09083e541d1daaf573b1af995ed152 -r d603df6f2e7978f9c75f64190bfba97bbf59e851 doc/source/lib/galaxy.util.rst
--- a/doc/source/lib/galaxy.util.rst
+++ b/doc/source/lib/galaxy.util.rst
@@ -159,4 +159,5 @@
.. toctree::
galaxy.util.backports
+ galaxy.util.log
diff -r 7b916e959c09083e541d1daaf573b1af995ed152 -r d603df6f2e7978f9c75f64190bfba97bbf59e851 doc/source/lib/galaxy.webapps.community.controllers.rst
--- a/doc/source/lib/galaxy.webapps.community.controllers.rst
+++ b/doc/source/lib/galaxy.webapps.community.controllers.rst
@@ -65,11 +65,3 @@
:undoc-members:
:show-inheritance:
-:mod:`workflow` Module
-----------------------
-
-.. automodule:: galaxy.webapps.community.controllers.workflow
- :members:
- :undoc-members:
- :show-inheritance:
-
diff -r 7b916e959c09083e541d1daaf573b1af995ed152 -r d603df6f2e7978f9c75f64190bfba97bbf59e851 doc/source/lib/galaxy.webapps.community.util.rst
--- a/doc/source/lib/galaxy.webapps.community.util.rst
+++ b/doc/source/lib/galaxy.webapps.community.util.rst
@@ -25,3 +25,11 @@
:undoc-members:
:show-inheritance:
+:mod:`workflow_util` Module
+---------------------------
+
+.. automodule:: galaxy.webapps.community.util.workflow_util
+ :members:
+ :undoc-members:
+ :show-inheritance:
+
diff -r 7b916e959c09083e541d1daaf573b1af995ed152 -r d603df6f2e7978f9c75f64190bfba97bbf59e851 doc/source/lib/galaxy.webapps.galaxy.api.rst
--- a/doc/source/lib/galaxy.webapps.galaxy.api.rst
+++ b/doc/source/lib/galaxy.webapps.galaxy.api.rst
@@ -23,10 +23,6 @@
Quickstart
==========
-Set the following option in universe_wsgi.ini and start the server::
-
- enable_api = True
-
Log in as your user, navigate to the API Keys page in the User menu, and
generate a new API key. Make a note of the API key, and then pull up a
terminal. Now we'll use the display.py script in your galaxy/scripts/api
diff -r 7b916e959c09083e541d1daaf573b1af995ed152 -r d603df6f2e7978f9c75f64190bfba97bbf59e851 eggs.ini
--- a/eggs.ini
+++ b/eggs.ini
@@ -29,6 +29,7 @@
simplejson = 2.1.1
threadframe = 0.2
guppy = 0.1.8
+; msgpack_python = 0.2.4
[eggs:noplatform]
amqplib = 0.6.1
@@ -65,6 +66,7 @@
Babel = 0.9.4
wchartype = 0.1
Whoosh = 0.3.18
+; fluent_logger = 0.3.3
; extra version information
[tags]
diff -r 7b916e959c09083e541d1daaf573b1af995ed152 -r d603df6f2e7978f9c75f64190bfba97bbf59e851 lib/galaxy/app.py
--- a/lib/galaxy/app.py
+++ b/lib/galaxy/app.py
@@ -29,6 +29,7 @@
self.config = config.Configuration( **kwargs )
self.config.check()
config.configure_logging( self.config )
+ self.configure_fluent_log()
# Determine the database url
if self.config.database_connection:
db_url = self.config.database_connection
@@ -54,7 +55,8 @@
db_url,
self.config.database_engine_options,
database_query_profiling_proxy = self.config.database_query_profiling_proxy,
- object_store = self.object_store )
+ object_store = self.object_store,
+ trace_logger=self.trace_logger )
# Manage installed tool shed repositories.
self.installed_repository_manager = galaxy.tool_shed.InstalledRepositoryManager( self )
# Create an empty datatypes registry.
@@ -149,6 +151,7 @@
self.job_stop_queue = self.job_manager.job_stop_queue
# Initialize the external service types
self.external_service_types = external_service_types.ExternalServiceTypesCollection( self.config.external_service_type_config_file, self.config.external_service_type_path, self )
+
def shutdown( self ):
self.job_manager.shutdown()
self.object_store.shutdown()
@@ -161,3 +164,10 @@
os.unlink( self.datatypes_registry.integrated_datatypes_configs )
except:
pass
+
+ def configure_fluent_log( self ):
+ if self.config.fluent_log:
+ from galaxy.util.log.fluent_log import FluentTraceLogger
+ self.trace_logger = FluentTraceLogger( 'galaxy', self.config.fluent_host, self.config.fluent_port )
+ else:
+ self.trace_logger = None
diff -r 7b916e959c09083e541d1daaf573b1af995ed152 -r d603df6f2e7978f9c75f64190bfba97bbf59e851 lib/galaxy/config.py
--- a/lib/galaxy/config.py
+++ b/lib/galaxy/config.py
@@ -261,6 +261,10 @@
self.api_folders = string_as_bool( kwargs.get( 'api_folders', False ) )
# This is for testing new library browsing capabilities.
self.new_lib_browse = string_as_bool( kwargs.get( 'new_lib_browse', False ) )
+ # Logging with fluentd
+ self.fluent_log = string_as_bool( kwargs.get( 'fluent_log', False ) )
+ self.fluent_host = kwargs.get( 'fluent_host', 'localhost' )
+ self.fluent_port = int( kwargs.get( 'fluent_port', 24224 ) )
def __read_tool_job_config( self, global_conf_parser, section, key ):
try:
diff -r 7b916e959c09083e541d1daaf573b1af995ed152 -r d603df6f2e7978f9c75f64190bfba97bbf59e851 lib/galaxy/datatypes/registry.py
--- a/lib/galaxy/datatypes/registry.py
+++ b/lib/galaxy/datatypes/registry.py
@@ -114,7 +114,8 @@
self.datatype_elems.remove( in_memory_elem )
else:
# Keep an in-memory list of datatype elems to enable persistence.
- self.datatype_elems.append( elem )
+ if extension not in self.datatypes_by_extension:
+ self.datatype_elems.append( elem )
if extension and extension in self.datatypes_by_extension and deactivate:
# We are deactivating an installed tool shed repository, so eliminate the datatype from the registry.
# TODO: Handle deactivating datatype converters, etc before removing from self.datatypes_by_extension.
diff -r 7b916e959c09083e541d1daaf573b1af995ed152 -r d603df6f2e7978f9c75f64190bfba97bbf59e851 lib/galaxy/datatypes/tabular.py
--- a/lib/galaxy/datatypes/tabular.py
+++ b/lib/galaxy/datatypes/tabular.py
@@ -13,7 +13,7 @@
from galaxy.datatypes import metadata
from galaxy.datatypes.checkers import is_gzip
from galaxy.datatypes.metadata import MetadataElement
-from galaxy.datatypes.sniff import get_headers
+from galaxy.datatypes.sniff import get_headers, get_test_fname
from galaxy.util.json import to_json_string
log = logging.getLogger(__name__)
@@ -267,20 +267,20 @@
def display_data(self, trans, dataset, preview=False, filename=None, to_ext=None, chunk=None):
if chunk:
return self.get_chunk(trans, dataset, chunk)
+ elif to_ext or not preview:
+ return self._serve_raw(trans, dataset, to_ext)
elif dataset.metadata.columns > 50:
#Fancy tabular display is only suitable for datasets without an incredibly large number of columns.
#We should add a new datatype 'matrix', with it's own draw method, suitable for this kind of data.
#For now, default to the old behavior, ugly as it is. Remove this after adding 'matrix'.
max_peek_size = 1000000 # 1 MB
- if not preview or os.stat( dataset.file_name ).st_size < max_peek_size:
+ if os.stat( dataset.file_name ).st_size < max_peek_size:
return open( dataset.file_name )
else:
trans.response.set_content_type( "text/html" )
return trans.stream_template_mako( "/dataset/large_file.mako",
truncated_data = open( dataset.file_name ).read(max_peek_size),
data = dataset)
- elif to_ext or not preview:
- return self._serve_raw(trans, dataset, to_ext)
else:
column_names = 'null'
if dataset.metadata.column_names:
diff -r 7b916e959c09083e541d1daaf573b1af995ed152 -r d603df6f2e7978f9c75f64190bfba97bbf59e851 lib/galaxy/jobs/runners/__init__.py
--- a/lib/galaxy/jobs/runners/__init__.py
+++ b/lib/galaxy/jobs/runners/__init__.py
@@ -152,6 +152,7 @@
nworkers = self.app.config.cluster_job_queue_workers
for i in range( nworkers ):
worker = threading.Thread( name="%s.work_thread-%d" % (self.runner_name, i), target=self.run_next )
+ worker.setDaemon( True )
worker.start()
self.work_threads.append( worker )
diff -r 7b916e959c09083e541d1daaf573b1af995ed152 -r d603df6f2e7978f9c75f64190bfba97bbf59e851 lib/galaxy/jobs/runners/drmaa.py
--- a/lib/galaxy/jobs/runners/drmaa.py
+++ b/lib/galaxy/jobs/runners/drmaa.py
@@ -400,16 +400,18 @@
def stop_job( self, job ):
"""Attempts to delete a job from the DRM queue"""
try:
+ ext_id = job.get_job_runner_external_id()
+ assert ext_id not in ( None, 'None' ), 'External job id is None'
if self.external_killJob_script is None:
- self.ds.control( job.get_job_runner_external_id(), drmaa.JobControlAction.TERMINATE )
+ self.ds.control( ext_id, drmaa.JobControlAction.TERMINATE )
else:
# FIXME: hardcoded path
- subprocess.Popen( [ '/usr/bin/sudo', '-E', self.external_killJob_script, str( job.get_job_runner_external_id() ), str( self.userid ) ], shell=False )
- log.debug( "(%s/%s) Removed from DRM queue at user's request" % ( job.get_id(), job.get_job_runner_external_id() ) )
+ subprocess.Popen( [ '/usr/bin/sudo', '-E', self.external_killJob_script, str( ext_id ), str( self.userid ) ], shell=False )
+ log.debug( "(%s/%s) Removed from DRM queue at user's request" % ( job.get_id(), ext_id ) )
except drmaa.InvalidJobException:
- log.debug( "(%s/%s) User killed running job, but it was already dead" % ( job.get_id(), job.get_job_runner_external_id() ) )
+ log.debug( "(%s/%s) User killed running job, but it was already dead" % ( job.get_id(), ext_id ) )
except Exception, e:
- log.debug( "(%s/%s) User killed running job, but error encountered removing from DRM queue: %s" % ( job.get_id(), job.get_job_runner_external_id(), e ) )
+ log.debug( "(%s/%s) User killed running job, but error encountered removing from DRM queue: %s" % ( job.get_id(), ext_id, e ) )
def recover( self, job, job_wrapper ):
"""Recovers jobs stuck in the queued/running state when Galaxy started"""
diff -r 7b916e959c09083e541d1daaf573b1af995ed152 -r d603df6f2e7978f9c75f64190bfba97bbf59e851 lib/galaxy/jobs/runners/lwr.py
--- a/lib/galaxy/jobs/runners/lwr.py
+++ b/lib/galaxy/jobs/runners/lwr.py
@@ -2,7 +2,7 @@
import subprocess
from galaxy import model
-from galaxy.jobs.runners import ClusterJobState, ClusterJobRunner
+from galaxy.jobs.runners import ClusterJobState, ClusterJobRunner, STOP_SIGNAL
import errno
from time import sleep
@@ -174,8 +174,9 @@
def shutdown( self ):
"""Attempts to gracefully shut down the worker threads"""
log.info( "sending stop signal to worker threads" )
- for i in range( len( self.threads ) ):
- self.queue.put( self.STOP_SIGNAL )
+ self.monitor_queue.put( STOP_SIGNAL )
+ for i in range( len( self.work_threads ) ):
+ self.work_queue.put( ( STOP_SIGNAL, None ) )
log.info( "local job runner stopped" )
def check_pid( self, pid ):
diff -r 7b916e959c09083e541d1daaf573b1af995ed152 -r d603df6f2e7978f9c75f64190bfba97bbf59e851 lib/galaxy/model/__init__.py
--- a/lib/galaxy/model/__init__.py
+++ b/lib/galaxy/model/__init__.py
@@ -1335,7 +1335,9 @@
# Loop through sources until viable one is found.
for source in source_list:
msg = self.convert_dataset( trans, source )
- if msg == self.conversion_messages.PENDING:
+ # No message or PENDING means that source is viable. No
+ # message indicates conversion was done and is successful.
+ if not msg or msg == self.conversion_messages.PENDING:
data_source = source
break
@@ -3150,22 +3152,32 @@
return self.deleted
@property
def has_repository_dependencies( self ):
- return self.metadata and 'repository_dependencies' in self.metadata
+ if self.metadata:
+ return 'repository_dependencies' in self.metadata
+ return False
@property
def includes_tools( self ):
- return self.metadata and 'tools' in self.metadata
+ if self.metadata:
+ return 'tools' in self.metadata
+ return False
@property
def includes_tool_dependencies( self ):
- return self.metadata and 'tool_dependencies' in self.metadata
+ if self.metadata:
+ return 'tool_dependencies' in self.metadata
+ return False
@property
def includes_workflows( self ):
- return self.metadata and 'workflows' in self.metadata
+ if self.metadata:
+ return 'workflows' in self.metadata
+ return False
@property
def in_error_state( self ):
return self.status == self.installation_status.ERROR
@property
def has_readme_files( self ):
- return self.metadata and 'readme_files' in self.metadata
+ if self.metadata:
+ return 'readme_files' in self.metadata
+ return False
@property
def repository_dependencies( self ):
required_repositories = []
diff -r 7b916e959c09083e541d1daaf573b1af995ed152 -r d603df6f2e7978f9c75f64190bfba97bbf59e851 lib/galaxy/model/item_attrs.py
--- a/lib/galaxy/model/item_attrs.py
+++ b/lib/galaxy/model/item_attrs.py
@@ -137,6 +137,12 @@
# Set annotation.
annotation_assoc.annotation = annotation
return annotation_assoc
+
+ def delete_item_annotation( self, db_session, user, item):
+ annotation_assoc = self.get_item_annotation_obj( db_session, user, item )
+ if annotation_assoc:
+ db_session.delete(annotation_assoc)
+ db_session.flush()
def copy_item_annotation( self, db_session, source_user, source_item, target_user, target_item ):
""" Copy an annotation from a user/item source to a user/item target. """
diff -r 7b916e959c09083e541d1daaf573b1af995ed152 -r d603df6f2e7978f9c75f64190bfba97bbf59e851 lib/galaxy/model/mapping.py
--- a/lib/galaxy/model/mapping.py
+++ b/lib/galaxy/model/mapping.py
@@ -1950,7 +1950,7 @@
# Let this go, it could possibly work with db's we don't support
log.error( "database_connection contains an unknown SQLAlchemy database dialect: %s" % dialect )
-def init( file_path, url, engine_options={}, create_tables=False, database_query_profiling_proxy=False, object_store=None ):
+def init( file_path, url, engine_options={}, create_tables=False, database_query_profiling_proxy=False, object_store=None, trace_logger=None ):
"""Connect mappings to the database"""
# Connect dataset to the file path
Dataset.file_path = file_path
@@ -1962,6 +1962,10 @@
if database_query_profiling_proxy:
import galaxy.model.orm.logging_connection_proxy as logging_connection_proxy
proxy = logging_connection_proxy.LoggingProxy()
+ # If metlog is enabled, do micrologging
+ elif trace_logger:
+ import galaxy.model.orm.logging_connection_proxy as logging_connection_proxy
+ proxy = logging_connection_proxy.TraceLoggerProxy( trace_logger )
else:
proxy = None
# Create the database engine
diff -r 7b916e959c09083e541d1daaf573b1af995ed152 -r d603df6f2e7978f9c75f64190bfba97bbf59e851 lib/galaxy/model/migrate/versions/0108_add_extended_metadata.py
--- a/lib/galaxy/model/migrate/versions/0108_add_extended_metadata.py
+++ b/lib/galaxy/model/migrate/versions/0108_add_extended_metadata.py
@@ -61,13 +61,20 @@
def downgrade():
metadata.reflect()
- ExtendedMetadata_table.drop()
- ExtendedMetadataIndex_table.drop()
+ try:
+ ExtendedMetadataIndex_table.drop()
+ except Exception, e:
+ log.debug( "Dropping 'extended_metadata_index' table failed: %s" % ( str( e ) ) )
+
+ try:
+ ExtendedMetadata_table.drop()
+ except Exception, e:
+ log.debug( "Dropping 'extended_metadata' table failed: %s" % ( str( e ) ) )
- # Drop the Job table's exit_code column.
+ # Drop the LDDA table's extended metadata ID column.
try:
- job_table = Table( "library_dataset_dataset_association", metadata, autoload=True )
- extended_metadata_id = job_table.c.extended_metadata_id
+ ldda_table = Table( "library_dataset_dataset_association", metadata, autoload=True )
+ extended_metadata_id = ldda_table.c.extended_metadata_id
extended_metadata_id.drop()
except Exception, e:
log.debug( "Dropping 'extended_metadata_id' column from library_dataset_dataset_association table failed: %s" % ( str( e ) ) )
diff -r 7b916e959c09083e541d1daaf573b1af995ed152 -r d603df6f2e7978f9c75f64190bfba97bbf59e851 lib/galaxy/model/orm/logging_connection_proxy.py
--- a/lib/galaxy/model/orm/logging_connection_proxy.py
+++ b/lib/galaxy/model/orm/logging_connection_proxy.py
@@ -18,13 +18,31 @@
rval = []
for frame, fname, line, funcname, _, _ in inspect.stack()[2:]:
rval.append( "%s:%s@%d" % ( stripwd( fname ), funcname, line ) )
- return " > ".join( rval )
+ return rval
class LoggingProxy(ConnectionProxy):
+ """
+ Logs SQL statements using standard logging module
+ """
def cursor_execute(self, execute, cursor, statement, parameters, context, executemany):
start = time.clock()
rval = execute(cursor, statement, parameters, context)
duration = time.clock() - start
log.debug( "statement: %r parameters: %r executemany: %r duration: %r stack: %r",
- statement, parameters, executemany, duration, pretty_stack() )
+ statement, parameters, executemany, duration, " > ".join( pretty_stack() ) )
return rval
+
+class TraceLoggerProxy(ConnectionProxy):
+ """
+ Logs SQL statements using a metlog client
+ """
+ def __init__( self, trace_logger ):
+ self.trace_logger = trace_logger
+ def cursor_execute(self, execute, cursor, statement, parameters, context, executemany):
+ start = time.clock()
+ rval = execute(cursor, statement, parameters, context)
+ duration = time.clock() - start
+ self.trace_logger.log( "sqlalchemy_query",
+ message="Query executed", statement=statement, parameters=parameters,
+ executemany=executemany, duration=duration )
+ return rval
\ No newline at end of file
diff -r 7b916e959c09083e541d1daaf573b1af995ed152 -r d603df6f2e7978f9c75f64190bfba97bbf59e851 lib/galaxy/tool_shed/install_manager.py
--- a/lib/galaxy/tool_shed/install_manager.py
+++ b/lib/galaxy/tool_shed/install_manager.py
@@ -284,7 +284,6 @@
relative_install_dir = os.path.join( relative_clone_dir, name )
install_dir = os.path.join( clone_dir, name )
ctx_rev = suc.get_ctx_rev( tool_shed_url, name, self.repository_owner, installed_changeset_revision )
- print "Adding new row (or updating an existing row) for repository '%s' in the tool_shed_repository table." % name
tool_shed_repository = suc.create_or_update_tool_shed_repository( app=self.app,
name=name,
description=description,
diff -r 7b916e959c09083e541d1daaf573b1af995ed152 -r d603df6f2e7978f9c75f64190bfba97bbf59e851 lib/galaxy/tool_shed/update_manager.py
--- a/lib/galaxy/tool_shed/update_manager.py
+++ b/lib/galaxy/tool_shed/update_manager.py
@@ -4,6 +4,7 @@
import threading, urllib2, logging
from galaxy.util import string_as_bool
import galaxy.util.shed_util as shed_util
+from galaxy.model.orm import and_
log = logging.getLogger( __name__ )
diff -r 7b916e959c09083e541d1daaf573b1af995ed152 -r d603df6f2e7978f9c75f64190bfba97bbf59e851 lib/galaxy/tools/__init__.py
--- a/lib/galaxy/tools/__init__.py
+++ b/lib/galaxy/tools/__init__.py
@@ -813,7 +813,7 @@
self.attributes['split_size'] = 20
self.attributes['split_mode'] = 'number_of_parts'
-class Tool:
+class Tool( object ):
"""
Represents a computational tool that can be executed through Galaxy.
"""
@@ -1600,9 +1600,11 @@
try:
possible_cases.remove( case.value )
except:
- log.warning( "A when tag has been defined for '%s (%s) --> %s', but does not appear to be selectable." % ( group.name, group.test_param.name, case.value ) )
+ log.warning( "Tool %s: a when tag has been defined for '%s (%s) --> %s', but does not appear to be selectable." %
+ ( self.id, group.name, group.test_param.name, case.value ) )
for unspecified_case in possible_cases:
- log.warning( "A when tag has not been defined for '%s (%s) --> %s', assuming empty inputs." % ( group.name, group.test_param.name, unspecified_case ) )
+ log.warning( "Tool %s: a when tag has not been defined for '%s (%s) --> %s', assuming empty inputs." %
+ ( self.id, group.name, group.test_param.name, unspecified_case ) )
case = ConditionalWhen()
case.value = unspecified_case
case.inputs = odict()
@@ -2128,16 +2130,16 @@
return params_to_strings( self.inputs, params, app )
def params_from_strings( self, params, app, ignore_errors=False ):
return params_from_strings( self.inputs, params, app, ignore_errors )
- def check_and_update_param_values( self, values, trans ):
+ def check_and_update_param_values( self, values, trans, update_values=True ):
"""
Check that all parameters have values, and fill in with default
values where necessary. This could be called after loading values
from a database in case new parameters have been added.
"""
messages = {}
- self.check_and_update_param_values_helper( self.inputs, values, trans, messages )
+ self.check_and_update_param_values_helper( self.inputs, values, trans, messages, update_values=update_values )
return messages
- def check_and_update_param_values_helper( self, inputs, values, trans, messages, context=None, prefix="" ):
+ def check_and_update_param_values_helper( self, inputs, values, trans, messages, context=None, prefix="", update_values=True ):
"""
Recursive helper for `check_and_update_param_values_helper`
"""
@@ -2181,7 +2183,13 @@
self.check_and_update_param_values_helper( input.cases[current].inputs, group_values, trans, messages, context, prefix )
else:
# Regular tool parameter, no recursion needed
- pass
+ try:
+ # This will fail when a parameter's type has changed to an incompatible one, e.g. a conditional group changed to a dataset input.
+ input.value_from_basic( input.value_to_basic( values[ input.name ], trans.app ), trans.app, ignore_errors=False )
+ except:
+ messages[ input.name ] = "Value no longer valid for '%s%s', replaced with default" % ( prefix, input.label )
+ if update_values:
+ values[ input.name ] = input.get_initial_value( trans, context )
def handle_unvalidated_param_values( self, input_values, app ):
"""
Find any instances of `UnvalidatedValue` within input_values and
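
The new parameter check above is a round trip through the basic (persistable) representation: a stored value that can no longer survive value_to_basic()/value_from_basic() for the parameter's current type is reported and, when update_values is True, replaced with the default. A standalone sketch of that pattern (revalidate and its arguments are hypothetical):

    def revalidate( param, values, trans, messages, prefix="", update_values=True ):
        # param is any tool parameter with value_to_basic/value_from_basic/get_initial_value
        try:
            param.value_from_basic( param.value_to_basic( values[ param.name ], trans.app ),
                                    trans.app, ignore_errors=False )
        except Exception:
            messages[ param.name ] = "Value no longer valid for '%s%s', replaced with default" % ( prefix, param.label )
            if update_values:
                values[ param.name ] = param.get_initial_value( trans, {} )
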
diff -r 7b916e959c09083e541d1daaf573b1af995ed152 -r d603df6f2e7978f9c75f64190bfba97bbf59e851 lib/galaxy/tools/parameters/basic.py
--- a/lib/galaxy/tools/parameters/basic.py
+++ b/lib/galaxy/tools/parameters/basic.py
@@ -747,9 +747,9 @@
return { "__class__": "RuntimeValue" }
return value
def value_from_basic( self, value, app, ignore_errors=False ):
- if isinstance( value, dict ) and value["__class__"] == "UnvalidatedValue":
+ if isinstance( value, dict ) and value.get( "__class__", None ) == "UnvalidatedValue":
return UnvalidatedValue( value["value"] )
- return super( SelectToolParameter, self ).value_from_basic( value, app )
+ return super( SelectToolParameter, self ).value_from_basic( value, app, ignore_errors=ignore_errors )
def need_late_validation( self, trans, context ):
"""
Determine whether we need to wait to validate this parameters value
@@ -943,7 +943,7 @@
if not isinstance( value, list ):
value = value.split( '\n' )
for column in value:
- for column2 in column.split( ',' ):
+ for column2 in str( column ).split( ',' ):
column2 = column2.strip()
if column2:
column_list.append( column2 )
@@ -1347,7 +1347,7 @@
rval = []
for val in value:
rval.append( get_option_display( val, self.options ) or val )
- return "\n".join( rval ) + suffix
+ return "\n".join( map( str, rval ) ) + suffix
def get_dependencies( self ):
"""
@@ -1586,8 +1586,11 @@
elif isinstance( value, list) and len(value) > 0 and isinstance( value[0], DummyDataset):
return None
elif isinstance( value, list ):
- return ",".join( [ val if isinstance( val, basestring ) else str(val.id) for val in value] )
- return value.id
+ return ",".join( [ str( self.to_string( val, app ) ) for val in value ] )
+ try:
+ return value.id
+ except:
+ return str( value )
def to_python( self, value, app ):
# Both of these values indicate that no dataset is selected. However, 'None'
@@ -1608,9 +1611,11 @@
if value and not isinstance( value, list ):
value = [ value ]
if value:
- return ", ".join( [ "%s: %s" % ( item.hid, item.name ) for item in value ] )
- else:
- return "No dataset"
+ try:
+ return ", ".join( [ "%s: %s" % ( item.hid, item.name ) for item in value ] )
+ except:
+ pass
+ return "No dataset"
def validate( self, value, history=None ):
for validator in self.validators:
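
The DataToolParameter.to_string() change above encodes lists by recursively stringifying each element and falls back to str() for anything without an id. A simplified illustration (DummyData and the standalone function are hypothetical):

    class DummyData( object ):
        def __init__( self, id ):
            self.id = id

    def to_string( value ):
        # simplified stand-in for DataToolParameter.to_string
        if isinstance( value, list ):
            return ",".join( [ str( to_string( val ) ) for val in value ] )
        try:
            return value.id
        except:
            return str( value )

    print to_string( [ DummyData( 3 ), DummyData( 7 ) ] )   # "3,7"
    print to_string( "just a string" )                       # "just a string"
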
diff -r 7b916e959c09083e541d1daaf573b1af995ed152 -r d603df6f2e7978f9c75f64190bfba97bbf59e851 lib/galaxy/tools/parameters/grouping.py
--- a/lib/galaxy/tools/parameters/grouping.py
+++ b/lib/galaxy/tools/parameters/grouping.py
@@ -68,21 +68,25 @@
return rval
def value_from_basic( self, value, app, ignore_errors=False ):
rval = []
- for i, d in enumerate( value ):
- rval_dict = {}
- # If the special __index__ key is not set, create it (for backward
- # compatibility)
- rval_dict['__index__'] = d.get( '__index__', i )
- # Restore child inputs
- for input in self.inputs.itervalues():
- if ignore_errors and input.name not in d:
- # If we do not have a value, and are ignoring errors, we simply
- # do nothing. There will be no value for the parameter in the
- # conditional's values dictionary.
- pass
- else:
- rval_dict[ input.name ] = input.value_from_basic( d[input.name], app, ignore_errors )
- rval.append( rval_dict )
+ try:
+ for i, d in enumerate( value ):
+ rval_dict = {}
+ # If the special __index__ key is not set, create it (for backward
+ # compatibility)
+ rval_dict['__index__'] = d.get( '__index__', i )
+ # Restore child inputs
+ for input in self.inputs.itervalues():
+ if ignore_errors and input.name not in d:
+ # If we do not have a value, and are ignoring errors, we simply
+ # do nothing. There will be no value for the parameter in the
+ # conditional's values dictionary.
+ pass
+ else:
+ rval_dict[ input.name ] = input.value_from_basic( d[input.name], app, ignore_errors )
+ rval.append( rval_dict )
+ except Exception, e:
+ if not ignore_errors:
+ raise e
return rval
def visit_inputs( self, prefix, value, callback ):
for i, d in enumerate( value ):
@@ -441,24 +445,28 @@
return rval
def value_from_basic( self, value, app, ignore_errors=False ):
rval = dict()
- current_case = rval['__current_case__'] = value['__current_case__']
- # Test param
- if ignore_errors and self.test_param.name not in value:
- # If ignoring errors, do nothing. However this is potentially very
- # problematic since if we are missing the value of test param,
- # the entire conditional is wrong.
- pass
- else:
- rval[ self.test_param.name ] = self.test_param.value_from_basic( value[ self.test_param.name ], app, ignore_errors )
- # Inputs associated with current case
- for input in self.cases[current_case].inputs.itervalues():
- if ignore_errors and input.name not in value:
- # If we do not have a value, and are ignoring errors, we simply
- # do nothing. There will be no value for the parameter in the
- # conditional's values dictionary.
+ try:
+ current_case = rval['__current_case__'] = value['__current_case__']
+ # Test param
+ if ignore_errors and self.test_param.name not in value:
+ # If ignoring errors, do nothing. However this is potentially very
+ # problematic since if we are missing the value of test param,
+ # the entire conditional is wrong.
pass
else:
- rval[ input.name ] = input.value_from_basic( value[ input.name ], app, ignore_errors )
+ rval[ self.test_param.name ] = self.test_param.value_from_basic( value[ self.test_param.name ], app, ignore_errors )
+ # Inputs associated with current case
+ for input in self.cases[current_case].inputs.itervalues():
+ if ignore_errors and input.name not in value:
+ # If we do not have a value, and are ignoring errors, we simply
+ # do nothing. There will be no value for the parameter in the
+ # conditional's values dictionary.
+ pass
+ else:
+ rval[ input.name ] = input.value_from_basic( value[ input.name ], app, ignore_errors )
+ except Exception, e:
+ if not ignore_errors:
+ raise e
return rval
def visit_inputs( self, prefix, value, callback ):
current_case = value['__current_case__']
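
Both grouping value_from_basic() methods above now share the same contract: failures while restoring stored values are swallowed only when ignore_errors is True and re-raised otherwise. The essence of that pattern, with hypothetical names:

    def restore_group_value( restore, stored_value, ignore_errors=False ):
        # restore() is the per-group restoration logic; rval collects what survives
        rval = {}
        try:
            restore( stored_value, rval )
        except Exception, e:
            if not ignore_errors:
                raise e
        return rval
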
diff -r 7b916e959c09083e541d1daaf573b1af995ed152 -r d603df6f2e7978f9c75f64190bfba97bbf59e851 lib/galaxy/tools/parameters/validation.py
--- a/lib/galaxy/tools/parameters/validation.py
+++ b/lib/galaxy/tools/parameters/validation.py
@@ -182,10 +182,13 @@
def from_element( cls, param, elem ):
return cls( message=elem.get( 'message', None ), check=elem.get( 'check', "" ), skip=elem.get( 'skip', "" ) )
def validate( self, value, history=None ):
- if value and value.missing_meta( check = self.check, skip = self.skip ):
- if self.message is None:
- self.message = "Metadata missing, click the pencil icon in the history item to edit / save the metadata attributes"
- raise ValueError( self.message )
+ if value:
+ if not isinstance( value, model.DatasetInstance ):
+ raise ValueError( 'A non-dataset value was provided.' )
+ if value.missing_meta( check = self.check, skip = self.skip ):
+ if self.message is None:
+ self.message = "Metadata missing, click the pencil icon in the history item to edit / save the metadata attributes"
+ raise ValueError( self.message )
class UnspecifiedBuildValidator( Validator ):
"""
diff -r 7b916e959c09083e541d1daaf573b1af995ed152 -r d603df6f2e7978f9c75f64190bfba97bbf59e851 lib/galaxy/tools/test.py
--- a/lib/galaxy/tools/test.py
+++ b/lib/galaxy/tools/test.py
@@ -35,11 +35,13 @@
value = new_value
break
if not found_parameter:
- raise ValueError( "Unable to determine parameter type of test input '%s'. Ensure that the parameter exists and that any container groups are defined first." % name )
+ raise ValueError( "Unable to determine parameter type of test input '%s'. "
+ "Ensure that the parameter exists and that any container groups are defined first."
+ % name )
elif isinstance( self.tool.inputs[name], basic.DataToolParameter ):
value = self.__add_uploaded_dataset( name, value, extra, self.tool.inputs[name] )
except Exception, e:
- log.debug( "Error in add_param for %s: %s" % ( name, e ) )
+ log.debug( "Error for tool %s: could not add test parameter %s. %s" % ( self.tool.id, name, e ) )
self.inputs.append( ( name, value, extra ) )
def add_output( self, name, file, extra ):
self.outputs.append( ( name, file, extra ) )
diff -r 7b916e959c09083e541d1daaf573b1af995ed152 -r d603df6f2e7978f9c75f64190bfba97bbf59e851 lib/galaxy/util/__init__.py
--- a/lib/galaxy/util/__init__.py
+++ b/lib/galaxy/util/__init__.py
@@ -2,7 +2,7 @@
Utility functions used systemwide.
"""
-import logging, threading, random, string, re, binascii, pickle, time, datetime, math, re, os, sys, tempfile, stat, grp, smtplib
+import logging, threading, random, string, re, binascii, pickle, time, datetime, math, re, os, sys, tempfile, stat, grp, smtplib, errno
from email.MIMEText import MIMEText
# Older py compatibility
diff -r 7b916e959c09083e541d1daaf573b1af995ed152 -r d603df6f2e7978f9c75f64190bfba97bbf59e851 lib/galaxy/util/hash_util.py
--- a/lib/galaxy/util/hash_util.py
+++ b/lib/galaxy/util/hash_util.py
@@ -32,3 +32,10 @@
def hmac_new( key, value ):
return hmac.new( key, value, sha ).hexdigest()
+
+def is_hashable( value ):
+ try:
+ hash( value )
+ except:
+ return False
+ return True
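
The new helper simply reports whether hash() succeeds on a value, for example:

    from galaxy.util.hash_util import is_hashable

    assert is_hashable( "chrX" )            # strings are hashable
    assert not is_hashable( [ 1, 2, 3 ] )   # lists are not
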
diff -r 7b916e959c09083e541d1daaf573b1af995ed152 -r d603df6f2e7978f9c75f64190bfba97bbf59e851 lib/galaxy/util/log/__init__.py
--- /dev/null
+++ b/lib/galaxy/util/log/__init__.py
@@ -0,0 +1,5 @@
+class TraceLogger( object ):
+ def __init__( self, name ):
+ self.name = name
+ def log( self, **kwargs ):
+ raise TypeError( "Abstract Method" )
\ No newline at end of file
diff -r 7b916e959c09083e541d1daaf573b1af995ed152 -r d603df6f2e7978f9c75f64190bfba97bbf59e851 lib/galaxy/util/log/fluent_log.py
--- /dev/null
+++ b/lib/galaxy/util/log/fluent_log.py
@@ -0,0 +1,39 @@
+"""
+Provides a `TraceLogger` implementation that logs to a fluentd collector
+"""
+
+import time
+import threading
+
+import galaxy.eggs
+galaxy.eggs.require( "fluent-logger" )
+galaxy.eggs.require( "msgpack_python" )
+
+from fluent.sender import FluentSender
+
+
+class FluentTraceLogger( object ):
+ def __init__( self, name, host='localhost', port=24224 ):
+ self.lock = threading.Lock()
+ self.thread_local = threading.local()
+ self.name = name
+ self.sender = FluentSender( self.name, host=host, port=port )
+
+ def context_set( self, key, value ):
+ self.lock.acquire()
+ if not hasattr( self.thread_local, 'context' ):
+ self.thread_local.context = {}
+ self.thread_local.context[key] = value
+ self.lock.release()
+
+ def context_remove( self, key ):
+ self.lock.acquire()
+ del self.thread_local.context[key]
+ self.lock.release()
+
+ def log( self, label, **kwargs ):
+ self.lock.acquire()
+ if hasattr( self.thread_local, 'context' ):
+ kwargs.update( self.thread_local.context )
+ self.lock.release()
+ self.sender.emit_with_time( label, int(time.time()), kwargs )
\ No newline at end of file
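
A minimal usage sketch for the new fluent logger (the "galaxy" tag, host and port are illustrative; the fluent-logger and msgpack eggs required above must be available):

    from galaxy.util.log.fluent_log import FluentTraceLogger

    trace_logger = FluentTraceLogger( "galaxy", host="localhost", port=24224 )
    # Context values are thread-local and merged into every subsequent event.
    trace_logger.context_set( "request_id", "deadbeef" )
    trace_logger.log( "sqlalchemy_query", message="Query executed", duration=0.002 )
    trace_logger.context_remove( "request_id" )
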
diff -r 7b916e959c09083e541d1daaf573b1af995ed152 -r d603df6f2e7978f9c75f64190bfba97bbf59e851 lib/galaxy/util/shed_util.py
--- a/lib/galaxy/util/shed_util.py
+++ b/lib/galaxy/util/shed_util.py
@@ -1,10 +1,11 @@
-import os, tempfile, shutil, logging, urllib2
+import os, tempfile, shutil, logging, urllib2, threading
from galaxy.datatypes import checkers
from galaxy.web import url_for
from galaxy import util
-from galaxy.util.json import from_json_string, to_json_string
+from galaxy.util import json
from galaxy.webapps.community.util import container_util
import shed_util_common as suc
+import galaxy.tools
from galaxy.tools.search import ToolBoxSearch
from galaxy.tool_shed.tool_dependencies.install_util import create_or_update_tool_dependency, install_package, set_environment
from galaxy.tool_shed import encoding_util
@@ -22,6 +23,38 @@
log = logging.getLogger( __name__ )
+def activate_repository( trans, repository ):
+ repository_clone_url = suc.generate_clone_url_for_installed_repository( trans.app, repository )
+ shed_tool_conf, tool_path, relative_install_dir = suc.get_tool_panel_config_tool_path_install_dir( trans.app, repository )
+ repository.deleted = False
+ repository.status = trans.model.ToolShedRepository.installation_status.INSTALLED
+ if repository.includes_tools:
+ metadata = repository.metadata
+ repository_tools_tups = suc.get_repository_tools_tups( trans.app, metadata )
+ # Reload tools into the appropriate tool panel section.
+ tool_panel_dict = repository.metadata[ 'tool_panel_section' ]
+ add_to_tool_panel( trans.app,
+ repository.name,
+ repository_clone_url,
+ repository.installed_changeset_revision,
+ repository_tools_tups,
+ repository.owner,
+ shed_tool_conf,
+ tool_panel_dict,
+ new_install=False )
+ trans.sa_session.add( repository )
+ trans.sa_session.flush()
+ if repository.includes_datatypes:
+ if tool_path:
+ repository_install_dir = os.path.abspath ( os.path.join( tool_path, relative_install_dir ) )
+ else:
+ repository_install_dir = os.path.abspath ( relative_install_dir )
+ # Activate proprietary datatypes.
+ installed_repository_dict = load_installed_datatypes( trans.app, repository, repository_install_dir, deactivate=False )
+ if installed_repository_dict and 'converter_path' in installed_repository_dict:
+ load_installed_datatype_converters( trans.app, installed_repository_dict, deactivate=False )
+ if installed_repository_dict and 'display_path' in installed_repository_dict:
+ load_installed_display_applications( trans.app, installed_repository_dict, deactivate=False )
def add_to_shed_tool_config( app, shed_tool_conf_dict, elem_list ):
# A tool shed repository is being installed so change the shed_tool_conf file. Parse the config file to generate the entire list
# of config_elems instead of using the in-memory list since it will be a subset of the entire list if one or more repositories have
@@ -175,66 +208,130 @@
# Attempt to ensure we're copying an appropriate file.
if is_data_index_sample_file( filename ):
suc.copy_sample_file( app, filename, dest_path=dest_path )
-def create_repository_dependency_objects( trans, tool_path, tool_shed_url, repo_info_dicts, reinstalling=False ):
+def create_repository_dependency_objects( trans, tool_path, tool_shed_url, repo_info_dicts, reinstalling=False, install_repository_dependencies=False,
+ no_changes_checked=False, tool_panel_section=None, new_tool_panel_section=None ):
"""
Discover all repository dependencies and make sure all tool_shed_repository and associated repository_dependency records exist as well as
the dependency relationships between installed repositories. This method is called when new repositories are being installed into a Galaxy
instance and when uninstalled repositories are being reinstalled.
"""
message = ''
+ # The following list will be maintained within this method to contain all created or updated tool shed repositories, including repository dependencies
+ # that may not be installed.
+ all_created_or_updated_tool_shed_repositories = []
+ # There will be a one-to-one mapping between items in 3 lists: created_or_updated_tool_shed_repositories, tool_panel_section_keys and filtered_repo_info_dicts.
+ # The 3 lists will filter out repository dependencies that are not to be installed.
created_or_updated_tool_shed_repositories = []
- # Repositories will be filtered (e.g., if already installed, etc), so filter the associated repo_info_dicts accordingly.
+ tool_panel_section_keys = []
+ # Repositories will be filtered (e.g., if already installed, if elected to not be installed, etc), so filter the associated repo_info_dicts accordingly.
filtered_repo_info_dicts = []
- # Discover all repository dependencies and retrieve information for installing them.
+ # Discover all repository dependencies and retrieve information for installing them. Even if the user elected to not install repository dependencies we have
+ # to make sure all repository dependency objects exist so that the appropriate repository dependency relationships can be built.
all_repo_info_dicts = get_required_repo_info_dicts( tool_shed_url, repo_info_dicts )
+ if not all_repo_info_dicts:
+ # No repository dependencies were discovered so process the received repositories.
+ all_repo_info_dicts = [ rid for rid in repo_info_dicts ]
for repo_info_dict in all_repo_info_dicts:
for name, repo_info_tuple in repo_info_dict.items():
description, repository_clone_url, changeset_revision, ctx_rev, repository_owner, repository_dependencies, tool_dependencies = \
suc.get_repo_info_tuple_contents( repo_info_tuple )
- clone_dir = os.path.join( tool_path, generate_tool_path( repository_clone_url, changeset_revision ) )
- relative_install_dir = os.path.join( clone_dir, name )
# Make sure the repository was not already installed.
- installed_tool_shed_repository, installed_changeset_revision = \
- repository_was_previously_installed( trans, tool_shed_url, name, repo_info_tuple, clone_dir )
+ installed_tool_shed_repository, installed_changeset_revision = repository_was_previously_installed( trans, tool_shed_url, name, repo_info_tuple )
if installed_tool_shed_repository:
- if reinstalling:
- if installed_tool_shed_repository.status in [ trans.model.ToolShedRepository.installation_status.ERROR,
- trans.model.ToolShedRepository.installation_status.UNINSTALLED ]:
- can_update = True
- name = installed_tool_shed_repository.name
- description = installed_tool_shed_repository.description
- installed_changeset_revision = installed_tool_shed_repository.installed_changeset_revision
- metadata_dict = installed_tool_shed_repository.metadata
- dist_to_shed = installed_tool_shed_repository.dist_to_shed
+ tool_section, new_tool_panel_section, tool_panel_section_key = handle_tool_panel_selection( trans=trans,
+ metadata=installed_tool_shed_repository.metadata,
+ no_changes_checked=no_changes_checked,
+ tool_panel_section=tool_panel_section,
+ new_tool_panel_section=new_tool_panel_section )
+ if reinstalling or install_repository_dependencies:
+ # If the user elected to install repository dependencies, all items in the all_repo_info_dicts list will be processed. However, if
+ # repository dependencies are not to be installed, only those items contained in the received repo_info_dicts list will be processed.
+ if is_in_repo_info_dicts( repo_info_dict, repo_info_dicts ) or install_repository_dependencies:
+ if installed_tool_shed_repository.status in [ trans.model.ToolShedRepository.installation_status.ERROR,
+ trans.model.ToolShedRepository.installation_status.UNINSTALLED ]:
+ # The current tool shed repository is not currently installed, so we can update its record in the database.
+ can_update = True
+ name = installed_tool_shed_repository.name
+ description = installed_tool_shed_repository.description
+ installed_changeset_revision = installed_tool_shed_repository.installed_changeset_revision
+ metadata_dict = installed_tool_shed_repository.metadata
+ dist_to_shed = installed_tool_shed_repository.dist_to_shed
+ elif installed_tool_shed_repository.status in [ trans.model.ToolShedRepository.installation_status.DEACTIVATED ]:
+ # The current tool shed repository is deactivated, so updating its database record is not necessary - just activate it.
+ activate_repository( trans, installed_tool_shed_repository )
+ can_update = False
+ else:
+ # The tool shed repository currently being processed is already installed or is in the process of being installed, so its record
+ # in the database cannot be updated.
+ can_update = False
else:
- # There is a repository already installed which is a dependency of the repository being reinstalled.
+ # This block will be reached only if reinstalling is True, install_repository_dependencies is False and is_in_repo_info_dicts is False.
+ # The tool shed repository currently being processed must be a repository dependency that the user elected to not install, so its
+ # record in the database cannot be updated.
can_update = False
else:
- # An attempt is being made to install a tool shed repository into a Galaxy instance when the same repository was previously installed.
- message += "Revision <b>%s</b> of tool shed repository <b>%s</b> owned by <b>%s</b> " % ( changeset_revision, name, repository_owner )
- if installed_changeset_revision != changeset_revision:
- message += "was previously installed using changeset revision <b>%s</b>. " % installed_changeset_revision
+ # This block will be reached only if reinstalling is False and install_repository_dependencies is False. This implies that the tool shed
+ # repository currently being processed has already been installed.
+ if len( all_repo_info_dicts ) == 1:
+ # If only a single repository is being installed, return an informative message to the user.
+ message += "Revision <b>%s</b> of tool shed repository <b>%s</b> owned by <b>%s</b> " % ( changeset_revision, name, repository_owner )
+ if installed_changeset_revision != changeset_revision:
+ message += "was previously installed using changeset revision <b>%s</b>. " % installed_changeset_revision
+ else:
+ message += "was previously installed. "
+ if installed_tool_shed_repository.uninstalled:
+ message += "The repository has been uninstalled, however, so reinstall the original repository instead of installing it again. "
+ elif installed_tool_shed_repository.deleted:
+ message += "The repository has been deactivated, however, so activate the original repository instead of installing it again. "
+ if installed_changeset_revision != changeset_revision:
+ message += "You can get the latest updates for the repository using the <b>Get updates</b> option from the repository's "
+ message += "<b>Repository Actions</b> pop-up menu. "
+ created_or_updated_tool_shed_repositories.append( installed_tool_shed_repository )
+ tool_panel_section_keys.append( tool_panel_section_key )
+ return created_or_updated_tool_shed_repositories, tool_panel_section_keys, all_repo_info_dicts, filtered_repo_info_dicts, message
else:
- message += "was previously installed. "
- if installed_tool_shed_repository.uninstalled:
- message += "The repository has been uninstalled, however, so reinstall the original repository instead of installing it again. "
- elif installed_tool_shed_repository.deleted:
- message += "The repository has been deactivated, however, so activate the original repository instead of installing it again. "
- if installed_changeset_revision != changeset_revision:
- message += "You can get the latest updates for the repository using the <b>Get updates</b> option from the repository's "
- message += "<b>Repository Actions</b> pop-up menu. "
- if len( repo_info_dicts ) == 1:
- created_or_updated_tool_shed_repositories.append( installed_tool_shed_repository )
- return created_or_updated_tool_shed_repositories, all_repo_info_dicts, filtered_repo_info_dicts, message
+ # We're in the process of installing multiple tool shed repositories into Galaxy. Since the repository currently being processed
+ # has already been installed, skip it and process the next repository in the list.
+ can_update = False
else:
- # A tool shed repository is being installed into a Galaxy instance for the first time. We may have the case where a repository
- # is being reinstalled where because the repository being newly installed here may be a dependency of the repository being reinstalled.
+ # A tool shed repository is being installed into a Galaxy instance for the first time, or a previous attempt to install or reinstall it resulted
+ # in an error. In the latter case, the repository record in the database has no metadata and its status has been set to 'New'. In either case,
+ # the repository's database record may be updated.
can_update = True
installed_changeset_revision = changeset_revision
- metadata_dict={}
+ metadata_dict = {}
dist_to_shed = False
if can_update:
- log.debug( "Adding new row (or updating an existing row) for repository '%s' in the tool_shed_repository table." % name )
+ # The database record for the tool shed repository currently being processed can be updated.
+ if reinstalling or install_repository_dependencies:
+ # Get the repository metadata to see where it was previously located in the tool panel.
+ if installed_tool_shed_repository:
+ # The tool shed repository status is one of 'New', 'Uninstalled', or 'Error'.
+ tool_section, new_tool_panel_section, tool_panel_section_key = \
+ handle_tool_panel_selection( trans=trans,
+ metadata=installed_tool_shed_repository.metadata,
+ no_changes_checked=no_changes_checked,
+ tool_panel_section=tool_panel_section,
+ new_tool_panel_section=new_tool_panel_section )
+ else:
+ # We're installing a new tool shed repository that does not yet have a database record. This repository is a repository dependency
+ # of a different repository being installed.
+ if new_tool_panel_section:
+ section_id = new_tool_panel_section.lower().replace( ' ', '_' )
+ tool_panel_section_key = 'section_%s' % str( section_id )
+ elif tool_panel_section:
+ tool_panel_section_key = 'section_%s' % tool_panel_section
+ else:
+ tool_panel_section_key = None
+ else:
+ # We're installing a new tool shed repository that does not yet have a database record.
+ if new_tool_panel_section:
+ section_id = new_tool_panel_section.lower().replace( ' ', '_' )
+ tool_panel_section_key = 'section_%s' % str( section_id )
+ elif tool_panel_section:
+ tool_panel_section_key = 'section_%s' % tool_panel_section
+ else:
+ tool_panel_section_key = None
tool_shed_repository = suc.create_or_update_tool_shed_repository( app=trans.app,
name=name,
description=description,
@@ -246,9 +343,17 @@
current_changeset_revision=changeset_revision,
owner=repository_owner,
dist_to_shed=False )
- created_or_updated_tool_shed_repositories.append( tool_shed_repository )
- filtered_repo_info_dicts.append( encoding_util.tool_shed_encode( repo_info_dict ) )
- return created_or_updated_tool_shed_repositories, all_repo_info_dicts, filtered_repo_info_dicts, message
+ # Add the processed tool shed repository to the list of all processed repositories maintained within this method.
+ all_created_or_updated_tool_shed_repositories.append( tool_shed_repository )
+ # Only append the tool shed repository to the list of created_or_updated_tool_shed_repositories if it is supposed to be installed.
+ if install_repository_dependencies or is_in_repo_info_dicts( repo_info_dict, repo_info_dicts ):
+ # Keep the one-to-one mapping between items in 3 lists.
+ created_or_updated_tool_shed_repositories.append( tool_shed_repository )
+ tool_panel_section_keys.append( tool_panel_section_key )
+ filtered_repo_info_dicts.append( repo_info_dict )
+ # Build repository dependency relationships even if the user chose to not install repository dependencies.
+ suc.build_repository_dependency_relationships( trans, all_repo_info_dicts, all_created_or_updated_tool_shed_repositories )
+ return created_or_updated_tool_shed_repositories, tool_panel_section_keys, all_repo_info_dicts, filtered_repo_info_dicts, message
def create_repository_dict_for_proprietary_datatypes( tool_shed, name, owner, installed_changeset_revision, tool_dicts, converter_path=None, display_path=None ):
return dict( tool_shed=tool_shed,
repository_name=name,
@@ -476,6 +581,66 @@
if converter_path and display_path:
break
return converter_path, display_path
+def get_dependencies_for_repository( trans, tool_shed_url, repo_info_dict, includes_tool_dependencies ):
+ """
+ Return dictionaries containing the sets of installed and missing tool dependencies and repository dependencies associated with the repository defined
+ by the received repo_info_dict.
+ """
+ name = repo_info_dict.keys()[ 0 ]
+ repo_info_tuple = repo_info_dict[ name ]
+ description, repository_clone_url, changeset_revision, ctx_rev, repository_owner, repository_dependencies, installed_td = \
+ suc.get_repo_info_tuple_contents( repo_info_tuple )
+ if repository_dependencies:
+ missing_td = {}
+ # Handle the scenario where a repository was installed, then uninstalled and an error occurred during the reinstallation process.
+ # In this case, a record for the repository will exist in the database with the status of 'New'.
+ repository = suc.get_repository_for_dependency_relationship( trans.app, tool_shed_url, name, repository_owner, changeset_revision )
+ if repository and repository.metadata:
+ installed_rd, missing_rd = get_installed_and_missing_repository_dependencies( trans, repository )
+ else:
+ installed_rd, missing_rd = get_installed_and_missing_repository_dependencies_for_new_install( trans, repo_info_tuple )
+ # Discover all repository dependencies and retrieve information for installing them.
+ required_repo_info_dicts = get_required_repo_info_dicts( tool_shed_url, util.listify( repo_info_dict ) )
+ # Display tool dependencies defined for each of the repository dependencies.
+ if required_repo_info_dicts:
+ all_tool_dependencies = {}
+ for rid in required_repo_info_dicts:
+ for name, repo_info_tuple in rid.items():
+ description, repository_clone_url, changeset_revision, ctx_rev, repository_owner, repository_dependencies, rid_installed_td = \
+ suc.get_repo_info_tuple_contents( repo_info_tuple )
+ if rid_installed_td:
+ for td_key, td_dict in rid_installed_td.items():
+ if td_key not in all_tool_dependencies:
+ all_tool_dependencies[ td_key ] = td_dict
+ if all_tool_dependencies:
+ if installed_td is None:
+ installed_td = {}
+ else:
+ # Move all tool dependencies to the missing_tool_dependencies container.
+ for td_key, td_dict in installed_td.items():
+ if td_key not in missing_td:
+ missing_td[ td_key ] = td_dict
+ installed_td = {}
+ # Discover and categorize all tool dependencies defined for this repository's repository dependencies.
+ required_tool_dependencies, required_missing_tool_dependencies = \
+ get_installed_and_missing_tool_dependencies_for_new_install( trans, all_tool_dependencies )
+ if required_tool_dependencies:
+ if not includes_tool_dependencies:
+ includes_tool_dependencies = True
+ for td_key, td_dict in required_tool_dependencies.items():
+ if td_key not in installed_td:
+ installed_td[ td_key ] = td_dict
+ if required_missing_tool_dependencies:
+ if not includes_tool_dependencies:
+ includes_tool_dependencies = True
+ for td_key, td_dict in required_missing_tool_dependencies.items():
+ if td_key not in missing_td:
+ missing_td[ td_key ] = td_dict
+ else:
+ installed_rd = None
+ missing_rd = None
+ missing_td = None
+ return name, repository_owner, changeset_revision, includes_tool_dependencies, installed_rd, missing_rd, installed_td, missing_td
def get_headers( fname, sep, count=60, is_multi_byte=False ):
"""Returns a list with the first 'count' lines split by 'sep'."""
headers = []
@@ -489,9 +654,14 @@
break
return headers
def get_installed_and_missing_repository_dependencies( trans, repository ):
+ """
+ Return the installed and missing repository dependencies for a tool shed repository that has a record in the Galaxy database, but
+ may or may not be installed. In this case, the repository dependencies are associated with the repository in the database.
+ """
missing_repository_dependencies = {}
installed_repository_dependencies = {}
- if repository.has_repository_dependencies:
+ has_repository_dependencies = repository.has_repository_dependencies
+ if has_repository_dependencies:
metadata = repository.metadata
installed_rd_tups = []
missing_rd_tups = []
@@ -522,6 +692,50 @@
missing_repository_dependencies[ root_key ] = missing_rd_tups
missing_repository_dependencies[ 'description' ] = description
return installed_repository_dependencies, missing_repository_dependencies
+def get_installed_and_missing_repository_dependencies_for_new_install( trans, repo_info_tuple ):
+ """
+ Parse the received repository_dependencies dictionary that is associated with a repository being installed into Galaxy for the first time
+ and attempt to determine repository dependencies that are already installed and those that are not.
+ """
+ missing_repository_dependencies = {}
+ installed_repository_dependencies = {}
+ missing_rd_tups = []
+ installed_rd_tups = []
+ description, repository_clone_url, changeset_revision, ctx_rev, repository_owner, repository_dependencies, installed_td = \
+ suc.get_repo_info_tuple_contents( repo_info_tuple )
+ if repository_dependencies:
+ description = repository_dependencies[ 'description' ]
+ root_key = repository_dependencies[ 'root_key' ]
+ # The repository dependencies container will include only the immediate repository dependencies of this repository, so the container will be
+ # only a single level in depth.
+ for key, rd_tups in repository_dependencies.items():
+ if key in [ 'description', 'root_key' ]:
+ continue
+ for rd_tup in rd_tups:
+ tool_shed, name, owner, changeset_revision = rd_tup
+ # Updates to installed repository revisions may have occurred, so make sure to locate the appropriate repository revision if one exists.
+ repository, current_changeset_revision = repository_was_previously_installed( trans, tool_shed, name, repo_info_tuple )
+ if repository:
+ new_rd_tup = [ tool_shed, name, owner, changeset_revision, repository.id, repository.status ]
+ if repository.status == trans.model.ToolShedRepository.installation_status.INSTALLED:
+ if new_rd_tup not in installed_rd_tups:
+ installed_rd_tups.append( new_rd_tup )
+ else:
+ if new_rd_tup not in missing_rd_tups:
+ missing_rd_tups.append( new_rd_tup )
+ else:
+ new_rd_tup = [ tool_shed, name, owner, changeset_revision, None, 'Never installed' ]
+ if new_rd_tup not in missing_rd_tups:
+ missing_rd_tups.append( new_rd_tup )
+ if installed_rd_tups:
+ installed_repository_dependencies[ 'root_key' ] = root_key
+ installed_repository_dependencies[ root_key ] = installed_rd_tups
+ installed_repository_dependencies[ 'description' ] = description
+ if missing_rd_tups:
+ missing_repository_dependencies[ 'root_key' ] = root_key
+ missing_repository_dependencies[ root_key ] = missing_rd_tups
+ missing_repository_dependencies[ 'description' ] = description
+ return installed_repository_dependencies, missing_repository_dependencies
def get_installed_and_missing_tool_dependencies( trans, repository, all_tool_dependencies ):
if all_tool_dependencies:
tool_dependencies = {}
@@ -536,7 +750,11 @@
if tool_dependency:
td_info_dict[ 'repository_id' ] = repository.id
td_info_dict[ 'tool_dependency_id' ] = tool_dependency.id
- td_info_dict[ 'status' ] = str( tool_dependency.status )
+ if tool_dependency.status:
+ tool_dependency_status = str( tool_dependency.status )
+ else:
+ tool_dependency_status = 'Never installed'
+ td_info_dict[ 'status' ] = tool_dependency_status
val[ index ] = td_info_dict
if tool_dependency.status == trans.model.ToolDependency.installation_status.INSTALLED:
tool_dependencies[ td_key ] = val
@@ -550,7 +768,11 @@
if tool_dependency:
val[ 'repository_id' ] = repository.id
val[ 'tool_dependency_id' ] = tool_dependency.id
- val[ 'status' ] = str( tool_dependency.status )
+ if tool_dependency.status:
+ tool_dependency_status = str( tool_dependency.status )
+ else:
+ tool_dependency_status = 'Never installed'
+ val[ 'status' ] = tool_dependency_status
if tool_dependency.status == trans.model.ToolDependency.installation_status.INSTALLED:
tool_dependencies[ td_key ] = val
else:
@@ -559,6 +781,45 @@
tool_dependencies = None
missing_tool_dependencies = None
return tool_dependencies, missing_tool_dependencies
+def get_installed_and_missing_tool_dependencies_for_new_install( trans, all_tool_dependencies ):
+ """Return the lists of installed tool dependencies and missing tool dependencies for a set of repositories being installed into Galaxy."""
+ # FIXME: this method currently populates and returns only missing tool dependencies since tool dependencies defined for complex repository dependency
+ # relationships are not currently supported. This method should be enhanced to search for installed tool dependencies defined as complex repository
+ # dependency relationships when that feature is implemented.
+ if all_tool_dependencies:
+ tool_dependencies = {}
+ missing_tool_dependencies = {}
+ for td_key, val in all_tool_dependencies.items():
+ # Set environment tool dependencies are a list, set each member to never installed.
+ if td_key == 'set_environment':
+ new_val = []
+ for requirement_dict in val:
+ requirement_dict[ 'status' ] = trans.model.ToolDependency.installation_status.NEVER_INSTALLED
+ new_val.append( requirement_dict )
+ missing_tool_dependencies[ td_key ] = new_val
+ else:
+ # Since we have a new install, missing tool dependencies have never been installed.
+ val[ 'status' ] = trans.model.ToolDependency.installation_status.NEVER_INSTALLED
+ missing_tool_dependencies[ td_key ] = val
+ else:
+ tool_dependencies = None
+ missing_tool_dependencies = None
+ return tool_dependencies, missing_tool_dependencies
+def get_readme_files_dict_for_display( trans, tool_shed_url, repo_info_dict ):
+ """Return a dictionary of README files contained in the single repository being installed so they can be displayed on the tool panel section selection page."""
+ name = repo_info_dict.keys()[ 0 ]
+ repo_info_tuple = repo_info_dict[ name ]
+ description, repository_clone_url, changeset_revision, ctx_rev, repository_owner, repository_dependencies, installed_td = \
+ suc.get_repo_info_tuple_contents( repo_info_tuple )
+ # Handle README files.
+ url = suc.url_join( tool_shed_url,
+ 'repository/get_readme_files?name=%s&owner=%s&changeset_revision=%s' % \
+ ( name, repository_owner, changeset_revision ) )
+ response = urllib2.urlopen( url )
+ raw_text = response.read()
+ response.close()
+ readme_files_dict = json.from_json_string( raw_text )
+ return readme_files_dict
def get_repository_owner( cleaned_repository_url ):
items = cleaned_repository_url.split( 'repos' )
repo_path = items[ 1 ]
@@ -573,10 +834,10 @@
"""
Inspect the list of repo_info_dicts for repository dependencies and append a repo_info_dict for each of them to the list. All
repository_dependencies entries in each of the received repo_info_dicts includes all required repositories, so only one pass through
- this methid is required to retrieve all repository dependencies.
+ this method is required to retrieve all repository dependencies.
"""
+ all_repo_info_dicts = []
if repo_info_dicts:
- all_repo_info_dicts = [ rid for rid in repo_info_dicts ]
# We'll send tuples of ( tool_shed, repository_name, repository_owner, changeset_revision ) to the tool shed to discover repository ids.
required_repository_tups = []
for repo_info_dict in repo_info_dicts:
@@ -606,7 +867,7 @@
text = response.read()
response.close()
if text:
- required_repo_info_dict = from_json_string( text )
+ required_repo_info_dict = json.from_json_string( text )
required_repo_info_dicts = []
encoded_dict_strings = required_repo_info_dict[ 'repo_info_dicts' ]
for encoded_dict_str in encoded_dict_strings:
@@ -690,12 +951,16 @@
update_dict = encoding_util.tool_shed_decode( encoded_update_dict )
changeset_revision = update_dict[ 'changeset_revision' ]
ctx_rev = update_dict[ 'ctx_rev' ]
+ includes_tools = update_dict.get( 'includes_tools', False )
+ has_repository_dependencies = update_dict.get( 'has_repository_dependencies', False )
response.close()
except Exception, e:
log.debug( "Error getting change set revision for update from the tool shed for repository '%s': %s" % ( repository.name, str( e ) ) )
+ includes_tools = False
+ has_repository_dependencies = False
changeset_revision = None
ctx_rev = None
- return changeset_revision, ctx_rev
+ return changeset_revision, ctx_rev, includes_tools, has_repository_dependencies
def handle_missing_data_table_entry( app, relative_install_dir, tool_path, repository_tools_tups ):
"""
Inspect each tool to see if any have input parameters that are dynamically generated select lists that require entries in the
@@ -783,6 +1048,60 @@
app.model.ToolDependency.installation_status.ERROR ]:
installed_tool_dependencies.append( tool_dependency )
return installed_tool_dependencies
+def handle_tool_panel_selection( trans, metadata, no_changes_checked, tool_panel_section, new_tool_panel_section ):
+ """Handle the selected tool panel location for loading tools included in tool shed repositories when installing or reinstalling them."""
+ # Get the location in the tool panel in which each tool was originally loaded.
+ tool_section = None
+ tool_panel_section_key = None
+ if 'tools' in metadata:
+ if 'tool_panel_section' in metadata:
+ tool_panel_dict = metadata[ 'tool_panel_section' ]
+ if not tool_panel_dict:
+ tool_panel_dict = generate_tool_panel_dict_for_new_install( metadata[ 'tools' ] )
+ else:
+ tool_panel_dict = generate_tool_panel_dict_for_new_install( metadata[ 'tools' ] )
+ # This forces everything to be loaded into the same section (or no section) in the tool panel.
+ tool_section_dicts = tool_panel_dict[ tool_panel_dict.keys()[ 0 ] ]
+ tool_section_dict = tool_section_dicts[ 0 ]
+ original_section_id = tool_section_dict[ 'id' ]
+ original_section_name = tool_section_dict[ 'name' ]
+ if no_changes_checked:
+ if original_section_id:
+ tool_panel_section_key = 'section_%s' % str( original_section_id )
+ if tool_panel_section_key in trans.app.toolbox.tool_panel:
+ tool_section = trans.app.toolbox.tool_panel[ tool_panel_section_key ]
+ else:
+ # The section in which the tool was originally loaded used to be in the tool panel, but no longer is.
+ elem = Element( 'section' )
+ elem.attrib[ 'name' ] = original_section_name
+ elem.attrib[ 'id' ] = original_section_id
+ elem.attrib[ 'version' ] = ''
+ tool_section = galaxy.tools.ToolSection( elem )
+ trans.app.toolbox.tool_panel[ tool_panel_section_key ] = tool_section
+ else:
+ # The user elected to change the tool panel section to contain the tools.
+ if new_tool_panel_section:
+ section_id = new_tool_panel_section.lower().replace( ' ', '_' )
+ tool_panel_section_key = 'section_%s' % str( section_id )
+ if tool_panel_section_key in trans.app.toolbox.tool_panel:
+ # Appending a tool to an existing section in trans.app.toolbox.tool_panel
+ log.debug( "Appending to tool panel section: %s" % new_tool_panel_section )
+ tool_section = trans.app.toolbox.tool_panel[ tool_panel_section_key ]
+ else:
+ # Appending a new section to trans.app.toolbox.tool_panel
+ log.debug( "Loading new tool panel section: %s" % new_tool_panel_section )
+ elem = Element( 'section' )
+ elem.attrib[ 'name' ] = new_tool_panel_section
+ elem.attrib[ 'id' ] = section_id
+ elem.attrib[ 'version' ] = ''
+ tool_section = galaxy.tools.ToolSection( elem )
+ trans.app.toolbox.tool_panel[ tool_panel_section_key ] = tool_section
+ elif tool_panel_section:
+ tool_panel_section_key = 'section_%s' % tool_panel_section
+ tool_section = trans.app.toolbox.tool_panel[ tool_panel_section_key ]
+ else:
+ tool_section = None
+ return tool_section, new_tool_panel_section, tool_panel_section_key
def handle_tool_versions( app, tool_version_dicts, tool_shed_repository ):
"""
Using the list of tool_version_dicts retrieved from the tool shed (one per changeset revison up to the currently installed changeset revision),
@@ -851,6 +1170,18 @@
return False
# Default to copying the file if none of the above are true.
return True
+def is_in_repo_info_dicts( repo_info_dict, repo_info_dicts ):
+ """Return True if the received repo_info_dict is contained in the list of received repo_info_dicts."""
+ for name, repo_info_tuple in repo_info_dict.items():
+ for rid in repo_info_dicts:
+ for rid_name, rid_repo_info_tuple in rid.items():
+ if rid_name == name:
+ if len( rid_repo_info_tuple ) == len( repo_info_tuple ):
+ for item in rid_repo_info_tuple:
+ if item not in repo_info_tuple:
+ return False
+ return True
+ return False
def load_installed_datatype_converters( app, installed_repository_dict, deactivate=False ):
# Load or deactivate proprietary datatype converters
app.datatypes_registry.load_datatype_converters( app.toolbox, installed_repository_dict=installed_repository_dict, deactivate=deactivate )
@@ -874,6 +1205,80 @@
def load_installed_display_applications( app, installed_repository_dict, deactivate=False ):
# Load or deactivate proprietary datatype display applications
app.datatypes_registry.load_display_applications( installed_repository_dict=installed_repository_dict, deactivate=deactivate )
+def merge_containers_dicts_for_new_install( containers_dicts ):
+ """
+ When installing one or more tool shed repositories for the first time, the received list of containers_dicts contains a containers_dict for
+ each repository being installed. Since the repositories are being installed for the first time, all entries are None except the repository
+ dependencies and tool dependencies. The entries for missing dependencies are all None since they have previously been merged into the installed
+ dependencies. This method will merge the dependencies entries into a single container and return it for display.
+ """
+ new_containers_dict = dict( readme_files=None,
+ datatypes=None,
+ missing_repository_dependencies=None,
+ repository_dependencies=None,
+ missing_tool_dependencies=None,
+ tool_dependencies=None,
+ invalid_tools=None,
+ valid_tools=None,
+ workflows=None )
+ if containers_dicts:
+ lock = threading.Lock()
+ lock.acquire( True )
+ try:
+ repository_dependencies_root_folder = None
+ tool_dependencies_root_folder = None
+ # Use a unique folder id (hopefully the following is unique).
+ folder_id = 867
+ for old_container_dict in containers_dicts:
+ # Merge repository_dependencies.
+ old_container_repository_dependencies_root = old_container_dict[ 'repository_dependencies' ]
+ if old_container_repository_dependencies_root:
+ if repository_dependencies_root_folder is None:
+ repository_dependencies_root_folder = container_util.Folder( id=folder_id, key='root', label='root', parent=None )
+ folder_id += 1
+ repository_dependencies_folder = container_util.Folder( id=folder_id,
+ key='merged',
+ label='Repository dependencies',
+ parent=repository_dependencies_root_folder )
+ folder_id += 1
+ # The old_container_repository_dependencies_root will be a root folder containing a single sub_folder.
+ old_container_repository_dependencies_folder = old_container_repository_dependencies_root.folders[ 0 ]
+ # Change the folder id so it won't conflict with others being merged.
+ old_container_repository_dependencies_folder.id = folder_id
+ folder_id += 1
+ # Generate the label by retrieving the repository name.
+ toolshed, name, owner, changeset_revision = container_util.get_components_from_key( old_container_repository_dependencies_folder.key )
+ old_container_repository_dependencies_folder.label = str( name )
+ repository_dependencies_folder.folders.append( old_container_repository_dependencies_folder )
+ # Merge tool_dependencies.
+ old_container_tool_dependencies_root = old_container_dict[ 'tool_dependencies' ]
+ if old_container_tool_dependencies_root:
+ if tool_dependencies_root_folder is None:
+ tool_dependencies_root_folder = container_util.Folder( id=folder_id, key='root', label='root', parent=None )
+ folder_id += 1
+ tool_dependencies_folder = container_util.Folder( id=folder_id,
+ key='merged',
+ label='Tool dependencies',
+ parent=tool_dependencies_root_folder )
+ folder_id += 1
+ else:
+ td_list = [ td.listify for td in tool_dependencies_folder.tool_dependencies ]
+ # The old_container_tool_dependencies_root will be a root folder containing a single sub_folder.
+ old_container_tool_dependencies_folder = old_container_tool_dependencies_root.folders[ 0 ]
+ for td in old_container_tool_dependencies_folder.tool_dependencies:
+ if td.listify not in td_list:
+ tool_dependencies_folder.tool_dependencies.append( td )
+ if repository_dependencies_root_folder:
+ repository_dependencies_root_folder.folders.append( repository_dependencies_folder )
+ new_containers_dict[ 'repository_dependencies' ] = repository_dependencies_root_folder
+ if tool_dependencies_root_folder:
+ tool_dependencies_root_folder.folders.append( tool_dependencies_folder )
+ new_containers_dict[ 'tool_dependencies' ] = tool_dependencies_root_folder
+ except Exception, e:
+ log.debug( "Exception in merge_containers_dicts_for_new_install: %s" % str( e ) )
+ finally:
+ lock.release()
+ return new_containers_dict
def panel_entry_per_tool( tool_section_dict ):
# Return True if tool_section_dict looks like this.
# {<Tool guid> : [{ tool_config : <tool_config_file>, id: <ToolSection id>, version : <ToolSection version>, name : <TooSection name>}]}
@@ -887,9 +1292,9 @@
if k not in [ 'id', 'version', 'name' ]:
return True
return False
-def populate_containers_dict_from_repository_metadata( trans, tool_shed_url, tool_path, repository, reinstalling=False ):
+def populate_containers_dict_from_repository_metadata( trans, tool_shed_url, tool_path, repository, reinstalling=False, required_repo_info_dicts=None ):
"""
- Retrieve necessary information from the received repository's metadata to populate the containers_dict for display. This methos is called only
+ Retrieve necessary information from the received repository's metadata to populate the containers_dict for display. This method is called only
from Galaxy (not the tool shed) when displaying repository dependencies for installed repositories and when displaying them for uninstalled
repositories that are being reinstalled.
"""
@@ -902,38 +1307,41 @@
# Handle README files.
if repository.has_readme_files:
if reinstalling:
- # Since we're reinstalling, we need to sned a request to the tool shed to get the README files.
+ # Since we're reinstalling, we need to send a request to the tool shed to get the README files.
url = suc.url_join( tool_shed_url,
'repository/get_readme_files?name=%s&owner=%s&changeset_revision=%s' % \
( repository.name, repository.owner, repository.installed_changeset_revision ) )
response = urllib2.urlopen( url )
raw_text = response.read()
response.close()
- readme_files_dict = from_json_string( raw_text )
+ readme_files_dict = json.from_json_string( raw_text )
else:
readme_files_dict = suc.build_readme_files_dict( repository.metadata, tool_path )
else:
readme_files_dict = None
# Handle repository dependencies.
installed_repository_dependencies, missing_repository_dependencies = get_installed_and_missing_repository_dependencies( trans, repository )
- # Handle tool dependencies.
- all_tool_dependencies = metadata.get( 'tool_dependencies', None )
- installed_tool_dependencies, missing_tool_dependencies = get_installed_and_missing_tool_dependencies( trans, repository, all_tool_dependencies )
+ # Handle the current repository's tool dependencies.
+ repository_tool_dependencies = metadata.get( 'tool_dependencies', None )
+ repository_installed_tool_dependencies, repository_missing_tool_dependencies = get_installed_and_missing_tool_dependencies( trans,
+ repository,
+ repository_tool_dependencies )
if reinstalling:
- # All tool dependencies will be considered missing since we are reinstalling the repository.
- if installed_tool_dependencies:
- for td in installed_tool_dependencies:
- missing_tool_dependencies.append( td )
- installed_tool_dependencies = None
+ installed_tool_dependencies, missing_tool_dependencies = \
+ populate_tool_dependencies_dicts( trans=trans,
+ tool_shed_url=tool_shed_url,
+ tool_path=tool_path,
+ repository_installed_tool_dependencies=repository_installed_tool_dependencies,
+ repository_missing_tool_dependencies=repository_missing_tool_dependencies,
+ required_repo_info_dicts=required_repo_info_dicts )
+ else:
+ installed_tool_dependencies = repository_installed_tool_dependencies
+ missing_tool_dependencies = repository_missing_tool_dependencies
# Handle valid tools.
valid_tools = metadata.get( 'tools', None )
# Handle workflows.
workflows = metadata.get( 'workflows', None )
containers_dict = suc.build_repository_containers_for_galaxy( trans=trans,
- toolshed_base_url=tool_shed_url,
- repository_name=repository.name,
- repository_owner=repository.owner,
- changeset_revision=repository.installed_changeset_revision,
repository=repository,
datatypes=datatypes,
invalid_tools=invalid_tools,
@@ -943,7 +1351,9 @@
repository_dependencies=installed_repository_dependencies,
tool_dependencies=installed_tool_dependencies,
valid_tools=valid_tools,
- workflows=workflows )
+ workflows=workflows,
+ new_install=False,
+ reinstalling=reinstalling )
else:
containers_dict = dict( datatypes=None,
invalid_tools=None,
@@ -953,6 +1363,90 @@
valid_tools=None,
workflows=None )
return containers_dict
+def populate_containers_dict_for_new_install( trans, tool_shed_url, tool_path, readme_files_dict, installed_repository_dependencies, missing_repository_dependencies,
+ installed_tool_dependencies, missing_tool_dependencies ):
+ """Return the populated containers for a repository being installed for the first time."""
+ installed_tool_dependencies, missing_tool_dependencies = populate_tool_dependencies_dicts( trans=trans,
+ tool_shed_url=tool_shed_url,
+ tool_path=tool_path,
+ repository_installed_tool_dependencies=installed_tool_dependencies,
+ repository_missing_tool_dependencies=missing_tool_dependencies,
+ required_repo_info_dicts=None )
+ # Since we are installing a new repository, most of the repository contents are set to None since we don't yet know what they are.
+ containers_dict = suc.build_repository_containers_for_galaxy( trans=trans,
+ repository=None,
+ datatypes=None,
+ invalid_tools=None,
+ missing_repository_dependencies=missing_repository_dependencies,
+ missing_tool_dependencies=missing_tool_dependencies,
+ readme_files_dict=readme_files_dict,
+ repository_dependencies=installed_repository_dependencies,
+ tool_dependencies=installed_tool_dependencies,
+ valid_tools=None,
+ workflows=None,
+ new_install=True,
+ reinstalling=False )
+ # Merge the missing_repository_dependencies container contents to the installed_repository_dependencies container.
+ containers_dict = suc.merge_missing_repository_dependencies_to_installed_container( containers_dict )
+ # Merge the missing_tool_dependencies container contents to the installed_tool_dependencies container.
+ containers_dict = suc.merge_missing_tool_dependencies_to_installed_container( containers_dict )
+ return containers_dict
+def populate_tool_dependencies_dicts( trans, tool_shed_url, tool_path, repository_installed_tool_dependencies, repository_missing_tool_dependencies,
+ required_repo_info_dicts ):
+ """
+ Return the populated installed_tool_dependencies and missing_tool_dependencies dictionaries for all repositories defined by entries in the received
+ required_repo_info_dicts.
+ """
+ installed_tool_dependencies = None
+ missing_tool_dependencies = None
+ if repository_installed_tool_dependencies is None:
+ repository_installed_tool_dependencies = {}
+ else:
+ # Add the install_dir attribute to the tool_dependencies.
+ repository_installed_tool_dependencies = suc.add_installation_directories_to_tool_dependencies( trans=trans,
+ tool_dependencies=repository_installed_tool_dependencies )
+ if repository_missing_tool_dependencies is None:
+ repository_missing_tool_dependencies = {}
+ else:
+ # Add the install_dir attribute to the tool_dependencies.
+ repository_missing_tool_dependencies = suc.add_installation_directories_to_tool_dependencies( trans=trans,
+ tool_dependencies=repository_missing_tool_dependencies )
+ if required_repo_info_dicts:
+ # Handle the tool dependencies defined for each of the repository's repository dependencies.
+ for rid in required_repo_info_dicts:
+ for name, repo_info_tuple in rid.items():
+ description, repository_clone_url, changeset_revision, ctx_rev, repository_owner, repository_dependencies, tool_dependencies = \
+ suc.get_repo_info_tuple_contents( repo_info_tuple )
+ if tool_dependencies:
+ # Add the install_dir attribute to the tool_dependencies.
+ tool_dependencies = suc.add_installation_directories_to_tool_dependencies( trans=trans,
+ tool_dependencies=tool_dependencies )
+ # The required_repository may have been installed with a different changeset revision.
+ required_repository, installed_changeset_revision = repository_was_previously_installed( trans, tool_shed_url, name, repo_info_tuple )
+ if required_repository:
+ required_repository_installed_tool_dependencies, required_repository_missing_tool_dependencies = \
+ get_installed_and_missing_tool_dependencies( trans, required_repository, tool_dependencies )
+ if required_repository_installed_tool_dependencies:
+ # Add the install_dir attribute to the tool_dependencies.
+ required_repository_installed_tool_dependencies = \
+ suc.add_installation_directories_to_tool_dependencies( trans=trans,
+ tool_dependencies=required_repository_installed_tool_dependencies )
+ for td_key, td_dict in required_repository_installed_tool_dependencies.items():
+ if td_key not in repository_installed_tool_dependencies:
+ repository_installed_tool_dependencies[ td_key ] = td_dict
+ if required_repository_missing_tool_dependencies:
+ # Add the install_dir attribute to the tool_dependencies.
+ required_repository_missing_tool_dependencies = \
+ suc.add_installation_directories_to_tool_dependencies( trans=trans,
+ tool_dependencies=required_repository_missing_tool_dependencies )
+ for td_key, td_dict in required_repository_missing_tool_dependencies.items():
+ if td_key not in repository_missing_tool_dependencies:
+ repository_missing_tool_dependencies[ td_key ] = td_dict
+ if repository_installed_tool_dependencies:
+ installed_tool_dependencies = repository_installed_tool_dependencies
+ if repository_missing_tool_dependencies:
+ missing_tool_dependencies = repository_missing_tool_dependencies
+ return installed_tool_dependencies, missing_tool_dependencies
def pull_repository( repo, repository_clone_url, ctx_rev ):
"""Pull changes from a remote repository to a local one."""
commands.pull( suc.get_configured_ui(), repo, source=repository_clone_url, rev=[ ctx_rev ] )
@@ -1114,7 +1608,7 @@
trans.sa_session.add( tool_dependency )
trans.sa_session.flush()
return removed, error_message
-def repository_was_previously_installed( trans, tool_shed_url, repository_name, repo_info_tuple, clone_dir ):
+def repository_was_previously_installed( trans, tool_shed_url, repository_name, repo_info_tuple ):
"""
Handle the case where the repository was previously installed using an older changeset_revision, but later the repository was updated
in the tool shed and now we're trying to install the latest changeset revision of the same repository instead of updating the one
@@ -1132,7 +1626,6 @@
text = response.read()
response.close()
if text:
- #clone_path, clone_directory = os.path.split( clone_dir )
changeset_revisions = util.listify( text )
for previous_changeset_revision in changeset_revisions:
tool_shed_repository = suc.get_tool_shed_repository_by_shed_name_owner_installed_changeset_revision( trans.app,
This diff is so big that we needed to truncate the remainder.
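Since the remainder of this diff is truncated, here is a minimal standalone sketch of the merge behaviour that the new populate_tool_dependencies_dicts function applies: tool dependencies contributed by each required repository are folded into the current repository's installed/missing dictionaries without overwriting keys the current repository already defines. The function name and the example dependency keys below are illustrative only, not Galaxy API.

def merge_tool_dependencies( repository_deps, required_repository_deps_list ):
    """Fold each required repository's dependency dict into a copy of
    repository_deps, keeping any entry the current repository already has."""
    merged = dict( repository_deps or {} )
    for required_deps in required_repository_deps_list:
        for td_key, td_dict in ( required_deps or {} ).items():
            if td_key not in merged:
                merged[ td_key ] = td_dict
    # Mirror the original convention of returning None rather than an empty dict.
    return merged or None

# Example with hypothetical dependency keys:
installed = { 'samtools/0.1.18': { 'status': 'Installed' } }
required = [ { 'bwa/0.6.2': { 'status': 'Installed' } },
             { 'samtools/0.1.18': { 'status': 'Never installed' } } ]
print( merge_tool_dependencies( installed, required ) )
# Both keys appear in the result; the current repository's samtools entry is kept.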
https://bitbucket.org/galaxy/galaxy-central/commits/4e15bce31ca2/
Changeset: 4e15bce31ca2
User: kellrott
Date: 2013-03-08 20:44:29
Summary: Adding security to the extended metadata controller and mixin.
Affected #: 3 files
diff -r d603df6f2e7978f9c75f64190bfba97bbf59e851 -r 4e15bce31ca2a1a85c4931c794638f77e712cbe9 lib/galaxy/model/item_attrs.py
--- a/lib/galaxy/model/item_attrs.py
+++ b/lib/galaxy/model/item_attrs.py
@@ -159,82 +159,6 @@
return getattr( galaxy.model, class_name, None )
-class UsesExtendedMetadata:
- """ Mixin for getting and setting item extended metadata. """
-
- def get_item_extended_metadata_obj( self, db_session, user, item ):
- """
- Given an item object (such as a LibraryDatasetDatasetAssociation), find the object
- of the associated extended metadata
- """
-
- extended_metadata = db_session.query( galaxy.model.ExtendedMetadata )
-
- if item.__class__ == galaxy.model.LibraryDatasetDatasetAssociation:
- extended_metadata = extended_metadata.join( galaxy.model.LibraryDatasetDatasetAssociation ).filter(
- galaxy.model.LibraryDatasetDatasetAssociation.id == item.id )
-
- return extended_metadata.first()
-
-
- def set_item_extended_metadata_obj( self, db_session, user, item, extmeta_obj):
- if item.__class__ == galaxy.model.LibraryDatasetDatasetAssociation:
- item.extended_metadata = extmeta_obj
- db_session.flush()
-
- def unlink_item_extended_metadata_obj( self, db_session, user, item):
- if item.__class__ == galaxy.model.LibraryDatasetDatasetAssociation:
- item.extended_metadata = None
- db_session.flush()
-
- def create_extended_metadata(self, db_session, user, extmeta):
- """
- Create/index an extended metadata object. The returned object is
- not associated with any items
- """
- ex_meta = galaxy.model.ExtendedMetadata(extmeta)
- db_session.add( ex_meta )
- db_session.flush()
- for path, value in self._scan_json_block(extmeta):
- meta_i = galaxy.model.ExtendedMetadataIndex(ex_meta, path, value)
- db_session.add(meta_i)
- db_session.flush()
- return ex_meta
-
- def delete_extended_metadata( self, db_session, user, item):
- if item.__class__ == galaxy.model.ExtendedMetadata:
- db_session.delete( item )
- db_session.flush()
-
- def _scan_json_block(self, meta, prefix=""):
- """
- Scan a json style data structure, and emit all fields and their values.
- Example paths
-
- Data
- { "data" : [ 1, 2, 3 ] }
-
- Path:
- /data == [1,2,3]
-
- /data/[0] == 1
-
- """
- if isinstance(meta, dict):
- for a in meta:
- for path, value in self._scan_json_block(meta[a], prefix + "/" + a):
- yield path, value
- elif isinstance(meta, list):
- for i, a in enumerate(meta):
- for path, value in self._scan_json_block(a, prefix + "[%d]" % (i)):
- yield path, value
- else:
- #BUG: Everything is cast to string, which can lead to false positives
- #for cross type comparisions, ie "True" == True
- yield prefix, ("%s" % (meta)).encode("utf8", errors='replace')
-
-
-
class APIItem:
""" Mixin for api representation. """
#api_collection_visible_keys = ( 'id' )
diff -r d603df6f2e7978f9c75f64190bfba97bbf59e851 -r 4e15bce31ca2a1a85c4931c794638f77e712cbe9 lib/galaxy/web/base/controller.py
--- a/lib/galaxy/web/base/controller.py
+++ b/lib/galaxy/web/base/controller.py
@@ -1619,6 +1619,85 @@
log.debug( "In get_item_tag_assoc with tagged_item %s" % tagged_item )
return self.get_tag_handler( trans )._get_item_tag_assoc( user, tagged_item, tag_name )
+
+
+class UsesExtendedMetadataMixin( SharableItemSecurityMixin ):
+ """ Mixin for getting and setting item extended metadata. """
+
+ def get_item_extended_metadata_obj( self, trans, item ):
+ """
+ Given an item object (such as a LibraryDatasetDatasetAssociation), find the object
+ of the associated extended metadata
+ """
+
+ extended_metadata = trans.sa_session.query( galaxy.model.ExtendedMetadata )
+
+ if item.__class__ == galaxy.model.LibraryDatasetDatasetAssociation:
+ extended_metadata = extended_metadata.join( galaxy.model.LibraryDatasetDatasetAssociation ).filter(
+ galaxy.model.LibraryDatasetDatasetAssociation.id == item.id )
+
+ return extended_metadata.first()
+
+
+ def set_item_extended_metadata_obj( self, trans, item, extmeta_obj, check_writable=False):
+ if item.__class__ == galaxy.model.LibraryDatasetDatasetAssociation:
+ if not check_writable or trans.app.security_agent.can_modify_library_item( trans.get_current_user_roles(), item, trans.user ):
+ item.extended_metadata = extmeta_obj
+ trans.sa_session.flush()
+
+ def unset_item_extended_metadata_obj( self, trans, item, check_writable=False):
+ if item.__class__ == galaxy.model.LibraryDatasetDatasetAssociation:
+ if not check_writable or trans.app.security_agent.can_modify_library_item( trans.get_current_user_roles(), item, trans.user ):
+ item.extended_metadata = None
+ trans.sa_session.flush()
+
+ def create_extended_metadata(self, trans, extmeta):
+ """
+ Create/index an extended metadata object. The returned object is
+ not associated with any items
+ """
+ ex_meta = galaxy.model.ExtendedMetadata(extmeta)
+ trans.sa_session.add( ex_meta )
+ trans.sa_session.flush()
+ for path, value in self._scan_json_block(extmeta):
+ meta_i = galaxy.model.ExtendedMetadataIndex(ex_meta, path, value)
+ trans.sa_session.add(meta_i)
+ trans.sa_session.flush()
+ return ex_meta
+
+ def delete_extended_metadata( self, trans, item):
+ if item.__class__ == galaxy.model.ExtendedMetadata:
+ trans.sa_session.delete( item )
+ trans.sa_session.flush()
+
+ def _scan_json_block(self, meta, prefix=""):
+ """
+ Scan a json style data structure, and emit all fields and their values.
+ Example paths
+
+ Data
+ { "data" : [ 1, 2, 3 ] }
+
+ Path:
+ /data == [1,2,3]
+
+ /data/[0] == 1
+
+ """
+ if isinstance(meta, dict):
+ for a in meta:
+ for path, value in self._scan_json_block(meta[a], prefix + "/" + a):
+ yield path, value
+ elif isinstance(meta, list):
+ for i, a in enumerate(meta):
+ for path, value in self._scan_json_block(a, prefix + "[%d]" % (i)):
+ yield path, value
+ else:
+ #BUG: Everything is cast to string, which can lead to false positives
+ #for cross type comparisions, ie "True" == True
+ yield prefix, ("%s" % (meta)).encode("utf8", errors='replace')
+
+
"""
Deprecated: `BaseController` used to be available under the name `Root`
"""
diff -r d603df6f2e7978f9c75f64190bfba97bbf59e851 -r 4e15bce31ca2a1a85c4931c794638f77e712cbe9 lib/galaxy/webapps/galaxy/api/extended_metadata.py
--- a/lib/galaxy/webapps/galaxy/api/extended_metadata.py
+++ b/lib/galaxy/webapps/galaxy/api/extended_metadata.py
@@ -4,8 +4,7 @@
import logging, os, string, shutil, urllib, re, socket
from cgi import escape, FieldStorage
from galaxy import util, datatypes, jobs, web, util
-from galaxy.web.base.controller import BaseAPIController, UsesHistoryMixin, UsesLibraryMixinItems, UsesHistoryDatasetAssociationMixin, UsesStoredWorkflowMixin
-from galaxy.model.item_attrs import UsesExtendedMetadata
+from galaxy.web.base.controller import BaseAPIController, UsesHistoryMixin, UsesLibraryMixinItems, UsesHistoryDatasetAssociationMixin, UsesStoredWorkflowMixin, UsesExtendedMetadataMixin
from galaxy.util.sanitize_html import sanitize_html
import galaxy.datatypes
from galaxy.util.bunch import Bunch
@@ -16,38 +15,38 @@
log = logging.getLogger( __name__ )
-class BaseExtendedMetadataController( BaseAPIController, UsesExtendedMetadata, UsesHistoryMixin, UsesLibraryMixinItems, UsesHistoryDatasetAssociationMixin, UsesStoredWorkflowMixin ):
+class BaseExtendedMetadataController( BaseAPIController, UsesExtendedMetadataMixin, UsesHistoryMixin, UsesLibraryMixinItems, UsesHistoryDatasetAssociationMixin, UsesStoredWorkflowMixin ):
@web.expose_api
def index( self, trans, **kwd ):
idnum = kwd[self.exmeta_item_id]
- item = self._get_item_from_id(trans, idnum)
+ item = self._get_item_from_id(trans, idnum, check_writable=False)
if item is not None:
- ex_meta = self.get_item_extended_metadata_obj( trans.sa_session, trans.get_user(), item )
+ ex_meta = self.get_item_extended_metadata_obj( trans, item )
if ex_meta is not None:
return ex_meta.data
@web.expose_api
def create( self, trans, payload, **kwd ):
idnum = kwd[self.exmeta_item_id]
- item = self._get_item_from_id(trans, idnum)
+ item = self._get_item_from_id(trans, idnum, check_writable=True)
if item is not None:
- ex_obj = self.get_item_extended_metadata_obj(trans.sa_session, trans.get_user(), item)
+ ex_obj = self.get_item_extended_metadata_obj(trans, item)
if ex_obj is not None:
- self.unlink_item_extended_metadata_obj(trans.sa_session, trans.get_user(), item)
- self.delete_extended_metadata(trans.sa_session, trans.get_user(), ex_obj)
- ex_obj = self.create_extended_metadata(trans.sa_session, trans.get_user(), payload)
- self.set_item_extended_metadata_obj(trans.sa_session, trans.get_user(), item, ex_obj)
+ self.unset_item_extended_metadata_obj(trans, item)
+ self.delete_extended_metadata(trans, ex_obj)
+ ex_obj = self.create_extended_metadata(trans, payload)
+ self.set_item_extended_metadata_obj(trans, item, ex_obj)
@web.expose_api
def delete( self, trans, **kwd ):
idnum = kwd[self.tagged_item_id]
- item = self._get_item_from_id(trans, idnum)
+ item = self._get_item_from_id(trans, idnum, check_writable=True)
if item is not None:
- ex_obj = self.get_item_extended_metadata_obj(trans.sa_session, trans.get_user(), item)
+ ex_obj = self.get_item_extended_metadata_obj(trans, item)
if ex_obj is not None:
- self.unlink_item_extended_metadata_obj(trans.sa_session, trans.get_user(), item)
- self.delete_extended_metadata(trans.sa_session, trans.get_user(), ex_obj)
+ self.unset_item_extended_metadata_obj(trans, item)
+ self.delete_extended_metadata(trans, ex_obj)
@web.expose_api
def undelete( self, trans, **kwd ):
@@ -56,7 +55,13 @@
class LibraryDatasetExtendMetadataController(BaseExtendedMetadataController):
controller_name = "library_dataset_extended_metadata"
exmeta_item_id = "library_content_id"
- def _get_item_from_id(self, trans, idstr):
- hist = self.get_library_dataset_dataset_association( trans, idstr )
- return hist
-
+ def _get_item_from_id(self, trans, idstr, check_writable=True):
+ if check_writable:
+ item = self.get_library_dataset_dataset_association( trans, idstr)
+ if trans.app.security_agent.can_modify_library_item( trans.get_current_user_roles(), item ):
+ return item
+ else:
+ item = self.get_library_dataset_dataset_association( trans, idstr)
+ if trans.app.security_agent.can_access_library_item( trans.get_current_user_roles(), item, trans.user ):
+ return item
+ return None
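As a quick illustration of the path flattening that _scan_json_block (now part of UsesExtendedMetadataMixin above) performs when indexing extended metadata, here is a simplified standalone version. It is a sketch that keeps only the path/value generation and drops the utf-8 encoding of leaves; the function name is illustrative.

def scan_json_block( meta, prefix="" ):
    """Yield ( path, value ) pairs for every leaf of a JSON-style structure."""
    if isinstance( meta, dict ):
        for key, sub in meta.items():
            for path, value in scan_json_block( sub, prefix + "/" + key ):
                yield path, value
    elif isinstance( meta, list ):
        for i, sub in enumerate( meta ):
            for path, value in scan_json_block( sub, "%s[%d]" % ( prefix, i ) ):
                yield path, value
    else:
        # Leaves are stringified, which is also the source of the cross-type
        # comparison caveat noted in the original comment.
        yield prefix, "%s" % meta

print( list( scan_json_block( { "data": [ 1, 2, 3 ] } ) ) )
# [('/data[0]', '1'), ('/data[1]', '2'), ('/data[2]', '3')]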
https://bitbucket.org/galaxy/galaxy-central/commits/bd61f8f7f90c/
Changeset: bd61f8f7f90c
User: kellrott
Date: 2013-03-18 04:30:50
Summary: Adding view of extended metadata on the ldda_info page
Affected #: 2 files
diff -r 4e15bce31ca2a1a85c4931c794638f77e712cbe9 -r bd61f8f7f90c02419d2a50b21da2a04f970a0fb6 lib/galaxy/util/__init__.py
--- a/lib/galaxy/util/__init__.py
+++ b/lib/galaxy/util/__init__.py
@@ -49,6 +49,9 @@
from inflection import Inflector, English
inflector = Inflector(English)
+pkg_resources.require( "simplejson" )
+import simplejson
+
def is_multi_byte( chars ):
for char in chars:
try:
@@ -168,6 +171,11 @@
elem.tail = i + pad
return elem
+def pretty_print_json(json_data, is_json_string=False):
+ if is_json_string:
+ json_data = simplejson.loads(json_data)
+ return simplejson.dumps(json_data, sort_keys=True, indent=4 * ' ')
+
# characters that are valid
valid_chars = set(string.letters + string.digits + " -=_.()/+*^,:?!")
diff -r 4e15bce31ca2a1a85c4931c794638f77e712cbe9 -r bd61f8f7f90c02419d2a50b21da2a04f970a0fb6 templates/library/common/ldda_info.mako
--- a/templates/library/common/ldda_info.mako
+++ b/templates/library/common/ldda_info.mako
@@ -167,6 +167,13 @@
</div></div>
%endif
+ %if ldda.extended_metadata:
+ <div class="form-row">
+ <label>Extended Metadata:</label>
+ <pre>${util.pretty_print_json(ldda.extended_metadata.data)}</pre>
+ <div style="clear: both"></div>
+ </div>
+ %endif
%if trans.user_is_admin() and cntrller == 'library_admin':
<div class="form-row"><label>Disk file:</label>
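The new util.pretty_print_json helper added above is what the ldda_info template calls to render extended metadata. Below is a standalone sketch of its behaviour using the standard-library json module instead of the simplejson egg required by the committed code (an assumption made here purely for portability); the example payload is illustrative.

import json

def pretty_print_json( json_data, is_json_string=False ):
    """Return a sorted, indented rendering of a dict or of a JSON string."""
    if is_json_string:
        json_data = json.loads( json_data )
    return json.dumps( json_data, sort_keys=True, indent=4 )

print( pretty_print_json( { "tissue": "brain", "replicates": [ 1, 2 ] } ) )
print( pretty_print_json( '{"tissue": "brain"}', is_json_string=True ) )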
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: carlfeberhard: Visualization framework: allow plugins to be located outside Galaxy root; fix logging import
by commits-noreply@bitbucket.org 06 Aug '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/48fd0ab87841/
Changeset: 48fd0ab87841
User: carlfeberhard
Date: 2013-08-06 21:40:17
Summary: Visualization framework: allow plugins to be located outside Galaxy root; fix logging import
Affected #: 2 files
diff -r 73c9018fcafb9beb2f9fb76bae0a01e1c641b913 -r 48fd0ab87841725fe85726606e187cb728ce9a21 lib/galaxy/visualization/registry.py
--- a/lib/galaxy/visualization/registry.py
+++ b/lib/galaxy/visualization/registry.py
@@ -48,6 +48,7 @@
- validating and parsing params into resources (based on a context)
used in the visualization template
"""
+ #: name of this plugin
#: any built in visualizations that have their own render method in ctrls/visualization
# these should be handled somewhat differently - and be passed onto their resp. methods in ctrl.visualization
#TODO: change/remove if/when they can be updated to use this system
@@ -57,9 +58,6 @@
'sweepster',
'phyloviz'
]
- #: where to search for visualiztion templates (relative to templates/webapps/galaxy)
- # this can be overridden individually in the config entries
- TEMPLATE_ROOT = 'visualization'
#: directories under plugin_directory that aren't plugins
non_plugin_directories = []
@@ -67,7 +65,7 @@
return 'VisualizationsRegistry(%s)' %( self.plugin_directory )
def __init__( self, registry_filepath, template_cache_dir ):
- super( VisualizationsRegistry, self ).__init__( registry_filepath, template_cache_dir )
+ super( VisualizationsRegistry, self ).__init__( registry_filepath, 'visualizations', template_cache_dir )
# what to use to parse query strings into resources/vars for the template
self.resource_parser = ResourceParser()
diff -r 73c9018fcafb9beb2f9fb76bae0a01e1c641b913 -r 48fd0ab87841725fe85726606e187cb728ce9a21 lib/galaxy/web/base/pluginframework.py
--- a/lib/galaxy/web/base/pluginframework.py
+++ b/lib/galaxy/web/base/pluginframework.py
@@ -14,6 +14,8 @@
pkg_resources.require( 'Mako' )
import mako
+import logging
+log = logging.getLogger( __name__ )
# ============================================================================= exceptions
class PluginFrameworkException( Exception ):
@@ -74,7 +76,13 @@
if not config_plugin_directory:
return None
try:
+ # create the plugin path and if plugin dir begins with '/' assume absolute path
full_plugin_filepath = os.path.join( config.root, config_plugin_directory )
+ if config_plugin_directory.startswith( os.path.sep ):
+ full_plugin_filepath = config_plugin_directory
+ if not os.path.exists( full_plugin_filepath ):
+ raise PluginFrameworkException( 'Plugin path not found: %s' %( full_plugin_filepath ) )
+
template_cache = config.template_cache if cls.serves_static else None
plugin = cls( full_plugin_filepath, template_cache )
@@ -88,13 +96,22 @@
def __str__( self ):
return '%s(%s)' %( self.__class__.__name__, self.plugin_directory )
- def __init__( self, plugin_directory, template_cache_dir=None, debug=False ):
+ def __init__( self, plugin_directory, name=None, template_cache_dir=None, debug=False ):
+ """
+ :type plugin_directory: string
+ :param plugin_directory: the base directory where plugin code is kept
+ :type name: (optional) string (default: None)
+ :param name: the name of this plugin
+ (that will appear in url pathing, etc.)
+ :type template_cache_dir: (optional) string (default: None)
+ :param template_cache_dir: the cache directory to store compiled mako
+ """
if not os.path.isdir( plugin_directory ):
raise PluginFrameworkException( 'Framework plugin directory not found: %s, %s'
%( self.__class__.__name__, plugin_directory ) )
- # absolute (?) path
self.plugin_directory = plugin_directory
- self.name = os.path.basename( self.plugin_directory )
+ #TODO: or pass in from config
+ self.name = name or os.path.basename( self.plugin_directory )
if self.has_config:
self.load_configuration()
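The path handling added in this changeset boils down to: a configured plugin directory that begins with the OS path separator is taken as an absolute path, anything else is resolved against the Galaxy root, and a missing directory fails fast. A minimal sketch of that logic follows; the helper name is hypothetical, not the Galaxy API.

import os

def resolve_plugin_directory( galaxy_root, config_plugin_directory ):
    """Resolve a configured plugin directory to a full filesystem path."""
    full_plugin_filepath = os.path.join( galaxy_root, config_plugin_directory )
    if config_plugin_directory.startswith( os.path.sep ):
        # Absolute paths are used as-is, which is what lets plugins live outside the Galaxy root.
        full_plugin_filepath = config_plugin_directory
    if not os.path.exists( full_plugin_filepath ):
        raise Exception( 'Plugin path not found: %s' % full_plugin_filepath )
    return full_plugin_filepath

# resolve_plugin_directory( '/srv/galaxy', 'config/plugins/visualizations' )  -> resolved under the Galaxy root
# resolve_plugin_directory( '/srv/galaxy', '/opt/galaxy-plugins' )            -> absolute, used verbatim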
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.