galaxy-commits
February 2013
- 2 participants
- 189 discussions
14 Feb '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/c58f82badc26/
changeset: c58f82badc26
user: dan
date: 2013-02-14 21:20:38
summary: Enhance TabularToolDataTable.get_entry()
affected #: 1 file
diff -r caaf60ca4511b532b58ec5e1de85e39a0067ae0d -r c58f82badc26168885507312b71ac2351046bee8 lib/galaxy/tools/data/__init__.py
--- a/lib/galaxy/tools/data/__init__.py
+++ b/lib/galaxy/tools/data/__init__.py
@@ -280,23 +280,22 @@
rval.append( None )
return rval
- def get_entry( self, query_attr, query_val, return_attr ):
+ def get_entry( self, query_attr, query_val, return_attr, default=None ):
"""
Returns table entry associated with a col/val pair.
"""
query_col = self.columns.get( query_attr, None )
- if not query_col:
- return None
+ if query_col is None:
+ return default
return_col = self.columns.get( return_attr, None )
- if not return_col:
- return None
- rval = None
+ if return_col is None:
+ return default
+ rval = default
# Look for table entry.
for fields in self.data:
if fields[ query_col ] == query_val:
rval = fields[ return_col ]
break
-
return rval
# Registry of tool data types by type_key
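For context, a minimal usage sketch of the enhanced method. Besides the new default argument, the change switches the column checks from "if not query_col" to "if query_col is None", so a column stored at index 0 is no longer treated as missing. The table name, column names, and fallback path below are illustrative assumptions, not values shipped with Galaxy:

table = app.tool_data_tables.get( 'all_fasta', None )  # hypothetical tool data table
if table is not None:
    # Callers can now supply their own fallback instead of always getting None back.
    path = table.get_entry( 'value', 'hg19', 'path', default='/data/not/configured' )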
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
14 Feb '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/caaf60ca4511/
changeset: caaf60ca4511
user: dan
date: 2013-02-14 21:16:44
summary: Fix broken TabularToolDataTable.get_entry()
affected #: 1 file
diff -r 138642a8dc55689a3cab1d070aac72b293265bc9 -r caaf60ca4511b532b58ec5e1de85e39a0067ae0d lib/galaxy/tools/data/__init__.py
--- a/lib/galaxy/tools/data/__init__.py
+++ b/lib/galaxy/tools/data/__init__.py
@@ -290,7 +290,7 @@
return_col = self.columns.get( return_attr, None )
if not return_col:
return None
-
+ rval = None
# Look for table entry.
for fields in self.data:
if fields[ query_col ] == query_val:
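The bug being fixed: the previous revision only assigned rval inside the loop, so a query that matched no row reached "return rval" with rval unbound and raised UnboundLocalError. A stripped-down standalone reproduction (not Galaxy code; the row data is made up):

def broken_lookup( rows, query_col, query_val, return_col ):
    for fields in rows:
        if fields[ query_col ] == query_val:
            rval = fields[ return_col ]
            break
    return rval  # UnboundLocalError if nothing matched

# broken_lookup( [ ( 'hg19', '/data/hg19.fa' ) ], 0, 'mm9', 1 )  raises UnboundLocalError

Initializing rval = None before the loop (and, in the follow-up commit above, adding the default argument) restores a well-defined return value.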
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
14 Feb '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/138642a8dc55/
changeset: 138642a8dc55
user: dan
date: 2013-02-14 21:13:06
summary: First pass at a pluggable Data Manager. Data Managers can be defined locally or installed from a Tool Shed. TODO: Display Data Manager information in the Tool Shed. Do not install Data Managers into Tool Panel. Add tests. Write Docs.
affected #: 28 files
diff -r db08c095de0c249f3a6cde62f254c104de1fb6ea -r 138642a8dc55689a3cab1d070aac72b293265bc9 buildbot_setup.sh
--- a/buildbot_setup.sh
+++ b/buildbot_setup.sh
@@ -70,6 +70,8 @@
tool_data_table_conf.xml.sample
shed_tool_data_table_conf.xml.sample
migrated_tools_conf.xml.sample
+data_manager_conf.xml.sample
+shed_data_manager_conf.xml.sample
tool-data/shared/ensembl/builds.txt.sample
tool-data/shared/igv/igv_build_sites.txt.sample
tool-data/shared/ncbi/builds.txt.sample
diff -r db08c095de0c249f3a6cde62f254c104de1fb6ea -r 138642a8dc55689a3cab1d070aac72b293265bc9 data_manager_conf.xml.sample
--- /dev/null
+++ b/data_manager_conf.xml.sample
@@ -0,0 +1,3 @@
+<?xml version="1.0"?>
+<data_managers>
+</data_managers>
diff -r db08c095de0c249f3a6cde62f254c104de1fb6ea -r 138642a8dc55689a3cab1d070aac72b293265bc9 datatypes_conf.xml.sample
--- a/datatypes_conf.xml.sample
+++ b/datatypes_conf.xml.sample
@@ -59,6 +59,7 @@
<datatype extension="bowtie_base_index" type="galaxy.datatypes.ngsindex:BowtieBaseIndex" mimetype="text/html" display_in_upload="False"/><datatype extension="csfasta" type="galaxy.datatypes.sequence:csFasta" display_in_upload="true"/><datatype extension="data" type="galaxy.datatypes.data:Data" mimetype="application/octet-stream" max_optional_metadata_filesize="1048576" />
+ <datatype extension="data_manager_json" type="galaxy.datatypes.data:Text" mimetype="application/json" subclass="True" display_in_upload="False"/><datatype extension="fasta" type="galaxy.datatypes.sequence:Fasta" display_in_upload="true"><converter file="fasta_to_tabular_converter.xml" target_datatype="tabular"/><converter file="fasta_to_bowtie_base_index_converter.xml" target_datatype="bowtie_base_index"/>
diff -r db08c095de0c249f3a6cde62f254c104de1fb6ea -r 138642a8dc55689a3cab1d070aac72b293265bc9 lib/galaxy/app.py
--- a/lib/galaxy/app.py
+++ b/lib/galaxy/app.py
@@ -18,6 +18,7 @@
from galaxy.tools.genome_index import load_genome_index_tools
from galaxy.sample_tracking import external_service_types
from galaxy.openid.providers import OpenIDProviders
+from galaxy.tools.data_manager.manager import DataManagers
class UniverseApplication( object ):
"""Encapsulates the state of a Universe application"""
@@ -95,6 +96,8 @@
self.toolbox = tools.ToolBox( tool_configs, self.config.tool_path, self )
# Search support for tools
self.toolbox_search = galaxy.tools.search.ToolBoxSearch( self.toolbox )
+ # Load Data Manager
+ self.data_managers = DataManagers( self )
# If enabled, poll respective tool sheds to see if updates are available for any installed tool shed repositories.
if self.config.get_bool( 'enable_tool_shed_check', False ):
from tool_shed import update_manager
diff -r db08c095de0c249f3a6cde62f254c104de1fb6ea -r 138642a8dc55689a3cab1d070aac72b293265bc9 lib/galaxy/config.py
--- a/lib/galaxy/config.py
+++ b/lib/galaxy/config.py
@@ -75,6 +75,10 @@
except:
self.hours_between_check = 12
self.update_integrated_tool_panel = kwargs.get( "update_integrated_tool_panel", True )
+ self.enable_data_manager_user_view = string_as_bool( kwargs.get( "enable_data_manager_user_view", "False" ) )
+ self.data_manager_config_file = resolve_path( kwargs.get('data_manager_config_file', 'data_manager_conf.xml' ), self.root )
+ self.shed_data_manager_config_file = resolve_path( kwargs.get('shed_data_manager_config_file', 'shed_data_manager_conf.xml' ), self.root )
+ self.galaxy_data_manager_data_path = kwargs.get( 'galaxy_data_manager_data_path', self.tool_data_path )
self.tool_secret = kwargs.get( "tool_secret", "" )
self.id_secret = kwargs.get( "id_secret", "USING THE DEFAULT IS NOT SECURE!" )
self.set_metadata_externally = string_as_bool( kwargs.get( "set_metadata_externally", "False" ) )
diff -r db08c095de0c249f3a6cde62f254c104de1fb6ea -r 138642a8dc55689a3cab1d070aac72b293265bc9 lib/galaxy/model/__init__.py
--- a/lib/galaxy/model/__init__.py
+++ b/lib/galaxy/model/__init__.py
@@ -2998,6 +2998,20 @@
def set_item( self, visualization ):
self.visualization = visualization
+#Data Manager Classes
+class DataManagerHistoryAssociation( object ):
+ def __init__( self, id=None, history=None, user=None ):
+ self.id = id
+ self.history = history
+ self.user = user
+
+class DataManagerJobAssociation( object ):
+ def __init__( self, id=None, job=None, data_manager_id=None ):
+ self.id = id
+ self.job = job
+ self.data_manager_id = data_manager_id
+#end of Data Manager Classes
+
class UserPreference ( object ):
def __init__( self, name=None, value=None ):
self.name = name
@@ -3165,6 +3179,11 @@
return 'repository_dependencies' in self.metadata
return False
@property
+ def includes_data_managers( self ):
+ if self.metadata:
+ return bool( len( self.metadata.get( 'data_manager', {} ).get( 'data_managers', {} ) ) )
+ return False
+ @property
def includes_tools( self ):
if self.metadata:
return 'tools' in self.metadata
diff -r db08c095de0c249f3a6cde62f254c104de1fb6ea -r 138642a8dc55689a3cab1d070aac72b293265bc9 lib/galaxy/model/mapping.py
--- a/lib/galaxy/model/mapping.py
+++ b/lib/galaxy/model/mapping.py
@@ -932,6 +932,23 @@
Column( "user_id", Integer, ForeignKey( "galaxy_user.id" ), index=True )
)
+#Data Manager tables
+DataManagerHistoryAssociation.table = Table( "data_manager_history_association", metadata,
+ Column( "id", Integer, primary_key=True),
+ Column( "create_time", DateTime, default=now ),
+ Column( "update_time", DateTime, index=True, default=now, onupdate=now ),
+ Column( "history_id", Integer, ForeignKey( "history.id" ), index=True ),
+ Column( "user_id", Integer, ForeignKey( "galaxy_user.id" ), index=True )
+ )
+
+DataManagerJobAssociation.table = Table( "data_manager_job_association", metadata,
+ Column( "id", Integer, primary_key=True),
+ Column( "create_time", DateTime, default=now ),
+ Column( "update_time", DateTime, index=True, default=now, onupdate=now ),
+ Column( "job_id", Integer, ForeignKey( "job.id" ), index=True ),
+ Column( "data_manager_id", TEXT, index=True )
+ )
+
# Tagging tables.
Tag.table = Table( "tag", metadata,
@@ -1901,6 +1918,17 @@
properties=dict( visualization=relation( Visualization ), user=relation( User ) )
)
+#Data Manager tables
+assign_mapper( context, DataManagerHistoryAssociation, DataManagerHistoryAssociation.table,
+ properties=dict( history=relation( History ),
+ user=relation( User, backref='data_manager_histories' )
+ )
+ )
+
+assign_mapper( context, DataManagerJobAssociation, DataManagerJobAssociation.table,
+ properties=dict( job=relation( Job, backref=backref('data_manager_association', uselist=False ), uselist=False ) )
+ )
+
# User tables.
assign_mapper( context, UserPreference, UserPreference.table,
diff -r db08c095de0c249f3a6cde62f254c104de1fb6ea -r 138642a8dc55689a3cab1d070aac72b293265bc9 lib/galaxy/model/migrate/versions/0112_add_data_manager_history_association_and_data_manager_job_association_tables.py
--- /dev/null
+++ b/lib/galaxy/model/migrate/versions/0112_add_data_manager_history_association_and_data_manager_job_association_tables.py
@@ -0,0 +1,66 @@
+"""
+Migration script to add the data_manager_history_association table and data_manager_job_association.
+"""
+from sqlalchemy import *
+from sqlalchemy.orm import *
+from migrate import *
+from migrate.changeset import *
+import sys, logging
+from galaxy.model.custom_types import *
+from sqlalchemy.exc import *
+import datetime
+now = datetime.datetime.utcnow
+
+log = logging.getLogger( __name__ )
+log.setLevel( logging.DEBUG )
+handler = logging.StreamHandler( sys.stdout )
+format = "%(name)s %(levelname)s %(asctime)s %(message)s"
+formatter = logging.Formatter( format )
+handler.setFormatter( formatter )
+log.addHandler( handler )
+
+metadata = MetaData( migrate_engine )
+
+DataManagerHistoryAssociation_table = Table( "data_manager_history_association", metadata,
+ Column( "id", Integer, primary_key=True),
+ Column( "create_time", DateTime, default=now ),
+ Column( "update_time", DateTime, index=True, default=now, onupdate=now ),
+ Column( "history_id", Integer, ForeignKey( "history.id" ), index=True ),
+ Column( "user_id", Integer, ForeignKey( "galaxy_user.id" ), index=True )
+ )
+
+DataManagerJobAssociation_table = Table( "data_manager_job_association", metadata,
+ Column( "id", Integer, primary_key=True),
+ Column( "create_time", DateTime, default=now ),
+ Column( "update_time", DateTime, index=True, default=now, onupdate=now ),
+ Column( "job_id", Integer, ForeignKey( "job.id" ), index=True ),
+ Column( "data_manager_id", TEXT, index=True )
+ )
+
+def upgrade():
+ print __doc__
+ metadata.reflect()
+ try:
+ DataManagerHistoryAssociation_table.create()
+ log.debug( "Created data_manager_history_association table" )
+ except Exception, e:
+ log.debug( "Creating data_manager_history_association table failed: %s" % str( e ) )
+ try:
+ DataManagerJobAssociation_table.create()
+ log.debug( "Created data_manager_job_association table" )
+ except Exception, e:
+ log.debug( "Creating data_manager_job_association table failed: %s" % str( e ) )
+
+
+def downgrade():
+ metadata.reflect()
+ try:
+ DataManagerHistoryAssociation_table.drop()
+ log.debug( "Dropped data_manager_history_association table" )
+ except Exception, e:
+ log.debug( "Dropping data_manager_history_association table failed: %s" % str( e ) )
+ try:
+ DataManagerJobAssociation_table.drop()
+ log.debug( "Dropped data_manager_job_association table" )
+ except Exception, e:
+ log.debug( "Dropping data_manager_job_association table failed: %s" % str( e ) )
diff -r db08c095de0c249f3a6cde62f254c104de1fb6ea -r 138642a8dc55689a3cab1d070aac72b293265bc9 lib/galaxy/tools/__init__.py
--- a/lib/galaxy/tools/__init__.py
+++ b/lib/galaxy/tools/__init__.py
@@ -7,7 +7,7 @@
pkg_resources.require( "MarkupSafe" ) #MarkupSafe must load before mako
pkg_resources.require( "Mako" )
-import logging, os, string, sys, tempfile, glob, shutil, types, urllib, subprocess, random, math, traceback, re
+import logging, os, string, sys, tempfile, glob, shutil, types, urllib, subprocess, random, math, traceback, re, pipes
import simplejson
import binascii
from mako.template import Template
@@ -26,6 +26,7 @@
from galaxy.util.expressions import ExpressionContext
from galaxy.tools.test import ToolTestBuilder
from galaxy.tools.actions import DefaultToolAction
+from galaxy.tools.actions.data_manager import DataManagerToolAction
from galaxy.tools.deps import DependencyManager
from galaxy.model import directory_hash_id
from galaxy.model.orm import *
@@ -86,6 +87,7 @@
# In-memory dictionary that defines the layout of the tool panel.
self.tool_panel = odict()
self.index = 0
+ self.data_manager_tools = odict()
# File that contains the XML section and tool tags from all tool panel config files integrated into a
# single file that defines the tool panel layout. This file can be changed by the Galaxy administrator
# (in a way similar to the single tool_conf.xml file in the past) to alter the layout of the tool panel.
@@ -515,7 +517,7 @@
self.integrated_tool_panel[ key ] = integrated_section
else:
self.integrated_tool_panel.insert( index, key, integrated_section )
- def load_tool( self, config_file, guid=None ):
+ def load_tool( self, config_file, guid=None, **kwds ):
"""Load a single tool from the file named by `config_file` and return an instance of `Tool`."""
# Parse XML configuration file and get the root element
tree = util.parse_xml( config_file )
@@ -531,7 +533,7 @@
ToolClass = tool_types.get( root.get( 'tool_type' ) )
else:
ToolClass = Tool
- return ToolClass( config_file, root, self.app, guid=guid )
+ return ToolClass( config_file, root, self.app, guid=guid, **kwds )
def reload_tool_by_id( self, tool_id ):
"""
Attempt to reload the tool identified by 'tool_id', if successful
@@ -569,6 +571,34 @@
message += "<b>version:</b> %s" % old_tool.version
status = 'done'
return message, status
+ def remove_tool_by_id( self, tool_id ):
+ """
+ Attempt to remove the tool identified by 'tool_id'.
+ """
+ if tool_id not in self.tools_by_id:
+ message = "No tool with id %s" % tool_id
+ status = 'error'
+ else:
+ tool = self.tools_by_id[ tool_id ]
+ del self.tools_by_id[ tool_id ]
+ tool_key = 'tool_' + tool_id
+ for key, val in self.tool_panel.items():
+ if key == tool_key:
+ del self.tool_panel[ key ]
+ break
+ elif key.startswith( 'section' ):
+ if tool_key in val.elems:
+ del self.tool_panel[ key ].elems[ tool_key ]
+ break
+ if tool_id in self.data_manager_tools:
+ del self.data_manager_tools[ tool_id ]
+ #TODO: do we need to manually remove from the integrated panel here?
+ message = "Removed the tool:<br/>"
+ message += "<b>name:</b> %s<br/>" % tool.name
+ message += "<b>id:</b> %s<br/>" % tool.id
+ message += "<b>version:</b> %s" % tool.version
+ status = 'done'
+ return message, status
def load_workflow( self, workflow_id ):
"""
Return an instance of 'Workflow' identified by `id`,
@@ -818,6 +848,7 @@
"""
tool_type = 'default'
+ default_tool_action = DefaultToolAction
def __init__( self, config_file, root, app, guid=None ):
"""Load a tool from the config named by `config_file`"""
@@ -1070,7 +1101,7 @@
# Action
action_elem = root.find( "action" )
if action_elem is None:
- self.tool_action = DefaultToolAction()
+ self.tool_action = self.default_tool_action()
else:
module = action_elem.get( 'module' )
cls = action_elem.get( 'class' )
@@ -2612,12 +2643,14 @@
extra_dir="dataset_%d_files" % hda.dataset.id,
alt_name = f,
file_name = os.path.join(temp_file_path, f),
- create = True)
+ create = True,
+ preserve_symlinks = True )
# Clean up after being handled by object store.
# FIXME: If the object (e.g., S3) becomes async, this will
# cause issues so add it to the object store functionality?
shutil.rmtree(temp_file_path)
- except:
+ except Exception, e:
+ log.debug( "Error in collect_associated_files: %s" % ( e ) )
continue
def collect_child_datasets( self, output, job_working_directory ):
"""
@@ -2843,7 +2876,64 @@
return tool_dict
-class DataSourceTool( Tool ):
+ def get_default_history_by_trans( self, trans, create=False ):
+ return trans.get_history( create=create )
+
+
+class OutputParameterJSONTool( Tool ):
+ """
+ Alternate implementation of Tool that provides parameters and other values
+ JSONified within the contents of an output dataset
+ """
+ tool_type = 'output_parameter_json'
+ def _prepare_json_list( self, param_list ):
+ rval = []
+ for value in param_list:
+ if isinstance( value, dict ):
+ rval.append( self._prepare_json_param_dict( value ) )
+ elif isinstance( value, list ):
+ rval.append( self._prepare_json_list( value ) )
+ else:
+ rval.append( str( value ) )
+ return rval
+ def _prepare_json_param_dict( self, param_dict ):
+ rval = {}
+ for key, value in param_dict.iteritems():
+ if isinstance( value, dict ):
+ rval[ key ] = self._prepare_json_param_dict( value )
+ elif isinstance( value, list ):
+ rval[ key ] = self._prepare_json_list( value )
+ else:
+ rval[ key ] = str( value )
+ return rval
+ def exec_before_job( self, app, inp_data, out_data, param_dict=None ):
+ if param_dict is None:
+ param_dict = {}
+ json_params = {}
+ json_params[ 'param_dict' ] = self._prepare_json_param_dict( param_dict ) #it would probably be better to store the original incoming parameters here, instead of the Galaxy modified ones?
+ json_params[ 'output_data' ] = []
+ json_params[ 'job_config' ] = dict( GALAXY_DATATYPES_CONF_FILE=param_dict.get( 'GALAXY_DATATYPES_CONF_FILE' ), GALAXY_ROOT_DIR=param_dict.get( 'GALAXY_ROOT_DIR' ), TOOL_PROVIDED_JOB_METADATA_FILE=jobs.TOOL_PROVIDED_JOB_METADATA_FILE )
+ json_filename = None
+ for i, ( out_name, data ) in enumerate( out_data.iteritems() ):
+ #use wrapped dataset to access certain values
+ wrapped_data = param_dict.get( out_name )
+ #allow multiple files to be created
+ file_name = str( wrapped_data )
+ extra_files_path = str( wrapped_data.files_path )
+ data_dict = dict( out_data_name = out_name,
+ ext = data.ext,
+ dataset_id = data.dataset.id,
+ hda_id = data.id,
+ file_name = file_name,
+ extra_files_path = extra_files_path )
+ json_params[ 'output_data' ].append( data_dict )
+ if json_filename is None:
+ json_filename = file_name
+ out = open( json_filename, 'w' )
+ out.write( simplejson.dumps( json_params ) )
+ out.close()
+
+class DataSourceTool( OutputParameterJSONTool ):
"""
Alternate implementation of Tool for data_source tools -- those that
allow the user to query and extract data from another web site.
@@ -2853,29 +2943,10 @@
def _build_GALAXY_URL_parameter( self ):
return ToolParameter.build( self, ElementTree.XML( '<param name="GALAXY_URL" type="baseurl" value="/tool_runner?tool_id=%s" />' % self.id ) )
def parse_inputs( self, root ):
- Tool.parse_inputs( self, root )
+ super( DataSourceTool, self ).parse_inputs( root )
if 'GALAXY_URL' not in self.inputs:
self.inputs[ 'GALAXY_URL' ] = self._build_GALAXY_URL_parameter()
- def _prepare_datasource_json_list( self, param_list ):
- rval = []
- for value in param_list:
- if isinstance( value, dict ):
- rval.append( self._prepare_datasource_json_param_dict( value ) )
- elif isinstance( value, list ):
- rval.append( self._prepare_datasource_json_list( value ) )
- else:
- rval.append( str( value ) )
- return rval
- def _prepare_datasource_json_param_dict( self, param_dict ):
- rval = {}
- for key, value in param_dict.iteritems():
- if isinstance( value, dict ):
- rval[ key ] = self._prepare_datasource_json_param_dict( value )
- elif isinstance( value, list ):
- rval[ key ] = self._prepare_datasource_json_list( value )
- else:
- rval[ key ] = str( value )
- return rval
+ self.inputs_by_page[0][ 'GALAXY_URL' ] = self.inputs[ 'GALAXY_URL' ]
def exec_before_job( self, app, inp_data, out_data, param_dict=None ):
if param_dict is None:
param_dict = {}
@@ -2885,7 +2956,7 @@
name = param_dict.get( 'name' )
json_params = {}
- json_params[ 'param_dict' ] = self._prepare_datasource_json_param_dict( param_dict ) #it would probably be better to store the original incoming parameters here, instead of the Galaxy modified ones?
+ json_params[ 'param_dict' ] = self._prepare_json_param_dict( param_dict ) #it would probably be better to store the original incoming parameters here, instead of the Galaxy modified ones?
json_params[ 'output_data' ] = []
json_params[ 'job_config' ] = dict( GALAXY_DATATYPES_CONF_FILE=param_dict.get( 'GALAXY_DATATYPES_CONF_FILE' ), GALAXY_ROOT_DIR=param_dict.get( 'GALAXY_ROOT_DIR' ), TOOL_PROVIDED_JOB_METADATA_FILE=jobs.TOOL_PROVIDED_JOB_METADATA_FILE )
json_filename = None
@@ -2976,9 +3047,59 @@
class GenomeIndexTool( Tool ):
tool_type = 'index_genome'
+class DataManagerTool( OutputParameterJSONTool ):
+ tool_type = 'manage_data'
+ default_tool_action = DataManagerToolAction
+
+ def __init__( self, config_file, root, app, guid=None, data_manager_id=None, **kwds ):
+ self.data_manager_id = data_manager_id
+ super( DataManagerTool, self ).__init__( config_file, root, app, guid=guid, **kwds )
+ if self.data_manager_id is None:
+ self.data_manager_id = self.id
+
+ def exec_after_process( self, app, inp_data, out_data, param_dict, job = None, **kwds ):
+ #run original exec_after_process
+ super( DataManagerTool, self ).exec_after_process( app, inp_data, out_data, param_dict, job = job, **kwds )
+ #process results of tool
+ if job and job.state == job.states.ERROR:
+ return
+ data_manager_id = job.data_manager_association.data_manager_id
+ data_manager = self.app.data_managers.get_manager( data_manager_id, None )
+ assert data_manager is not None, "Invalid data manager (%s) requested. It may have been removed before the job completed." % ( data_manager_id )
+ data_manager.process_result( out_data )
+
+ def get_default_history_by_trans( self, trans, create=False ):
+ def _create_data_manager_history( user ):
+ history = trans.app.model.History( name='Data Manager History (automatically created)', user=user )
+ data_manager_association = trans.app.model.DataManagerHistoryAssociation( user=user, history=history )
+ trans.sa_session.add_all( ( history, data_manager_association ) )
+ trans.sa_session.flush()
+ return history
+ user = trans.user
+ assert user, 'You must be logged in to use this tool.'
+ history = user.data_manager_histories
+ if not history:
+ #create
+ if create:
+ history = _create_data_manager_history( user )
+ else:
+ history = None
+ else:
+ for history in reversed( history ):
+ history = history.history
+ if not history.deleted:
+ break
+ if history.deleted:
+ if create:
+ history = _create_data_manager_history( user )
+ else:
+ history = None
+ return history
+
+
# Populate tool_type to ToolClass mappings
tool_types = {}
-for tool_class in [ Tool, DataDestinationTool, SetMetadataTool, DataSourceTool, AsyncDataSourceTool ]:
+for tool_class in [ Tool, DataDestinationTool, SetMetadataTool, DataSourceTool, AsyncDataSourceTool, DataManagerTool ]:
tool_types[ tool_class.tool_type ] = tool_class
# ---- Utility classes to be factored out -----------------------------------
@@ -3020,6 +3141,8 @@
"""
def __nonzero__( self ):
return bool( self.value )
+ def get_display_text( self, quote=True ):
+ return pipes.quote( self.input.value_to_display_text( self.value, self.input.tool.app ) )
class RawObjectWrapper( ToolParameterValueWrapper ):
"""
diff -r db08c095de0c249f3a6cde62f254c104de1fb6ea -r 138642a8dc55689a3cab1d070aac72b293265bc9 lib/galaxy/tools/actions/__init__.py
--- a/lib/galaxy/tools/actions/__init__.py
+++ b/lib/galaxy/tools/actions/__init__.py
@@ -168,7 +168,7 @@
# Set history.
if not history:
- history = trans.history
+ history = tool.get_default_history_by_trans( trans, create=True )
out_data = odict()
# Collect any input datasets from the incoming parameters
diff -r db08c095de0c249f3a6cde62f254c104de1fb6ea -r 138642a8dc55689a3cab1d070aac72b293265bc9 lib/galaxy/tools/actions/data_manager.py
--- /dev/null
+++ b/lib/galaxy/tools/actions/data_manager.py
@@ -0,0 +1,17 @@
+from __init__ import DefaultToolAction
+
+import logging
+log = logging.getLogger( __name__ )
+
+class DataManagerToolAction( DefaultToolAction ):
+ """Tool action used for Data Manager Tools"""
+
+ def execute( self, tool, trans, **kwds ):
+ rval = super( DataManagerToolAction, self ).execute( tool, trans, **kwds )
+ if isinstance( rval, tuple ) and len( rval ) == 2 and isinstance( rval[0], trans.app.model.Job ):
+ assoc = trans.app.model.DataManagerJobAssociation( job=rval[0], data_manager_id=tool.data_manager_id )
+ trans.sa_session.add( assoc )
+ trans.sa_session.flush()
+ else:
+ log.error( "Got bad return value from DefaultToolAction.execute(): %s" % ( rval ) )
+ return rval
diff -r db08c095de0c249f3a6cde62f254c104de1fb6ea -r 138642a8dc55689a3cab1d070aac72b293265bc9 lib/galaxy/tools/data/__init__.py
--- a/lib/galaxy/tools/data/__init__.py
+++ b/lib/galaxy/tools/data/__init__.py
@@ -130,6 +130,8 @@
def __init__( self, config_element, tool_data_path ):
self.name = config_element.get( 'name' )
self.comment_char = config_element.get( 'comment_char' )
+ self.empty_field_value = config_element.get( 'empty_field_value', '' )
+ self.empty_field_values = {}
for file_elem in config_element.findall( 'file' ):
# There should only be one file_elem.
if 'path' in file_elem.attrib:
@@ -139,6 +141,8 @@
self.tool_data_file = None
self.tool_data_path = tool_data_path
self.missing_index_file = None
+ def get_empty_field_by_name( self, name ):
+ return self.empty_field_values.get( name, self.empty_field_value )
class TabularToolDataTable( ToolDataTable ):
"""
@@ -182,6 +186,7 @@
if os.path.exists( filename ):
found = True
all_rows.extend( self.parse_file_fields( open( filename ) ) )
+ self.filename = filename
else:
# Since the path attribute can include a hard-coded path to a specific directory
# (e.g., <file path="tool-data/cg_crr_files.loc" />) which may not be the same value
@@ -193,6 +198,7 @@
if os.path.exists( corrected_filename ):
found = True
all_rows.extend( self.parse_file_fields( open( corrected_filename ) ) )
+ self.filename = corrected_filename
if not found:
self.missing_index_file = filename
log.warn( "Cannot find index file '%s' for tool data table '%s'" % ( filename, self.name ) )
@@ -231,6 +237,9 @@
self.columns[name] = index
if index > self.largest_index:
self.largest_index = index
+ empty_field_value = column_elem.get( 'empty_field_value', None )
+ if empty_field_value is not None:
+ self.empty_field_values[ name ] = empty_field_value
assert 'value' in self.columns, "Required 'value' column missing from column def"
if 'name' not in self.columns:
self.columns['name'] = self.columns['value']
@@ -257,7 +266,20 @@
"'%s' characters must be used to separate fields):\n%s"
% ( ( i + 1 ), self.name, separator_char, line ) )
return rval
-
+
+ def get_column_name_list( self ):
+ rval = []
+ for i in range( self.largest_index + 1 ):
+ found_column = False
+ for name, index in self.columns.iteritems():
+ if index == i:
+ rval.append( name )
+ found_column = True
+ break
+ if not found_column:
+ rval.append( None )
+ return rval
+
def get_entry( self, query_attr, query_val, return_attr ):
"""
Returns table entry associated with a col/val pair.
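A quick sketch of what the two new TabularToolDataTable helpers return, using a hypothetical column layout (attributes are set directly here purely to illustrate the behavior of the code above):

# Hypothetical layout: 'value' -> 0, 'dbkey' -> 1, 'path' -> 3; index 2 is undeclared.
table.columns = { 'value': 0, 'dbkey': 1, 'path': 3 }
table.largest_index = 3
table.get_column_name_list()                 # -> [ 'value', 'dbkey', None, 'path' ]

table.empty_field_value = '.'                # table-wide default from the empty_field_value attribute
table.empty_field_values = { 'dbkey': '?' }  # per-column override, e.g. <column ... empty_field_value="?">
table.get_empty_field_by_name( 'dbkey' )     # -> '?'
table.get_empty_field_by_name( 'path' )      # -> '.'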
diff -r db08c095de0c249f3a6cde62f254c104de1fb6ea -r 138642a8dc55689a3cab1d070aac72b293265bc9 lib/galaxy/tools/data_manager/__init__.py
--- /dev/null
+++ b/lib/galaxy/tools/data_manager/__init__.py
@@ -0,0 +1,3 @@
+"""
+Data Manager
+"""
diff -r db08c095de0c249f3a6cde62f254c104de1fb6ea -r 138642a8dc55689a3cab1d070aac72b293265bc9 lib/galaxy/tools/data_manager/manager.py
--- /dev/null
+++ b/lib/galaxy/tools/data_manager/manager.py
@@ -0,0 +1,305 @@
+import pkg_resources
+
+pkg_resources.require( "simplejson" )
+
+import os, shutil, errno
+import simplejson
+
+from galaxy import util
+from galaxy.util.odict import odict
+from galaxy.util.template import fill_template
+from galaxy.tools.data import TabularToolDataTable
+
+#set up logger
+import logging
+log = logging.getLogger( __name__ )
+
+SUPPORTED_DATA_TABLE_TYPES = ( TabularToolDataTable )
+
+class DataManagers( object ):
+ def __init__( self, app, xml_filename=None ):
+ self.app = app
+ self.data_managers = odict()
+ self.managed_data_tables = odict()
+ self.tool_path = None
+ self.filename = xml_filename or self.app.config.data_manager_config_file
+ self.load_from_xml( self.filename )
+ if self.app.config.shed_data_manager_config_file:
+ self.load_from_xml( self.app.config.shed_data_manager_config_file, store_tool_path=False )
+ def load_from_xml( self, xml_filename, store_tool_path=True ):
+ try:
+ tree = util.parse_xml( xml_filename )
+ except Exception, e:
+ log.error( 'There was an error parsing your Data Manager config file "%s": %s' % ( xml_filename, e ) )
+ return #we are not able to load any data managers
+ root = tree.getroot()
+ if root.tag != 'data_managers':
+ log.error( 'A data managers configuration must have a "data_managers" tag as the root. "%s" is present' % ( root.tag ) )
+ return
+ if store_tool_path:
+ tool_path = root.get( 'tool_path', None )
+ if tool_path is None:
+ tool_path = self.app.config.tool_path
+ if not tool_path:
+ tool_path = '.'
+ self.tool_path = tool_path
+ for data_manager_elem in root.findall( 'data_manager' ):
+ self.load_manager_from_elem( data_manager_elem )
+ def load_manager_from_elem( self, data_manager_elem, tool_path=None, add_manager=True ):
+ try:
+ data_manager = DataManager( self, data_manager_elem, tool_path=tool_path )
+ except Exception, e:
+ log.error( "Error loading data_manager '%s':\n%s" % ( e, util.xml_to_string( data_manager_elem ) ) )
+ return None
+ if add_manager:
+ self.add_manager( data_manager )
+ log.debug( 'Loaded Data Manager: %s' % ( data_manager.id ) )
+ return data_manager
+ def add_manager( self, data_manager ):
+ assert data_manager.id not in self.data_managers, "A data manager has been defined twice: %s" % ( data_manager.id )
+ self.data_managers[ data_manager.id ] = data_manager
+ for data_table_name in data_manager.data_tables.keys():
+ if data_table_name not in self.managed_data_tables:
+ self.managed_data_tables[ data_table_name ] = []
+ self.managed_data_tables[ data_table_name ].append( data_manager )
+ def get_manager( self, *args, **kwds ):
+ return self.data_managers.get( *args, **kwds )
+ def remove_manager( self, manager_id ):
+ data_manager = self.get_manager( manager_id, None )
+ if data_manager is not None:
+ del self.data_managers[ manager_id ]
+ #remove tool from toolbox
+ if data_manager.tool:
+ self.app.toolbox.remove_tool_by_id( data_manager.tool.id )
+ #determine if any data_tables are no longer tracked
+ for data_table_name in data_manager.data_tables.keys():
+ remove_data_table_tracking = True
+ for other_data_manager in self.data_managers.itervalues():
+ if data_table_name in other_data_manager.data_tables:
+ remove_data_table_tracking = False
+ break
+ if remove_data_table_tracking and data_table_name in self.managed_data_tables:
+ del self.managed_data_tables[ data_table_name ]
+
+class DataManager( object ):
+ def __init__( self, data_managers, elem=None, tool_path=None ):
+ self.data_managers = data_managers
+ self.declared_id = None
+ self.name = None
+ self.description = None
+ self.tool = None
+ self.tool_guid = None
+ self.data_tables = odict()
+ self.output_ref_by_data_table = {}
+ self.move_by_data_table_column = {}
+ self.value_translation_by_data_table_column = {}
+ if elem is not None:
+ self.load_from_element( elem, tool_path or self.data_managers.tool_path )
+ def load_from_element( self, elem, tool_path ):
+ assert elem.tag == 'data_manager', 'A data manager configuration must have a "data_manager" tag as the root. "%s" is present' % ( root.tag )
+ self.declared_id = elem.get( 'id', None )
+ path = elem.get( 'tool_file', None )
+ if path is None:
+ tool_elem = elem.find( 'tool' )
+ assert tool_elem is not None, "Error loading tool for data manager. Make sure that a tool_file attribute or a tool tag set has been defined:\n%s" % ( util.xml_to_string( elem ) )
+ path = tool_elem.get( "file", None )
+ self.tool_guid = tool_elem.get( "guid", None )
+ #use shed_conf_file to determine tool_path
+ shed_conf_file = elem.get( "shed_conf_file", None )
+ if shed_conf_file:
+ shed_conf = self.data_managers.app.toolbox.get_shed_config_dict_by_filename( shed_conf_file, None )
+ if shed_conf:
+ tool_path = shed_conf.get( "tool_path", tool_path )
+ assert path is not None, "A tool file path could not be determined:\n%s" % ( util.xml_to_string( elem ) )
+ self.load_tool( os.path.join( tool_path, path ), guid=self.tool_guid, data_manager_id=self.id )
+ self.name = elem.get( 'name', self.tool.name )
+ self.description = elem.get( 'description', self.tool.description )
+
+ for data_table_elem in elem.findall( 'data_table' ):
+ data_table_name = data_table_elem.get( "name" )
+ assert data_table_name is not None, "A name is required for a data table entry"
+ if data_table_name not in self.data_tables:
+ self.data_tables[ data_table_name ] = odict()#{}
+ output_elem = data_table_elem.find( 'output' )
+ if output_elem is not None:
+ for column_elem in output_elem.findall( 'column' ):
+ column_name = column_elem.get( 'name', None )
+ assert column_name is not None, "Name is required for column entry"
+ data_table_coumn_name = column_elem.get( 'data_table_name', column_name )
+ self.data_tables[ data_table_name ][ data_table_coumn_name ] = column_name
+ output_ref = column_elem.get( 'output_ref', None )
+ if output_ref is not None:
+ if data_table_name not in self.output_ref_by_data_table:
+ self.output_ref_by_data_table[ data_table_name ] = {}
+ self.output_ref_by_data_table[ data_table_name ][ data_table_coumn_name ] = output_ref
+ value_translation_elem = column_elem.find( 'value_translation' )
+ if value_translation_elem is not None:
+ value_translation = value_translation_elem.text
+ else:
+ value_translation = None
+ if value_translation is not None:
+ if data_table_name not in self.value_translation_by_data_table_column:
+ self.value_translation_by_data_table_column[ data_table_name ] = {}
+ self.value_translation_by_data_table_column[ data_table_name ][ data_table_coumn_name ] = value_translation
+
+ for move_elem in column_elem.findall( 'move' ):
+ move_type = move_elem.get( 'type', 'directory' )
+ relativize_symlinks = move_elem.get( 'relativize_symlinks', False ) #TODO: should we instead always relativize links?
+ source_elem = move_elem.find( 'source' )
+ if source_elem is None:
+ source_base = None
+ source_value = ''
+ else:
+ source_base = source_elem.get( 'base', None )
+ source_value = source_elem.text
+ target_elem = move_elem.find( 'target' )
+ if target_elem is None:
+ target_base = None
+ target_value = ''
+ else:
+ target_base = target_elem.get( 'base', None )
+ target_value = target_elem.text
+ if data_table_name not in self.move_by_data_table_column:
+ self.move_by_data_table_column[ data_table_name ] = {}
+ self.move_by_data_table_column[ data_table_name ][ data_table_coumn_name ] = dict( type=move_type, source_base=source_base, source_value=source_value, target_base=target_base, target_value=target_value, relativize_symlinks=relativize_symlinks )
+ @property
+ def id( self ):
+ return self.tool_guid or self.declared_id #if we have a tool with a guid, we will use that as the tool_manager id
+ def load_tool( self, tool_filename, guid=None, data_manager_id=None ):
+ tool = self.data_managers.app.toolbox.load_tool( tool_filename, guid=guid, data_manager_id=data_manager_id )
+ self.data_managers.app.toolbox.data_manager_tools[ tool.id ] = tool
+ self.data_managers.app.toolbox.tools_by_id[ tool.id ] = tool
+ self.tool = tool
+ return tool
+
+ def process_result( self, out_data ):
+ data_manager_dicts = {}
+ data_manager_dict = {}
+ #TODO: fix this merging below
+ for output_name, output_dataset in out_data.iteritems():
+ try:
+ output_dict = simplejson.loads( open( output_dataset.file_name ).read() )
+ except Exception, e:
+ log.warning( 'Error reading DataManagerTool json for "%s": %s' % ( output_name, e ) )
+ continue
+ data_manager_dicts[ output_name ] = output_dict
+ for key, value in output_dict.iteritems():
+ if key not in data_manager_dict:
+ data_manager_dict[ key ] = {}
+ data_manager_dict[ key ].update( value )
+ data_manager_dict.update( output_dict )
+
+ data_tables_dict = data_manager_dict.get( 'data_tables', {} )
+ for data_table_name, data_table_columns in self.data_tables.iteritems():
+ data_table_values = data_tables_dict.pop( data_table_name, None )
+ if not data_table_values:
+ log.warning( 'No values for data table "%s" were returned by the data manager "%s".' % ( data_table_name, self.id ) )
+ continue #next data table
+ data_table = self.data_managers.app.tool_data_tables.get( data_table_name, None )
+ if data_table is None:
+ log.error( 'The data manager "%s" returned an unknown data table "%s" with new entries "%s". These entries will not be created. Please confirm that an entry for "%s" exists in your "%s" file.' % ( self.id, data_table_name, data_table_values, data_table_name, 'tool_data_table_conf.xml' ) )
+ continue #next table name
+ if not isinstance( data_table, SUPPORTED_DATA_TABLE_TYPES ):
+ log.error( 'The data manager "%s" returned an unsupported data table "%s" with type "%s" with new entries "%s". These entries will not be created. Please confirm that the data table is of a supported type (%s).' % ( self.id, data_table_name, type( data_table ), data_table_values, SUPPORTED_DATA_TABLE_TYPES ) )
+ continue #next table name
+ output_ref_values = {}
+ if data_table_name in self.output_ref_by_data_table:
+ for data_table_column, output_ref in self.output_ref_by_data_table[ data_table_name ].iteritems():
+ output_ref_dataset = out_data.get( output_ref, None )
+ assert output_ref_dataset is not None, "Referenced output was not found."
+ output_ref_values[ data_table_column ] = output_ref_dataset
+
+ final_data_table_values = []
+ if not isinstance( data_table_values, list ):
+ data_table_values = [ data_table_values ]
+ columns = data_table.get_column_name_list()
+ #FIXME: Need to lock these files for editing
+ try:
+ data_table_fh = open( data_table.filename, 'r+b' )
+ except IOError, e:
+ log.warning( 'Error opening data table file (%s) with r+b, assuming file does not exist and will open as wb: %s' % ( data_table.filename, e ) )
+ data_table_fh = open( data_table.filename, 'wb' )
+ if os.stat( data_table.filename )[6] != 0:
+ # ensure last existing line ends with new line
+ data_table_fh.seek( -1, 2 ) #last char in file
+ last_char = data_table_fh.read()
+ if last_char not in [ '\n', '\r' ]:
+ data_table_fh.write( '\n' )
+ for data_table_row in data_table_values:
+ data_table_value = dict( **data_table_row ) #keep original values here
+ for name, value in data_table_row.iteritems(): #FIXME: need to loop through here based upon order listed in data_manager config
+ if name in output_ref_values:
+ moved = self.process_move( data_table_name, name, output_ref_values[ name ].extra_files_path, **data_table_value )
+ data_table_value[ name ] = self.process_value_translation( data_table_name, name, **data_table_value )
+ final_data_table_values.append( data_table_value )
+ fields = []
+ for column_name in columns:
+ if column_name is None or column_name not in data_table_value:
+ fields.append( data_table.get_empty_field_by_name( column_name ) )
+ else:
+ fields.append( data_table_value[ column_name ] )
+ #should we add a comment to file about automatically generated value here?
+ data_table_fh.write( "%s\n" % ( data_table.separator.join( self._replace_field_separators( fields, separator=data_table.separator ) ) ) ) #write out fields to disk
+ data_table.data.append( fields ) #add fields to loaded data table
+ data_table_fh.close()
+ for data_table_name, data_table_values in data_tables_dict.iteritems():
+ #tool returned extra data table entries, but data table was not declared in data manager
+ #do not add these values, but do provide messages
+ log.warning( 'The data manager "%s" returned an undeclared data table "%s" with new entries "%s". These entries will not be created. Please confirm that an entry for "%s" exists in your "%s" file.' % ( self.id, data_table_name, data_table_values, data_table_name, self.data_managers.filename ) )
+ def _replace_field_separators( self, fields, separator="\t", replace=None, comment_char=None ):
+ #make sure none of the fields contain separator
+ #make sure separator replace is different from comment_char,
+ #due to possible leading replace
+ if replace is None:
+ if separator == " ":
+ if comment_char == "\t":
+ replace = "_"
+ else:
+ replace = "\t"
+ else:
+ if comment_char == " ":
+ replace = "_"
+ else:
+ replace = " "
+ return map( lambda x: x.replace( separator, replace ), fields )
+ def process_move( self, data_table_name, column_name, source_base_path, relative_symlinks=False, **kwd ):
+ if data_table_name in self.move_by_data_table_column and column_name in self.move_by_data_table_column[ data_table_name ]:
+ move_dict = self.move_by_data_table_column[ data_table_name ][ column_name ]
+ source = move_dict[ 'source_base' ]
+ if source is None:
+ source = source_base_path
+ else:
+ source = fill_template( source, GALAXY_DATA_MANAGER_DATA_PATH=self.data_managers.app.config.galaxy_data_manager_data_path, **kwd )
+ if move_dict[ 'source_value' ]:
+ source = os.path.join( source, fill_template( move_dict[ 'source_value' ], GALAXY_DATA_MANAGER_DATA_PATH=self.data_managers.app.config.galaxy_data_manager_data_path, **kwd ) )
+ target = move_dict[ 'target_base' ]
+ if target is None:
+ target = self.data_managers.app.config.galaxy_data_manager_data_path
+ else:
+ target = fill_template( target, GALAXY_DATA_MANAGER_DATA_PATH=self.data_managers.app.config.galaxy_data_manager_data_path, **kwd )
+ if move_dict[ 'target_value' ]:
+ target = os.path.join( target, fill_template( move_dict[ 'target_value' ], GALAXY_DATA_MANAGER_DATA_PATH=self.data_managers.app.config.galaxy_data_manager_data_path, **kwd ) )
+
+ if move_dict[ 'type' ] == 'file':
+ dirs, filename = os.path.split( target )
+ try:
+ os.makedirs( dirs )
+ except OSError, e:
+ if e.errno != errno.EEXIST:
+ raise e
+ #log.debug( 'Error creating directory "%s": %s' % ( dirs, e ) )
+ #moving a directory and the target already exists, we move the contents instead
+ util.move_merge( source, target )
+
+ if move_dict.get( 'relativize_symlinks', False ):
+ util.relativize_symlinks( target )
+
+ return True
+ return False
+
+ def process_value_translation( self, data_table_name, column_name, **kwd ):
+ value = kwd.get( column_name )
+ if data_table_name in self.value_translation_by_data_table_column and column_name in self.value_translation_by_data_table_column[ data_table_name ]:
+ value_translation = self.value_translation_by_data_table_column[ data_table_name ][ column_name ]
+ value = fill_template( value_translation, GALAXY_DATA_MANAGER_DATA_PATH=self.data_managers.app.config.galaxy_data_manager_data_path, **kwd )
+ return value
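Putting process_result() above in concrete terms: a data manager tool writes JSON whose 'data_tables' entry maps each declared table name to a row dict (or list of row dicts) keyed by data table column names. Each row is then run through any configured <move> and <value_translation> handling, appended to the table's .loc file, and added to the in-memory table. The table name, column names, and values below are hypothetical:

# Illustrative JSON payload produced by a hypothetical data manager tool.
data_manager_json = {
    'data_tables': {
        'all_fasta': [
            { 'value': 'hg19', 'dbkey': 'hg19', 'name': 'Human (hg19)', 'path': 'hg19/hg19.fa' },
        ],
    },
}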
diff -r db08c095de0c249f3a6cde62f254c104de1fb6ea -r 138642a8dc55689a3cab1d070aac72b293265bc9 lib/galaxy/tools/parameters/basic.py
--- a/lib/galaxy/tools/parameters/basic.py
+++ b/lib/galaxy/tools/parameters/basic.py
@@ -880,6 +880,9 @@
>>> print p.filter_value( "hg17" )
hg17
"""
+ def __init__( self, *args, **kwds ):
+ super( GenomeBuildParameter, self ).__init__( *args, **kwds )
+ self.static_options = [ ( value, key, False ) for key, value in util.dbnames ]
def get_options( self, trans, other_values ):
if not trans.history:
yield 'unspecified', '?', False
diff -r db08c095de0c249f3a6cde62f254c104de1fb6ea -r 138642a8dc55689a3cab1d070aac72b293265bc9 lib/galaxy/util/__init__.py
--- a/lib/galaxy/util/__init__.py
+++ b/lib/galaxy/util/__init__.py
@@ -578,6 +578,22 @@
return curdir
return join( *rel_list )
+def relativize_symlinks( path, start=None, followlinks=False):
+ for root, dirs, files in os.walk( path, followlinks=followlinks ):
+ rel_start = None
+ for file_name in files:
+ symlink_file_name = os.path.join( root, file_name )
+ if os.path.islink( symlink_file_name ):
+ symlink_target = os.readlink( symlink_file_name )
+ if rel_start is None:
+ if start is None:
+ rel_start = root
+ else:
+ rel_start = start
+ rel_path = relpath( symlink_target, rel_start )
+ os.remove( symlink_file_name )
+ os.symlink( rel_path, symlink_file_name )
+
def stringify_dictionary_keys( in_dict ):
#returns a new dictionary
#changes unicode keys into strings, only works on top level (does not recurse)
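A small standalone sketch of the path arithmetic behind relativize_symlinks(): for each symlink found below the given path, the absolute link target is rewritten relative to the directory containing the link (or to the optional start directory). The directory names here are made up:

import os

# Suppose /data/managed/hg19/hg19.fa is a symlink whose target is the absolute
# path /data/genomes/hg19/hg19.fa.  relativize_symlinks( '/data/managed/hg19' )
# replaces that target with the equivalent relative path:
new_target = os.path.relpath( '/data/genomes/hg19/hg19.fa', '/data/managed/hg19' )
# new_target == '../../genomes/hg19/hg19.fa'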
diff -r db08c095de0c249f3a6cde62f254c104de1fb6ea -r 138642a8dc55689a3cab1d070aac72b293265bc9 lib/galaxy/util/shed_util.py
--- a/lib/galaxy/util/shed_util.py
+++ b/lib/galaxy/util/shed_util.py
@@ -451,7 +451,7 @@
return elem_list
def generate_tool_panel_dict_for_new_install( tool_dicts, tool_section=None ):
"""
- When installing a repository that contains tools, all tools must curently be defined within the same tool section in the tool
+ When installing a repository that contains tools, all tools must currently be defined within the same tool section in the tool
panel or outside of any sections.
"""
tool_panel_dict = {}
@@ -1168,6 +1168,66 @@
return False
# Default to copying the file if none of the above are true.
return True
+def install_data_managers( app, shed_data_manager_conf_filename, metadata_dict, shed_config_dict, relative_install_dir, repository, repository_tools_tups ):
+ rval = []
+ if 'data_manager' in metadata_dict:
+ repository_tools_by_guid = {}
+ for tool_tup in repository_tools_tups:
+ repository_tools_by_guid[ tool_tup[1] ] = dict( tool_config_filename=tool_tup[0], tool=tool_tup[2] )
+ config_elems = [ elem for elem in util.parse_xml( shed_data_manager_conf_filename ).getroot() ] #load existing data managers
+
+ repo_data_manager_conf_filename = metadata_dict['data_manager'].get( 'config_filename', None )
+ if repo_data_manager_conf_filename is None:
+ log.debug( "No data_manager_conf.xml file has been defined." )
+ return rval
+ relative_repo_data_manager_dir = os.path.join( shed_config_dict.get( 'tool_path', '' ), relative_install_dir )
+ repo_data_manager_conf_filename = os.path.join( relative_repo_data_manager_dir, repo_data_manager_conf_filename )
+ tree = util.parse_xml( repo_data_manager_conf_filename )
+ root = tree.getroot()
+ for elem in root:
+ if elem.tag == 'data_manager':
+ data_manager_id = elem.get( 'id', None )
+ if data_manager_id is None:
+ log.error( "A data manager was defined that does not have an id and will not be installed:\n%s" % ( util.xml_to_string( elem ) ) )
+ continue
+ data_manager_dict = metadata_dict['data_manager'].get( 'data_managers', {} ).get( data_manager_id, None )
+ if data_manager_dict is None:
+ log.error( "Data manager metadata is not defined properly for '%s'." % ( data_manager_id ) )
+ continue
+
+ tool_guid = data_manager_dict.get( 'tool_guid', None )
+ if tool_guid is None:
+ log.error( "Data manager tool guid '%s' is not set in metadata for '%s'." % ( tool_guid, data_manager_id ) )
+ continue
+ tool_dict = repository_tools_by_guid.get( tool_guid, None )
+ if tool_dict is None:
+ log.error( "Data manager tool guid '%s' could not be found for '%s'. Perhaps the tool is invalid?" % ( tool_guid, data_manager_id ) )
+ continue
+ tool = tool_dict.get( 'tool', None )
+ if tool is None:
+ log.error( "Data manager tool with guid '%s' could not be found for '%s'. Perhaps the tool is invalid?" % ( tool_guid, data_manager_id ) )
+ continue
+ tool_config_filename = tool_dict.get( 'tool_config_filename', None )
+ if tool_config_filename is None:
+ log.error( "Data manager metadata is missing 'tool_config_file' for '%s'." % ( data_manager_id ) )
+ continue
+
+ elem.set( 'shed_conf_file', shed_config_dict['config_filename'] )
+ if elem.get( 'tool_file', None ) is not None:
+ del elem.attrib[ 'tool_file' ] #remove old tool_file info
+ tool_elem = suc.generate_tool_elem( repository.tool_shed, repository.name, repository.installed_changeset_revision,
+ repository.owner, tool_config_filename, tool, None )
+ elem.insert( 0, tool_elem )
+ data_manager = app.data_managers.load_manager_from_elem( elem, tool_path=shed_config_dict.get( 'tool_path', '' ) )
+ if data_manager:
+ rval.append( data_manager )
+ else:
+ log.warning( "Encountered unexpected element '%s':\n%s" % ( elem.tag, util.xml_to_string( elem ) ) )
+ config_elems.append( elem )
+ # Persist the altered shed_tool_config file.
+ suc.data_manager_config_elems_to_xml_file( app, config_elems, shed_data_manager_conf_filename )
+ return rval
+
def is_in_repo_info_dicts( repo_info_dict, repo_info_dicts ):
"""Return True if the received repo_info_dict is contained in the list of received repo_info_dicts."""
for name, repo_info_tuple in repo_info_dict.items():
@@ -1447,6 +1507,31 @@
def pull_repository( repo, repository_clone_url, ctx_rev ):
"""Pull changes from a remote repository to a local one."""
commands.pull( suc.get_configured_ui(), repo, source=repository_clone_url, rev=[ ctx_rev ] )
+def remove_from_data_manager( app, repository ):
+ metadata_dict = repository.metadata
+ if metadata_dict and 'data_manager' in metadata_dict:
+ data_manager_tool_guids = [ data_manager_dict.get( 'tool_guid' ) for data_manager_dict in metadata_dict.get( 'data_manager', {} ).get( 'data_managers', {} ).itervalues() if 'tool_guid' in data_manager_dict ]
+ shed_data_manager_conf_filename = app.config.shed_data_manager_config_file
+ tree = util.parse_xml( shed_data_manager_conf_filename )
+ root = tree.getroot()
+ assert root.tag == 'data_managers', 'The file provided (%s) for removing data managers from is not a valid data manager xml file.' % ( shed_data_manager_conf_filename )
+ config_elems = []
+ for elem in root:
+ keep_elem = True
+ if elem.tag == 'data_manager':
+ tool_elem = elem.find( 'tool' )
+ if tool_elem is not None:
+ tool_guid = tool_elem.get( 'guid', None )
+ if tool_guid in data_manager_tool_guids:
+ keep_elem = False
+ if keep_elem:
+ config_elems.append( elem )
+ #remove data manager from in memory
+ for data_manager_tool_guids in data_manager_tool_guids:
+ #for shed-based data managers, the data_manager id is the same as the tool guid
+ app.data_managers.remove_manager( data_manager_tool_guids )
+ # Persist the altered shed_tool_config file.
+ suc.data_manager_config_elems_to_xml_file( app, config_elems, shed_data_manager_conf_filename )
def remove_from_shed_tool_config( trans, shed_tool_conf_dict, guids_to_remove ):
# A tool shed repository is being uninstalled so change the shed_tool_conf file. Parse the config file to generate the entire list
# of config_elems instead of using the in-memory list since it will be a subset of the entire list if one or more repositories have
diff -r db08c095de0c249f3a6cde62f254c104de1fb6ea -r 138642a8dc55689a3cab1d070aac72b293265bc9 lib/galaxy/util/shed_util_common.py
--- a/lib/galaxy/util/shed_util_common.py
+++ b/lib/galaxy/util/shed_util_common.py
@@ -30,6 +30,7 @@
log = logging.getLogger( __name__ )
INITIAL_CHANGELOG_HASH = '000000000000'
+REPOSITORY_DATA_MANAGER_CONFIG_FILENAME = "data_manager_conf.xml"
# Characters that must be html escaped
MAPPED_CHARS = { '>' :'&gt;',
'<' :'&lt;',
'&' : '&amp;',
'\'' : '&#39;' }
'\'' : ''' }
MAX_CONTENT_SIZE = 32768
-NOT_TOOL_CONFIGS = [ 'datatypes_conf.xml', 'repository_dependencies.xml', 'tool_dependencies.xml' ]
+NOT_TOOL_CONFIGS = [ 'datatypes_conf.xml', 'repository_dependencies.xml', 'tool_dependencies.xml', REPOSITORY_DATA_MANAGER_CONFIG_FILENAME ]
GALAXY_ADMIN_TOOL_SHED_CONTROLLER = 'GALAXY_ADMIN_TOOL_SHED_CONTROLLER'
TOOL_SHED_ADMIN_CONTROLLER = 'TOOL_SHED_ADMIN_CONTROLLER'
+TOOL_TYPES_NOT_IN_TOOL_PANEL = [ 'manage_data' ]
VALID_CHARS = set( string.letters + string.digits + "'\"-=_.()/+*^,:?!#[]%\\$@;{}" )
new_repo_email_alert_template = """
@@ -981,6 +983,14 @@
repository_dependencies,
tool_dependencies )
return repo_info_dict
+def data_manager_config_elems_to_xml_file( app, config_elems, config_filename ):#, shed_tool_conf_filename ):
+ # Persist the current in-memory list of config_elems to a file named by the value of config_filename.
+ fh = open( config_filename, 'wb' )
+ fh.write( '<?xml version="1.0"?>\n<data_managers>\n' )#% ( shed_tool_conf_filename ))
+ for elem in config_elems:
+ fh.write( util.xml_to_string( elem, pretty=True ) )
+ fh.write( '</data_managers>\n' )
+ fh.close()
def ensure_required_repositories_exist_for_reinstall( trans, repository_dependencies ):
"""
Inspect the received repository_dependencies dictionary and make sure tool_shed_repository objects exist in the database for each entry. These
@@ -1008,6 +1018,64 @@
return '%s://%s%s/repos/%s/%s' % ( protocol, username, base, repository.user.username, repository.name )
else:
return '%s/repos/%s/%s' % ( base_url, repository.user.username, repository.name )
+def generate_data_manager_metadata( app, repository, repo_dir, data_manager_config_filename, metadata_dict, shed_config_dict=None ):
+ """Update the received metadata_dict with information from the parsed data_manager_config_filename."""
+ if data_manager_config_filename is None:
+ return metadata_dict
+ try:
+ tree = util.parse_xml( data_manager_config_filename )
+ except Exception, e:
+ log.error( 'There was an error parsing your Data Manager config file "%s": %s' % ( data_manager_config_filename, e ) )
+ return metadata_dict #we are not able to load any data managers
+ tool_path = None
+ if shed_config_dict:
+ tool_path = shed_config_dict.get( 'tool_path', None )
+ tools = {}
+ for tool in metadata_dict.get( 'tools', [] ):
+ tool_conf_name = tool['tool_config']
+ if tool_path:
+ tool_conf_name = os.path.join( tool_path, tool_conf_name )
+ tools[tool_conf_name] = tool
+ repo_path = repository.repo_path( app )
+ try:
+ repo_files_directory = repository.repo_files_directory( app )
+ repo_dir = repo_files_directory
+ except AttributeError:
+ repo_files_directory = repo_path
+ relative_data_manager_dir = util.relpath( os.path.split( data_manager_config_filename )[0], repo_dir )
+ rel_data_manager_config_filename = os.path.join( relative_data_manager_dir, os.path.split( data_manager_config_filename )[1] )
+ data_managers = {}
+ data_manager_metadata = { 'config_filename': rel_data_manager_config_filename, 'data_managers': data_managers }#'tool_config_files': tool_files }
+ metadata_dict[ 'data_manager' ] = data_manager_metadata
+ root = tree.getroot()
+ data_manager_tool_path = root.get( 'tool_path', None )
+ if data_manager_tool_path:
+ relative_data_manager_dir = os.path.join( relative_data_manager_dir, data_manager_tool_path )
+ for data_manager_elem in root.findall( 'data_manager' ):
+ tool_file = data_manager_elem.get( 'tool_file', None )
+ data_manager_id = data_manager_elem.get( 'id', None )
+ if data_manager_id is None:
+ log.error( 'Data Manager entry is missing id attribute in "%s".' % ( data_manager_config_filename ) )
+ continue
+ data_tables = []
+ if tool_file is None:
+ log.error( 'Data Manager entry is missing tool_file attribute in "%s".' % ( data_manager_config_filename ) )
+ else:
+ for data_table_elem in data_manager_elem.findall( 'data_table' ):
+ data_table_name = data_table_elem.get( 'name', None )
+ if data_table_name is None:
+ log.error( 'Data Manager data_table entry is missing name attribute in "%s".' % ( data_manager_config_filename ) )
+ else:
+ data_tables.append( data_table_name )
+ data_manager_metadata_tool_file = os.path.join( relative_data_manager_dir, tool_file )
+ tool_metadata_tool_file = os.path.join( repo_files_directory, data_manager_metadata_tool_file )
+ tool = tools.get( tool_metadata_tool_file, None )
+ if tool is None:
+ log.error( "Unable to determine tools metadata for '%s'." % ( data_manager_metadata_tool_file ) )
+ continue
+ data_managers[ data_manager_id ] = { 'tool_config_file': data_manager_metadata_tool_file, 'data_tables': data_tables, 'tool_guid': tool['guid'] }
+ log.debug( 'Loaded Data Manager tool_files: %s' % ( tool_file ) )
+ return metadata_dict
def generate_datatypes_metadata( datatypes_config, metadata_dict ):
"""Update the received metadata_dict with information from the parsed datatypes_config."""
tree = ElementTree.parse( datatypes_config )
@@ -1252,6 +1320,9 @@
exported_workflow_dict = json.from_json_string( workflow_text )
if 'a_galaxy_workflow' in exported_workflow_dict and exported_workflow_dict[ 'a_galaxy_workflow' ] == 'true':
metadata_dict = generate_workflow_metadata( relative_path, exported_workflow_dict, metadata_dict )
+ # Handle any data manager entries
+ metadata_dict = generate_data_manager_metadata( app, repository, files_dir, get_config_from_disk( REPOSITORY_DATA_MANAGER_CONFIG_FILENAME, files_dir ), metadata_dict, shed_config_dict=shed_config_dict )
+
if readme_files:
metadata_dict[ 'readme_files' ] = readme_files
# This step must be done after metadata for tools has been defined.
@@ -1531,6 +1602,7 @@
description=tool.description,
version_string_cmd = tool.version_string_cmd,
tool_config=tool_config,
+ tool_type=tool.tool_type,
requirements=tool_requirements,
tests=tool_tests )
if 'tools' in metadata_dict:
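For orientation, generate_data_manager_metadata() above walks a repository-level data_manager_conf.xml whose root <data_managers> element may carry an optional tool_path attribute and contains one <data_manager> element per entry; each entry needs an id attribute and a tool_file attribute and may list any number of <data_table name="..."/> children. A minimal sketch of such a file (the id, tool_file and table name are hypothetical placeholders, not contents of this commit):

<?xml version="1.0"?>
<data_managers>
    <data_manager id="example_fetch_reference" tool_file="data_manager/example_fetch_reference.xml">
        <data_table name="all_fasta" />
    </data_manager>
</data_managers>

Entries without an id are skipped with a logged error, a missing tool_file or data_table name is likewise logged, and an entry whose tool cannot be matched against the repository's tool metadata is dropped from the resulting 'data_manager' metadata.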
diff -r db08c095de0c249f3a6cde62f254c104de1fb6ea -r 138642a8dc55689a3cab1d070aac72b293265bc9 lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py
--- a/lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py
+++ b/lib/galaxy/webapps/galaxy/controllers/admin_toolshed.py
@@ -507,6 +507,8 @@
if tool_shed_repository.includes_tools:
# Handle tool panel alterations.
shed_util.remove_from_tool_panel( trans, tool_shed_repository, shed_tool_conf, uninstall=remove_from_disk_checked )
+ if tool_shed_repository.includes_data_managers:
+ shed_util.remove_from_data_manager( trans.app, tool_shed_repository )
if tool_shed_repository.includes_datatypes:
# Deactivate proprietary datatypes.
installed_repository_dict = shed_util.load_installed_datatypes( trans.app, tool_shed_repository, repository_install_dir, deactivate=True )
@@ -701,6 +703,9 @@
shed_tool_conf=shed_tool_conf,
tool_panel_dict=tool_panel_dict,
new_install=True )
+ if 'data_manager' in metadata_dict:
+ rval = shed_util.install_data_managers( trans.app, trans.app.config.shed_data_manager_config_file, metadata_dict, shed_config_dict, relative_install_dir,
+ tool_shed_repository, repository_tools_tups )
if 'datatypes' in metadata_dict:
tool_shed_repository.status = trans.model.ToolShedRepository.installation_status.LOADING_PROPRIETARY_DATATYPES
if not tool_shed_repository.includes_datatypes:
diff -r db08c095de0c249f3a6cde62f254c104de1fb6ea -r 138642a8dc55689a3cab1d070aac72b293265bc9 lib/galaxy/webapps/galaxy/controllers/tool_runner.py
--- a/lib/galaxy/webapps/galaxy/controllers/tool_runner.py
+++ b/lib/galaxy/webapps/galaxy/controllers/tool_runner.py
@@ -89,7 +89,7 @@
tool.input_translator.translate( params )
# We may be visiting Galaxy for the first time ( e.g., sending data from UCSC ),
# so make sure to create a new history if we've never had one before.
- history = trans.get_history( create=True )
+ history = tool.get_default_history_by_trans( trans, create=True )
template, vars = tool.handle_input( trans, params.__dict__ )
if len( params ) > 0:
trans.log_event( "Tool params: %s" % ( str( params ) ), tool_id=tool_id )
diff -r db08c095de0c249f3a6cde62f254c104de1fb6ea -r 138642a8dc55689a3cab1d070aac72b293265bc9 run.sh
--- a/run.sh
+++ b/run.sh
@@ -16,6 +16,8 @@
shed_tool_data_table_conf.xml.sample
tool_data_table_conf.xml.sample
tool_sheds_conf.xml.sample
+ data_manager_conf.xml.sample
+ shed_data_manager_conf.xml.sample
openid_conf.xml.sample
universe_wsgi.ini.sample
tool-data/shared/ncbi/builds.txt.sample
diff -r db08c095de0c249f3a6cde62f254c104de1fb6ea -r 138642a8dc55689a3cab1d070aac72b293265bc9 shed_data_manager_conf.xml.sample
--- /dev/null
+++ b/shed_data_manager_conf.xml.sample
@@ -0,0 +1,3 @@
+<?xml version="1.0"?>
+<data_managers>
+</data_managers>
diff -r db08c095de0c249f3a6cde62f254c104de1fb6ea -r 138642a8dc55689a3cab1d070aac72b293265bc9 templates/webapps/galaxy/admin/index.mako
--- a/templates/webapps/galaxy/admin/index.mako
+++ b/templates/webapps/galaxy/admin/index.mako
@@ -59,6 +59,7 @@
%if trans.app.config.enable_beta_job_managers:
<div class="toolTitle"><a href="${h.url_for( controller='data_admin', action='manage_data' )}" target="galaxy_main">Manage local data</a></div>
%endif
+ <div class="toolTitle"><a href="${h.url_for( controller='data_manager' )}" target="galaxy_main">Manage local data (beta)</a></div>
</div>
</div>
<div class="toolSectionPad"></div>
diff -r db08c095de0c249f3a6cde62f254c104de1fb6ea -r 138642a8dc55689a3cab1d070aac72b293265bc9 templates/webapps/galaxy/data_manager/index.mako
--- /dev/null
+++ b/templates/webapps/galaxy/data_manager/index.mako
@@ -0,0 +1,57 @@
+<%inherit file="/base.mako"/>
+<%namespace file="/message.mako" import="render_msg" />
+
+<%def name="title()">Data Manager</%def>
+
+%if message:
+ ${render_msg( message, status )}
+%endif
+
+<h2>Data Manager</h2>
+
+%if view_only:
+ <p>Not implemented</p>
+%else:
+ <p>Choose your data managing option from below.</p>
+ <ul>
+ <li><strong>Access data managers</strong> - get data, build indexes, etc
+ <p/>
+ <ul>
+ %for data_manager_id, data_manager in data_managers.data_managers.iteritems():
+ <li>
+ <a href="${ h.url_for( 'tool_runner?tool_id=%s' % ( data_manager.tool.id ) ) }"><strong>${ data_manager.name | h }</strong></a> - ${ data_manager.description | h }
+ </li>
+ <p/>
+ %endfor
+ </ul>
+ </li>
+ <p/>
+ <li><strong>View managed data by manager</strong>
+ <p/>
+ <ul>
+ %for data_manager_id, data_manager in data_managers.data_managers.iteritems():
+ <li>
+ <a href="${h.url_for( controller='data_manager', action='manage_data_manager', id=data_manager_id)}" target="galaxy_main"><strong>${ data_manager.name | h }</strong></a> - ${ data_manager.description | h }
+ </li>
+ <p/>
+ %endfor
+ </ul>
+ </li>
+ <p/>
+ <p/>
+ <li><strong>View managed data by Tool Data Table</strong>
+ <p/>
+ <ul>
+ %for table_name, managers in data_managers.managed_data_tables.iteritems():
+ <li>
+ <a href="${h.url_for( controller='data_manager', action='manage_data_table', table_name=table_name)}" target="galaxy_main"><strong>${ table_name | h }</strong></a>
+ </li>
+ <p/>
+ %endfor
+ </ul>
+ </li>
+ <p/>
+ </ul>
+ <p/>
+ <br/>
+%endif
diff -r db08c095de0c249f3a6cde62f254c104de1fb6ea -r 138642a8dc55689a3cab1d070aac72b293265bc9 templates/webapps/galaxy/data_manager/manage_data_manager.mako
--- /dev/null
+++ b/templates/webapps/galaxy/data_manager/manage_data_manager.mako
@@ -0,0 +1,50 @@
+<%inherit file="/base.mako"/>
+<%namespace file="/message.mako" import="render_msg" />
+
+<%def name="title()">Data Manager: ${ data_manager.name | h } - ${ data_manager.description | h }</%def>
+
+%if message:
+ ${render_msg( message, status )}
+%endif
+
+<h2>Data Manager: ${ data_manager.name | h } - ${ data_manager.description | h }</h2>
+
+%if view_only:
+ <p>Not implemented</p>
+%else:
+ <p>Access managed data by job</p>
+
+%if jobs:
+<form name="jobs" action="${h.url_for()}" method="POST">
+ <table class="manage-table colored" border="0" cellspacing="0" cellpadding="0" width="100%">
+ <tr class="header">
+ <td>Job ID</td>
+ <td>User</td>
+ <td>Last Update</td>
+ <td>State</td>
+ <td>Command Line</td>
+ <td>Job Runner</td>
+ <td>PID/Cluster ID</td>
+ </tr>
%for job in jobs:
<tr>
<td><a href="${ h.url_for( controller="data_manager", action="view_job", id=trans.security.encode_id( job.id ) ) }">${ job.id | h }</a></td>
+ %if job.history and job.history.user:
+ <td>${job.history.user.email | h}</td>
+ %else:
+ <td>anonymous</td>
+ %endif
+ <td>${job.update_time | h}</td>
+ <td>${job.state | h}</td>
+ <td>${job.command_line | h}</td>
+ <td>${job.job_runner_name | h}</td>
+ <td>${job.job_runner_external_id | h}</td>
+ </tr>
+ %endfor
+ </table>
+ <p/>
+</form>
+%else:
+ <div class="infomessage">There are no jobs for this data manager.</div>
+%endif
+
+%endif
diff -r db08c095de0c249f3a6cde62f254c104de1fb6ea -r 138642a8dc55689a3cab1d070aac72b293265bc9 templates/webapps/galaxy/data_manager/manage_data_table.mako
--- /dev/null
+++ b/templates/webapps/galaxy/data_manager/manage_data_table.mako
@@ -0,0 +1,36 @@
+<%inherit file="/base.mako"/>
+<%namespace file="/message.mako" import="render_msg" />
+
+<%def name="title()">Data Table Manager: ${ data_table.name | h }</%def>
+
+%if message:
+ ${render_msg( message, status )}
+%endif
+
+%if view_only:
+ <p>Not implemented</p>
+%else:
+<% column_name_list = data_table.get_column_name_list() %>
+<table class="tabletip">
+ <thead>
+ <tr><th colspan="${len (column_name_list) }" style="font-size: 120%;">
+ Data Table Manager: ${ data_table.name | h }
+ </th></tr>
+ <tr>
+
+ %for name in column_name_list:
+ <th>${name | h}</th>
+ %endfor
+ </tr>
+ </thead>
+ <tbody>
+ %for table_row in data_table.data:
+ <tr>
+ %for field in table_row:
+ <td>${field | h}</td>
+ %endfor
+ </tr>
+ %endfor
+ </tbody>
+</table>
+
+%endif
diff -r db08c095de0c249f3a6cde62f254c104de1fb6ea -r 138642a8dc55689a3cab1d070aac72b293265bc9 templates/webapps/galaxy/data_manager/view_job.mako
--- /dev/null
+++ b/templates/webapps/galaxy/data_manager/view_job.mako
@@ -0,0 +1,57 @@
+<%inherit file="/base.mako"/>
+<%namespace file="/message.mako" import="render_msg" />
+<% from galaxy.util import nice_size %>
+
+<%def name="title()">Data Manager: ${ data_manager.name | h } - ${ data_manager.description | h }</%def>
+
+%if message:
+ ${render_msg( message, status )}
+%endif
+
+%if view_only:
+ <p>Not implemented</p>
+%else:
+%for i, hda in enumerate( hdas ):
+<table class="tabletip">
+ <thead>
+ <tr><th colspan="2" style="font-size: 120%;">
+ Data Manager: ${ data_manager.name | h } - ${ data_manager.description | h }
+ </th></tr>
+ </thead>
+ <tbody>
+ <tr><td>Name:</td><td>${hda.name | h}</td></tr>
+ <tr><td>Created:</td><td>${hda.create_time.strftime("%b %d, %Y")}</td></tr>
+ <tr><td>Filesize:</td><td>${nice_size(hda.dataset.file_size)}</td></tr>
+ <tr><td>Tool Exit Code:</td><td>${job.exit_code | h}</td></tr>
+ <tr><td>Full Path:</td><td>${hda.file_name | h}</td></tr>
+ <tr><td>View complete info:</td><td><a href="${h.url_for( controller='dataset', action='show_params', dataset_id=trans.security.encode_id( hda.id ))}">${ hda.id | h }</a></td></tr>
+
+ </tbody>
+</table>
+<br />
+
+<% json_tables = data_manager_output[i]%>
+%for table_name, json_table in json_tables:
+<table class="tabletip">
+ <thead>
+ <tr><th colspan="2" style="font-size: 120%;">
+ Data Table: ${ table_name | h }
+ </th></tr>
+ </thead>
+ <% len_json_table = len( json_table ) %>
+ <tbody>
%for j, table_row in enumerate( json_table ):
+ %if len_json_table > 1:
+ <tr><td><strong>Entry #${j}</strong></td><td></td></tr>
+ %endif
+ %for name, value in table_row.iteritems():
+ <tr><td>${name | h}:</td><td>${value | h}</td></tr>
+ %endfor
+ %endfor
+ </tbody>
+</table>
+<br />
+%endfor
+
+%endfor
+
+%endif
diff -r db08c095de0c249f3a6cde62f254c104de1fb6ea -r 138642a8dc55689a3cab1d070aac72b293265bc9 universe_wsgi.ini.sample
--- a/universe_wsgi.ini.sample
+++ b/universe_wsgi.ini.sample
@@ -573,6 +573,16 @@
# Details" option in the history. Administrators can always see this.
#expose_dataset_path = False
+# Data manager configuration options
+# Allow non-admin users to view available Data Manager options
+#enable_data_manager_user_view = False
+# File where Data Managers are configured
+#data_manager_config_file = data_manager_conf.xml
+# File where Tool Shed based Data Managers are configured
+#shed_data_manager_config_file = shed_data_manager_conf.xml
+# Directory to store Data Manager based tool-data; defaults to tool_data_path
+#galaxy_data_manager_data_path = tool-data
+
# -- Job Execution
# To increase performance of job execution and the web interface, you can
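To opt in, an administrator would uncomment these options in universe_wsgi.ini. The file names below simply restate the documented defaults from the sample, and enable_data_manager_user_view is flipped to True purely for illustration (nothing in this commit changes the defaults):

enable_data_manager_user_view = True
data_manager_config_file = data_manager_conf.xml
shed_data_manager_config_file = shed_data_manager_conf.xml
galaxy_data_manager_data_path = tool-data

If galaxy_data_manager_data_path is left unset, Data Manager output falls back to tool_data_path, as the sample comment above notes.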
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: jgoecks: Grid framework fixes, including extension of sanitize_text to work with lists of text.
by Bitbucket 14 Feb '13
14 Feb '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/db08c095de0c/
changeset: db08c095de0c
user: jgoecks
date: 2013-02-14 19:48:52
summary: Grid framework fixes, including extension of sanitize_text to work with lists of text.
affected #: 3 files
diff -r 3ffe47cea647ae61a02d74bbff68e0e07bff70e4 -r db08c095de0c249f3a6cde62f254c104de1fb6ea lib/galaxy/util/__init__.py
--- a/lib/galaxy/util/__init__.py
+++ b/lib/galaxy/util/__init__.py
@@ -196,7 +196,18 @@
return text
def sanitize_text(text):
- """Restricts the characters that are allowed in a text"""
+ """
+ Restricts the characters that are allowed in text; accepts both strings
+ and lists of strings.
+ """
+ if isinstance( text, basestring ):
+ return _sanitize_text_helper(text)
+ elif isinstance( text, list ):
+ return [ _sanitize_text_helper(t) for t in text ]
+
+def _sanitize_text_helper(text):
+ """Restricts the characters that are allowed in a string"""
+
out = []
for c in text:
if c in valid_chars:
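The new list handling is easiest to see in isolation. A minimal standalone sketch of the same string-or-list dispatch (not Galaxy's code; the whitelist and the 'X' replacement character are assumptions made only for this illustration):

import string

# Assumed whitelist of allowed characters; Galaxy defines its own valid_chars set.
VALID_CHARS = set(string.ascii_letters + string.digits + " -=_.()/+*^,:?!")

def _sanitize_one(text):
    # Keep whitelisted characters, replace anything else (replacement char is assumed).
    return ''.join(c if c in VALID_CHARS else 'X' for c in text)

def sanitize_text(text):
    # Strings are sanitized directly; lists are sanitized element-wise.
    # Other types fall through to None, mirroring the diff above.
    if isinstance(text, str):
        return _sanitize_one(text)
    if isinstance(text, list):
        return [_sanitize_one(t) for t in text]

print(sanitize_text("a<b>&c"))          # -> aXbXXc
print(sanitize_text(["x&y", "plain"]))  # -> ['xXy', 'plain']

With this in place, grid filter values that arrive as lists are sanitized element-wise and come back as a list of strings, which matches the commit summary's note about the grid framework passing lists of text.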
diff -r 3ffe47cea647ae61a02d74bbff68e0e07bff70e4 -r db08c095de0c249f3a6cde62f254c104de1fb6ea static/scripts/galaxy.grids.js
--- a/static/scripts/galaxy.grids.js
+++ b/static/scripts/galaxy.grids.js
@@ -48,7 +48,7 @@
// Update URL arg with new condition.
if (append) {
// Update or append value.
- var cur_val = this.attributes.key,
+ var cur_val = this.attributes.filters[key],
new_val;
if (cur_val === null || cur_val === undefined) {
new_val = value;
diff -r 3ffe47cea647ae61a02d74bbff68e0e07bff70e4 -r db08c095de0c249f3a6cde62f254c104de1fb6ea static/scripts/packed/galaxy.grids.js
--- a/static/scripts/packed/galaxy.grids.js
+++ b/static/scripts/packed/galaxy.grids.js
@@ -1,1 +1,1 @@
-jQuery.ajaxSettings.traditional=true;$(document).ready(function(){init_grid_elements();init_grid_controls();$("input[type=text]").each(function(){$(this).click(function(){$(this).select()}).keyup(function(){$(this).css("font-style","normal")})})});var Grid=Backbone.Model.extend({defaults:{url_base:"",async:false,async_ops:[],categorical_filters:[],filters:{},sort_key:null,show_item_checkboxes:false,cur_page:1,num_pages:1,operation:undefined,item_ids:undefined},can_async_op:function(a){return _.indexOf(this.attributes.async_ops,a)!==-1},add_filter:function(e,f,b){if(b){var c=this.attributes.key,a;if(c===null||c===undefined){a=f}else{if(typeof(c)=="string"){if(c=="All"){a=f}else{var d=[];d[0]=c;d[1]=f;a=d}}else{a=c;a.push(f)}}this.attributes.filters[e]=a}else{this.attributes.filters[e]=f}},remove_filter:function(b,e){var a=this.attributes.filters[b];if(a===null||a===undefined){return false}var d=true;if(typeof(a)==="string"){if(a=="All"){d=false}else{delete this.attributes.filters[b]}}else{var c=_.indexOf(a,e);if(c!==-1){a.splice(c,1)}else{d=false}}return d},get_url_data:function(){var a={async:this.attributes.async,sort:this.attributes.sort_key,page:this.attributes.cur_page,show_item_checkboxes:this.attributes.show_item_checkboxes};if(this.attributes.operation){a.operation=this.attributes.operation}if(this.attributes.item_ids){a.id=this.attributes.item_ids}var b=this;_.each(_.keys(b.attributes.filters),function(c){a["f-"+c]=b.attributes.filters[c]});return a}});function init_operation_buttons(){$("input[name=operation]:submit").each(function(){$(this).click(function(){var b=$(this).val();var a=[];$("input[name=id]:checked").each(function(){a.push($(this).val())});do_operation(b,a)})})}function init_grid_controls(){init_operation_buttons();$(".submit-image").each(function(){$(this).mousedown(function(){$(this).addClass("gray-background")});$(this).mouseup(function(){$(this).removeClass("gray-background")})});$(".sort-link").each(function(){$(this).click(function(){set_sort_condition($(this).attr("sort_key"));return false})});$(".page-link > a").each(function(){$(this).click(function(){set_page($(this).attr("page_num"));return false})});$(".categorical-filter > a").each(function(){$(this).click(function(){set_categorical_filter($(this).attr("filter_key"),$(this).attr("filter_val"));return false})});$(".text-filter-form").each(function(){$(this).submit(function(){var d=$(this).attr("column_key");var c=$("#input-"+d+"-filter");var e=c.val();c.val("");add_filter_condition(d,e,true);return false})});var a=$("#input-tags-filter");if(a.length){a.autocomplete(history_tag_autocomplete_url,{selectFirst:false,autoFill:false,highlight:false,mustMatch:false})}var b=$("#input-name-filter");if(b.length){b.autocomplete(history_name_autocomplete_url,{selectFirst:false,autoFill:false,highlight:false,mustMatch:false})}$(".advanced-search-toggle").each(function(){$(this).click(function(){$("#standard-search").slideToggle("fast");$("#advanced-search").slideToggle("fast");return false})})}function init_grid_elements(){$(".grid").each(function(){var b=$(this).find("input.grid-row-select-checkbox");var a=$(this).find("span.grid-selected-count");var c=function(){a.text($(b).filter(":checked").length)};$(b).each(function(){$(this).change(c)});c()});$(".label").each(function(){var a=$(this).attr("href");if(a!==undefined&&a.indexOf("operation=")!=-1){$(this).click(function(){do_operation_from_href($(this).attr("href"));return false})}});$(".community_rating_star").rating({});make_popup_menus()}function 
go_page_one(){var a=grid.get("cur_page");if(a!==null&&a!==undefined&&a!=="all"){grid.set("cur_page",1)}}function add_filter_condition(c,e,a){if(e===""){return false}grid.add_filter(c,e,a);var d=$("<span>"+e+"<a href='javascript:void(0);'><span class='delete-search-icon' /></a></span>");d.addClass("text-filter-val");d.click(function(){grid.remove_filter(c,e);$(this).remove();go_page_one();update_grid()});var b=$("#"+c+"-filtering-criteria");b.append(d);go_page_one();update_grid()}function add_tag_to_grid_filter(c,b){var a=c+(b!==undefined&&b!==""?":"+b:"");$("#advanced-search").show("fast");add_filter_condition("tags",a,true)}function set_sort_condition(f){var e=grid.get("sort_key");var d=f;if(e.indexOf(f)!==-1){if(e.substring(0,1)!=="-"){d="-"+f}else{}}$(".sort-arrow").remove();var c=(d.substring(0,1)=="-")?"↑":"↓";var a=$("<span>"+c+"</span>").addClass("sort-arrow");var b=$("#"+f+"-header");b.append(a);grid.set("sort_key",d);go_page_one();update_grid()}function set_categorical_filter(b,d){var a=grid.get("categorical_filters")[b],c=grid.get("filters")[b];$("."+b+"-filter").each(function(){var h=$.trim($(this).text());var f=a[h];var g=f[b];if(g==d){$(this).empty();$(this).addClass("current-filter");$(this).append(h)}else{if(g==c){$(this).empty();var e=$("<a href='#'>"+h+"</a>");e.click(function(){set_categorical_filter(b,g)});$(this).removeClass("current-filter");$(this).append(e)}}});grid.add_filter(b,d);go_page_one();update_grid()}function set_page(a){$(".page-link").each(function(){var g=$(this).attr("id"),e=parseInt(g.split("-")[2],10),c=grid.get("cur_page"),f;if(e===a){f=$(this).children().text();$(this).empty();$(this).addClass("inactive-link");$(this).text(f)}else{if(e===c){f=$(this).text();$(this).empty();$(this).removeClass("inactive-link");var d=$("<a href='#'>"+f+"</a>");d.click(function(){set_page(e)});$(this).append(d)}}});var b=true;if(a==="all"){grid.set("cur_page",a);b=false}else{grid.set("cur_page",parseInt(a,10))}update_grid(b)}function do_operation(b,a){b=b.toLowerCase();grid.set({operation:b,item_ids:a});if(grid.can_async_op(b)){update_grid(true)}else{go_to_URL()}}function do_operation_from_href(c){var f=c.split("?");if(f.length>1){var a=f[1];var e=a.split("&");var b=null;var g=-1;for(var d=0;d<e.length;d++){if(e[d].indexOf("operation")!=-1){b=e[d].split("=")[1]}else{if(e[d].indexOf("id")!=-1){g=e[d].split("=")[1]}}}do_operation(b,g);return false}}function go_to_URL(){grid.set("async",false);window.location=grid.get("url_base")+"?"+$.param(grid.get_url_data())}function update_grid(a){if(!grid.get("async")){go_to_URL();return}var b=(grid.get("operation")?"POST":"GET");$(".loading-elt-overlay").show();$.ajax({type:b,url:grid.get("url_base"),data:grid.get_url_data(),error:function(){alert("Grid refresh failed")},success:function(d){var c=d.split("*****");$("#grid-table-body").html(c[0]);$("#grid-table-footer").html(c[1]);$("#grid-table-body").trigger("update");init_grid_elements();init_operation_buttons();make_popup_menus();$(".loading-elt-overlay").hide();var e=$.trim(c[2]);if(e!==""){$("#grid-message").html(e).show();setTimeout(function(){$("#grid-message").hide()},5000)}},complete:function(){grid.set({operation:undefined,item_ids:undefined})}})}function check_all_items(){var a=document.getElementById("check_all"),b=document.getElementsByTagName("input"),d=0,c;if(a.checked===true){for(c=0;c<b.length;c++){if(b[c].name.indexOf("id")!==-1){b[c].checked=true;d++}}}else{for(c=0;c<b.length;c++){if(b[c].name.indexOf("id")!==-1){b[c].checked=false}}}init_grid_elements()};
\ No newline at end of file
+jQuery.ajaxSettings.traditional=true;$(document).ready(function(){init_grid_elements();init_grid_controls();$("input[type=text]").each(function(){$(this).click(function(){$(this).select()}).keyup(function(){$(this).css("font-style","normal")})})});var Grid=Backbone.Model.extend({defaults:{url_base:"",async:false,async_ops:[],categorical_filters:[],filters:{},sort_key:null,show_item_checkboxes:false,cur_page:1,num_pages:1,operation:undefined,item_ids:undefined},can_async_op:function(a){return _.indexOf(this.attributes.async_ops,a)!==-1},add_filter:function(e,f,b){if(b){var c=this.attributes.filters[e],a;if(c===null||c===undefined){a=f}else{if(typeof(c)=="string"){if(c=="All"){a=f}else{var d=[];d[0]=c;d[1]=f;a=d}}else{a=c;a.push(f)}}this.attributes.filters[e]=a}else{this.attributes.filters[e]=f}},remove_filter:function(b,e){var a=this.attributes.filters[b];if(a===null||a===undefined){return false}var d=true;if(typeof(a)==="string"){if(a=="All"){d=false}else{delete this.attributes.filters[b]}}else{var c=_.indexOf(a,e);if(c!==-1){a.splice(c,1)}else{d=false}}return d},get_url_data:function(){var a={async:this.attributes.async,sort:this.attributes.sort_key,page:this.attributes.cur_page,show_item_checkboxes:this.attributes.show_item_checkboxes};if(this.attributes.operation){a.operation=this.attributes.operation}if(this.attributes.item_ids){a.id=this.attributes.item_ids}var b=this;_.each(_.keys(b.attributes.filters),function(c){a["f-"+c]=b.attributes.filters[c]});return a}});function init_operation_buttons(){$("input[name=operation]:submit").each(function(){$(this).click(function(){var b=$(this).val();var a=[];$("input[name=id]:checked").each(function(){a.push($(this).val())});do_operation(b,a)})})}function init_grid_controls(){init_operation_buttons();$(".submit-image").each(function(){$(this).mousedown(function(){$(this).addClass("gray-background")});$(this).mouseup(function(){$(this).removeClass("gray-background")})});$(".sort-link").each(function(){$(this).click(function(){set_sort_condition($(this).attr("sort_key"));return false})});$(".page-link > a").each(function(){$(this).click(function(){set_page($(this).attr("page_num"));return false})});$(".categorical-filter > a").each(function(){$(this).click(function(){set_categorical_filter($(this).attr("filter_key"),$(this).attr("filter_val"));return false})});$(".text-filter-form").each(function(){$(this).submit(function(){var d=$(this).attr("column_key");var c=$("#input-"+d+"-filter");var e=c.val();c.val("");add_filter_condition(d,e,true);return false})});var a=$("#input-tags-filter");if(a.length){a.autocomplete(history_tag_autocomplete_url,{selectFirst:false,autoFill:false,highlight:false,mustMatch:false})}var b=$("#input-name-filter");if(b.length){b.autocomplete(history_name_autocomplete_url,{selectFirst:false,autoFill:false,highlight:false,mustMatch:false})}$(".advanced-search-toggle").each(function(){$(this).click(function(){$("#standard-search").slideToggle("fast");$("#advanced-search").slideToggle("fast");return false})})}function init_grid_elements(){$(".grid").each(function(){var b=$(this).find("input.grid-row-select-checkbox");var a=$(this).find("span.grid-selected-count");var c=function(){a.text($(b).filter(":checked").length)};$(b).each(function(){$(this).change(c)});c()});$(".label").each(function(){var a=$(this).attr("href");if(a!==undefined&&a.indexOf("operation=")!=-1){$(this).click(function(){do_operation_from_href($(this).attr("href"));return false})}});$(".community_rating_star").rating({});make_popup_menus()}function 
go_page_one(){var a=grid.get("cur_page");if(a!==null&&a!==undefined&&a!=="all"){grid.set("cur_page",1)}}function add_filter_condition(c,e,a){if(e===""){return false}grid.add_filter(c,e,a);var d=$("<span>"+e+"<a href='javascript:void(0);'><span class='delete-search-icon' /></a></span>");d.addClass("text-filter-val");d.click(function(){grid.remove_filter(c,e);$(this).remove();go_page_one();update_grid()});var b=$("#"+c+"-filtering-criteria");b.append(d);go_page_one();update_grid()}function add_tag_to_grid_filter(c,b){var a=c+(b!==undefined&&b!==""?":"+b:"");$("#advanced-search").show("fast");add_filter_condition("tags",a,true)}function set_sort_condition(f){var e=grid.get("sort_key");var d=f;if(e.indexOf(f)!==-1){if(e.substring(0,1)!=="-"){d="-"+f}else{}}$(".sort-arrow").remove();var c=(d.substring(0,1)=="-")?"↑":"↓";var a=$("<span>"+c+"</span>").addClass("sort-arrow");var b=$("#"+f+"-header");b.append(a);grid.set("sort_key",d);go_page_one();update_grid()}function set_categorical_filter(b,d){var a=grid.get("categorical_filters")[b],c=grid.get("filters")[b];$("."+b+"-filter").each(function(){var h=$.trim($(this).text());var f=a[h];var g=f[b];if(g==d){$(this).empty();$(this).addClass("current-filter");$(this).append(h)}else{if(g==c){$(this).empty();var e=$("<a href='#'>"+h+"</a>");e.click(function(){set_categorical_filter(b,g)});$(this).removeClass("current-filter");$(this).append(e)}}});grid.add_filter(b,d);go_page_one();update_grid()}function set_page(a){$(".page-link").each(function(){var g=$(this).attr("id"),e=parseInt(g.split("-")[2],10),c=grid.get("cur_page"),f;if(e===a){f=$(this).children().text();$(this).empty();$(this).addClass("inactive-link");$(this).text(f)}else{if(e===c){f=$(this).text();$(this).empty();$(this).removeClass("inactive-link");var d=$("<a href='#'>"+f+"</a>");d.click(function(){set_page(e)});$(this).append(d)}}});var b=true;if(a==="all"){grid.set("cur_page",a);b=false}else{grid.set("cur_page",parseInt(a,10))}update_grid(b)}function do_operation(b,a){b=b.toLowerCase();grid.set({operation:b,item_ids:a});if(grid.can_async_op(b)){update_grid(true)}else{go_to_URL()}}function do_operation_from_href(c){var f=c.split("?");if(f.length>1){var a=f[1];var e=a.split("&");var b=null;var g=-1;for(var d=0;d<e.length;d++){if(e[d].indexOf("operation")!=-1){b=e[d].split("=")[1]}else{if(e[d].indexOf("id")!=-1){g=e[d].split("=")[1]}}}do_operation(b,g);return false}}function go_to_URL(){grid.set("async",false);window.location=grid.get("url_base")+"?"+$.param(grid.get_url_data())}function update_grid(a){if(!grid.get("async")){go_to_URL();return}var b=(grid.get("operation")?"POST":"GET");$(".loading-elt-overlay").show();$.ajax({type:b,url:grid.get("url_base"),data:grid.get_url_data(),error:function(){alert("Grid refresh failed")},success:function(d){var c=d.split("*****");$("#grid-table-body").html(c[0]);$("#grid-table-footer").html(c[1]);$("#grid-table-body").trigger("update");init_grid_elements();init_operation_buttons();make_popup_menus();$(".loading-elt-overlay").hide();var e=$.trim(c[2]);if(e!==""){$("#grid-message").html(e).show();setTimeout(function(){$("#grid-message").hide()},5000)}},complete:function(){grid.set({operation:undefined,item_ids:undefined})}})}function check_all_items(){var a=document.getElementById("check_all"),b=document.getElementsByTagName("input"),d=0,c;if(a.checked===true){for(c=0;c<b.length;c++){if(b[c].name.indexOf("id")!==-1){b[c].checked=true;d++}}}else{for(c=0;c<b.length;c++){if(b[c].name.indexOf("id")!==-1){b[c].checked=false}}}init_grid_elements()};
\ No newline at end of file
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: inithello: Refactor the tool shed functional tests' upload_file method for clarity.
by Bitbucket 13 Feb '13
13 Feb '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/3ffe47cea647/
changeset: 3ffe47cea647
user: inithello
date: 2013-02-14 05:36:18
summary: Refactor the tool shed functional tests' upload_file method for clarity.
affected #: 34 files
diff -r dde899d789b698f29269e521d2abc8a1f1373921 -r 3ffe47cea647ae61a02d74bbff68e0e07bff70e4 test/tool_shed/base/twilltestcase.py
--- a/test/tool_shed/base/twilltestcase.py
+++ b/test/tool_shed/base/twilltestcase.py
@@ -168,11 +168,11 @@
raise AssertionError( errmsg )
def create_category( self, **kwd ):
category = test_db_util.get_category_by_name( kwd[ 'name' ] )
- if category is not None:
- return category
- self.visit_url( '/admin/manage_categories?operation=create' )
- self.submit_form( form_no=1, button="create_category_button", **kwd )
- return test_db_util.get_category_by_name( kwd[ 'name' ] )
+ if category is None:
+ self.visit_url( '/admin/manage_categories?operation=create' )
+ self.submit_form( form_no=1, button="create_category_button", **kwd )
+ category = test_db_util.get_category_by_name( kwd[ 'name' ] )
+ return category
def create_checkbox_query_string( self, field_name, value ):
'''
From galaxy.web.form_builder.CheckboxField:
@@ -213,8 +213,13 @@
dependency_description=dependency_description )
self.upload_file( repository,
'repository_dependencies.xml',
- filepath=filepath,
- commit_message='Uploaded dependency on %s.' % ', '.join( repo.name for repo in depends_on ) )
+ filepath=filepath,
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded dependency on %s.' % ', '.join( repo.name for repo in depends_on ),
+ strings_displayed=[],
+ strings_not_displayed=[] )
def create_repository_review( self, repository, review_contents_dict, changeset_revision=None, copy_from=None):
strings_displayed = []
if not copy_from:
@@ -860,17 +865,35 @@
def upload_file( self,
repository,
filename,
- filepath=None,
- valid_tools_only=True,
+ filepath,
+ valid_tools_only,
+ uncompress_file,
+ remove_repo_files_not_in_tar,
+ commit_message,
strings_displayed=[],
- strings_not_displayed=[],
- **kwd ):
+ strings_not_displayed=[] ):
+ removed_message = 'files were removed from the repository'
+ if remove_repo_files_not_in_tar:
+ if not self.repository_is_new( repository ):
+ if removed_message not in strings_displayed:
+ strings_displayed.append( removed_message )
+ else:
+ if removed_message not in strings_not_displayed:
+ strings_not_displayed.append( removed_message )
self.visit_url( '/upload/upload?repository_id=%s' % self.security.encode_id( repository.id ) )
if valid_tools_only:
strings_displayed.extend( [ 'has been successfully', 'uploaded to the repository.' ] )
- for key in kwd:
- tc.fv( "1", key, kwd[ key ] )
tc.formfile( "1", "file_data", self.get_filename( filename, filepath ) )
+ if uncompress_file:
+ tc.fv( 1, 'uncompress_file', 'Yes' )
+ else:
+ tc.fv( 1, 'uncompress_file', 'No' )
+ if not self.repository_is_new( repository ):
+ if remove_repo_files_not_in_tar:
+ tc.fv( 1, 'remove_repo_files_not_in_tar', 'Yes' )
+ else:
+ tc.fv( 1, 'remove_repo_files_not_in_tar', 'No' )
+ tc.fv( 1, 'commit_message', commit_message )
tc.submit( "upload_button" )
self.check_for_strings( strings_displayed, strings_not_displayed )
# Uncomment this if it becomes necessary to wait for an asynchronous process to complete after submitting an upload.
diff -r dde899d789b698f29269e521d2abc8a1f1373921 -r 3ffe47cea647ae61a02d74bbff68e0e07bff70e4 test/tool_shed/functional/test_0000_basic_repository_features.py
--- a/test/tool_shed/functional/test_0000_basic_repository_features.py
+++ b/test/tool_shed/functional/test_0000_basic_repository_features.py
@@ -56,7 +56,15 @@
def test_0030_upload_filtering_1_1_0( self ):
"""Upload filtering_1.1.0.tar to the repository"""
repository = test_db_util.get_repository_by_name_and_owner( repository_name, common.test_user_1_name )
- self.upload_file( repository, 'filtering/filtering_1.1.0.tar', commit_message="Uploaded filtering 1.1.0" )
+ self.upload_file( repository,
+ filename='filtering/filtering_1.1.0.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=True,
+ commit_message="Uploaded filtering 1.1.0",
+ strings_displayed=[],
+ strings_not_displayed=[] )
def test_0035_verify_repository( self ):
'''Display basic repository pages'''
repository = test_db_util.get_repository_by_name_and_owner( repository_name, common.test_user_1_name )
@@ -102,15 +110,27 @@
'''Upload filtering.txt file associated with tool version 1.1.0.'''
repository = test_db_util.get_repository_by_name_and_owner( repository_name, common.test_user_1_name )
self.upload_file( repository,
- 'filtering/filtering_0000.txt',
- commit_message="Uploaded filtering.txt",
- uncompress_file='No',
- remove_repo_files_not_in_tar='No' )
+ filename='filtering/filtering_0000.txt',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message="Uploaded filtering.txt",
+ strings_displayed=[],
+ strings_not_displayed=[] )
self.display_manage_repository_page( repository, strings_displayed=[ 'Readme file for filtering 1.1.0' ] )
def test_0055_upload_filtering_test_data( self ):
'''Upload filtering test data.'''
repository = test_db_util.get_repository_by_name_and_owner( repository_name, common.test_user_1_name )
- self.upload_file( repository, 'filtering/filtering_test_data.tar', commit_message="Uploaded filtering test data", remove_repo_files_not_in_tar='No' )
+ self.upload_file( repository,
+ filename='filtering/filtering_test_data.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message="Uploaded filtering test data",
+ strings_displayed=[],
+ strings_not_displayed=[] )
self.display_repository_file_contents( repository=repository,
filename='1.bed',
filepath='test-data',
@@ -121,9 +141,14 @@
'''Upload filtering version 2.2.0'''
repository = test_db_util.get_repository_by_name_and_owner( repository_name, common.test_user_1_name )
self.upload_file( repository,
- 'filtering/filtering_2.2.0.tar',
- commit_message="Uploaded filtering 2.2.0",
- remove_repo_files_not_in_tar='No' )
+ filename='filtering/filtering_2.2.0.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message="Uploaded filtering 2.2.0",
+ strings_displayed=[],
+ strings_not_displayed=[] )
def test_0065_verify_filtering_repository( self ):
'''Verify the new tool versions and repository metadata.'''
repository = test_db_util.get_repository_by_name_and_owner( repository_name, common.test_user_1_name )
@@ -138,7 +163,15 @@
def test_0070_upload_readme_txt_file( self ):
'''Upload readme.txt file associated with tool version 2.2.0.'''
repository = test_db_util.get_repository_by_name_and_owner( repository_name, common.test_user_1_name )
- self.upload_file( repository, 'readme.txt', commit_message="Uploaded readme.txt" )
+ self.upload_file( repository,
+ filename='readme.txt',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message="Uploaded readme.txt",
+ strings_displayed=[],
+ strings_not_displayed=[] )
self.display_manage_repository_page( repository, strings_displayed=[ 'This is a readme file.' ] )
# Verify that there is a different readme file for each metadata revision.
metadata_revisions = self.get_repository_metadata_revisions( repository )
diff -r dde899d789b698f29269e521d2abc8a1f1373921 -r 3ffe47cea647ae61a02d74bbff68e0e07bff70e4 test/tool_shed/functional/test_0010_repository_with_tool_dependencies.py
--- a/test/tool_shed/functional/test_0010_repository_with_tool_dependencies.py
+++ b/test/tool_shed/functional/test_0010_repository_with_tool_dependencies.py
@@ -34,10 +34,14 @@
category_id=self.security.encode_id( category.id ),
strings_displayed=[] )
self.upload_file( repository,
- 'freebayes/freebayes.xml',
+ filename='freebayes/freebayes.xml',
+ filepath=None,
valid_tools_only=False,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded the tool xml.',
strings_displayed=[ 'Metadata may have been defined', 'This file requires an entry', 'tool_data_table_conf' ],
- commit_message='Uploaded the tool xml.' )
+ strings_not_displayed=[] )
self.display_manage_repository_page( repository, strings_displayed=[ 'Invalid tools' ], strings_not_displayed=[ 'Valid tools' ] )
tip = self.get_repository_tip( repository )
self.check_repository_invalid_tools_for_changeset_revision( repository,
@@ -47,10 +51,14 @@
'''Upload the missing tool_data_table_conf.xml.sample file to the repository.'''
repository = test_db_util.get_repository_by_name_and_owner( repository_name, common.test_user_1_name )
self.upload_file( repository,
- 'freebayes/tool_data_table_conf.xml.sample',
+ filename='freebayes/tool_data_table_conf.xml.sample',
+ filepath=None,
valid_tools_only=False,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded the tool data table sample file.',
strings_displayed=[],
- commit_message='Uploaded the tool data table sample file.' )
+ strings_not_displayed=[] )
self.display_manage_repository_page( repository, strings_displayed=[ 'Invalid tools' ], strings_not_displayed=[ 'Valid tools' ] )
tip = self.get_repository_tip( repository )
self.check_repository_invalid_tools_for_changeset_revision( repository,
@@ -60,30 +68,50 @@
'''Upload the missing sam_fa_indices.loc.sample file to the repository.'''
repository = test_db_util.get_repository_by_name_and_owner( repository_name, common.test_user_1_name )
self.upload_file( repository,
- 'freebayes/sam_fa_indices.loc.sample',
+ filename='freebayes/sam_fa_indices.loc.sample',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded tool data table .loc file.',
strings_displayed=[],
- commit_message='Uploaded tool data table .loc file.' )
+ strings_not_displayed=[] )
def test_0025_upload_malformed_tool_dependency_xml( self ):
'''Upload tool_dependencies.xml with bad characters in the readme tag.'''
repository = test_db_util.get_repository_by_name_and_owner( repository_name, common.test_user_1_name )
self.upload_file( repository,
- os.path.join( 'freebayes', 'malformed_tool_dependencies', 'tool_dependencies.xml' ),
+ filename=os.path.join( 'freebayes', 'malformed_tool_dependencies', 'tool_dependencies.xml' ),
+ filepath=None,
valid_tools_only=False,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded malformed tool dependency XML.',
strings_displayed=[ 'Exception attempting to parse tool_dependencies.xml', 'not well-formed' ],
- commit_message='Uploaded malformed tool dependency XML.' )
+ strings_not_displayed=[] )
def test_0030_upload_invalid_tool_dependency_xml( self ):
'''Upload tool_dependencies.xml defining version 0.9.5 of the freebayes package.'''
repository = test_db_util.get_repository_by_name_and_owner( repository_name, common.test_user_1_name )
self.upload_file( repository,
- os.path.join( 'freebayes', 'invalid_tool_dependencies', 'tool_dependencies.xml' ),
+ filename=os.path.join( 'freebayes', 'invalid_tool_dependencies', 'tool_dependencies.xml' ),
+ filepath=None,
+ valid_tools_only=False,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded invalid tool dependency XML.',
strings_displayed=[ 'Name, version and type from a tool requirement tag does not match' ],
- commit_message='Uploaded invalid tool dependency XML.' )
+ strings_not_displayed=[] )
def test_0035_upload_valid_tool_dependency_xml( self ):
'''Upload tool_dependencies.xml defining version 0.9.4_9696d0ce8a962f7bb61c4791be5ce44312b81cf8 of the freebayes package.'''
repository = test_db_util.get_repository_by_name_and_owner( repository_name, common.test_user_1_name )
self.upload_file( repository,
- os.path.join( 'freebayes', 'tool_dependencies.xml' ),
- commit_message='Uploaded valid tool dependency XML.' )
+ filename=os.path.join( 'freebayes', 'tool_dependencies.xml' ),
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded valid tool dependency XML.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
def test_0040_verify_tool_dependencies( self ):
'''Verify that the uploaded tool_dependencies.xml specifies the correct package versions.'''
repository = test_db_util.get_repository_by_name_and_owner( repository_name, common.test_user_1_name )
diff -r dde899d789b698f29269e521d2abc8a1f1373921 -r 3ffe47cea647ae61a02d74bbff68e0e07bff70e4 test/tool_shed/functional/test_0020_basic_repository_dependencies.py
--- a/test/tool_shed/functional/test_0020_basic_repository_dependencies.py
+++ b/test/tool_shed/functional/test_0020_basic_repository_dependencies.py
@@ -37,7 +37,15 @@
owner=common.test_user_1_name,
category_id=self.security.encode_id( category.id ),
strings_displayed=[] )
- self.upload_file( repository, 'emboss/datatypes/datatypes_conf.xml', commit_message='Uploaded datatypes_conf.xml.' )
+ self.upload_file( repository,
+ filename='emboss/datatypes/datatypes_conf.xml',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded datatypes_conf.xml.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
def test_0015_verify_datatypes_in_datatypes_repository( self ):
'''Verify that the emboss_datatypes repository contains datatype entries.'''
repository = test_db_util.get_repository_by_name_and_owner( datatypes_repository_name, common.test_user_1_name )
@@ -51,7 +59,15 @@
owner=common.test_user_1_name,
category_id=self.security.encode_id( category.id ),
strings_displayed=[] )
- self.upload_file( repository, 'emboss/emboss.tar', commit_message='Uploaded emboss_5.tar' )
+ self.upload_file( repository,
+ filename='emboss/emboss.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded emboss.tar',
+ strings_displayed=[],
+ strings_not_displayed=[] )
def test_0025_generate_and_upload_repository_dependencies_xml( self ):
'''Generate and upload the repository_dependencies.xml file'''
repository = test_db_util.get_repository_by_name_and_owner( emboss_repository_name, common.test_user_1_name )
@@ -60,9 +76,14 @@
self.generate_repository_dependency_xml( [ datatypes_repository ],
self.get_filename( 'repository_dependencies.xml', filepath=repository_dependencies_path ) )
self.upload_file( repository,
- 'repository_dependencies.xml',
- filepath=repository_dependencies_path,
- commit_message='Uploaded repository_dependencies.xml' )
+ filename='repository_dependencies.xml',
+ filepath=repository_dependencies_path,
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded repository_dependencies.xml.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
def test_0030_verify_emboss_5_dependencies( self ):
'''Verify that the emboss_5 repository now depends on the emboss_datatypes repository with correct name, owner, and changeset revision.'''
repository = test_db_util.get_repository_by_name_and_owner( emboss_repository_name, common.test_user_1_name )
diff -r dde899d789b698f29269e521d2abc8a1f1373921 -r 3ffe47cea647ae61a02d74bbff68e0e07bff70e4 test/tool_shed/functional/test_0030_repository_dependency_revisions.py
--- a/test/tool_shed/functional/test_0030_repository_dependency_revisions.py
+++ b/test/tool_shed/functional/test_0030_repository_dependency_revisions.py
@@ -28,36 +28,83 @@
def test_0005_create_category( self ):
"""Create a category for this test suite"""
self.create_category( name='Test 0030 Repository Dependency Revisions', description='Testing repository dependencies by revision.' )
- def test_0010_create_repositories( self ):
- '''Create the emboss_5_0030, emboss_6_0030, emboss_datatypes_0030, and emboss_0030 repositories and populate the emboss_datatypes repository.'''
+ def test_0010_create_emboss_5_repository( self ):
+ '''Create and populate the emboss_5_0030 repository.'''
self.logout()
self.login( email=common.test_user_1_email, username=common.test_user_1_name )
category = test_db_util.get_category_by_name( 'Test 0030 Repository Dependency Revisions' )
- emboss_5_repository = self.get_or_create_repository( name=emboss_5_repository_name,
- description=emboss_repository_description,
- long_description=emboss_repository_long_description,
- owner=common.test_user_1_name,
- category_id=self.security.encode_id( category.id ) )
- emboss_6_repository = self.get_or_create_repository( name=emboss_6_repository_name,
- description=emboss_repository_description,
- long_description=emboss_repository_long_description,
- owner=common.test_user_1_name,
- category_id=self.security.encode_id( category.id ) )
- datatypes_repository = self.get_or_create_repository( name=datatypes_repository_name,
- description=emboss_repository_description,
- long_description=emboss_repository_long_description,
- owner=common.test_user_1_name,
- category_id=self.security.encode_id( category.id ) )
- emboss_repository = self.get_or_create_repository( name=emboss_repository_name,
+ repository = self.get_or_create_repository( name=emboss_5_repository_name,
+ description=emboss_repository_description,
+ long_description=emboss_repository_long_description,
+ owner=common.test_user_1_name,
+ category_id=self.security.encode_id( category.id ) )
+ self.upload_file( repository,
+ filename='emboss/emboss.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded tool tarball.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
+ def test_0015_create_emboss_6_repository( self ):
+ '''Create and populate the emboss_6_0030 repository.'''
+ self.logout()
+ self.login( email=common.test_user_1_email, username=common.test_user_1_name )
+ category = test_db_util.get_category_by_name( 'Test 0030 Repository Dependency Revisions' )
+ repository = self.get_or_create_repository( name=emboss_6_repository_name,
+ description=emboss_repository_description,
+ long_description=emboss_repository_long_description,
+ owner=common.test_user_1_name,
+ category_id=self.security.encode_id( category.id ) )
+ self.upload_file( repository,
+ filename='emboss/emboss.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded tool tarball.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
+ def test_0020_create_emboss_datatypes_repository( self ):
+ '''Create and populate the emboss_datatypes_0030 repository.'''
+ self.logout()
+ self.login( email=common.test_user_1_email, username=common.test_user_1_name )
+ category = test_db_util.get_category_by_name( 'Test 0030 Repository Dependency Revisions' )
+ repository = self.get_or_create_repository( name=datatypes_repository_name,
+ description=emboss_repository_description,
+ long_description=emboss_repository_long_description,
+ owner=common.test_user_1_name,
+ category_id=self.security.encode_id( category.id ) )
+ self.upload_file( repository,
+ filename='emboss/datatypes/datatypes_conf.xml',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded datatypes_conf.xml.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
+ def test_0025_create_emboss_repository( self ):
+ '''Create and populate the emboss_0030 repository.'''
+ self.logout()
+ self.login( email=common.test_user_1_email, username=common.test_user_1_name )
+ category = test_db_util.get_category_by_name( 'Test 0030 Repository Dependency Revisions' )
+ repository = self.get_or_create_repository( name=emboss_repository_name,
description=emboss_repository_description,
long_description=emboss_repository_long_description,
owner=common.test_user_1_name,
category_id=self.security.encode_id( category.id ) )
- self.upload_file( emboss_5_repository, 'emboss/emboss.tar', commit_message='Uploaded tool tarball.' )
- self.upload_file( emboss_6_repository, 'emboss/emboss.tar', commit_message='Uploaded tool tarball.' )
- self.upload_file( datatypes_repository, 'emboss/datatypes/datatypes_conf.xml', commit_message='Uploaded datatypes_conf.xml.' )
- self.upload_file( emboss_repository, 'emboss/emboss.tar', commit_message='Uploaded tool tarball.' )
- def test_0015_generate_repository_dependencies_for_emboss_5( self ):
+ self.upload_file( repository,
+ filename='emboss/emboss.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded the tool tarball.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
+ def test_0030_generate_repository_dependencies_for_emboss_5( self ):
'''Generate a repository_dependencies.xml file specifying emboss_datatypes and upload it to the emboss_5 repository.'''
datatypes_repository = test_db_util.get_repository_by_name_and_owner( datatypes_repository_name, common.test_user_1_name )
repository_dependencies_path = self.generate_temp_path( 'test_0030', additional_paths=[ 'emboss' ] )
@@ -65,18 +112,28 @@
self.get_filename( 'repository_dependencies.xml', filepath=repository_dependencies_path ) )
emboss_5_repository = test_db_util.get_repository_by_name_and_owner( emboss_5_repository_name, common.test_user_1_name )
self.upload_file( emboss_5_repository,
- 'repository_dependencies.xml',
- filepath=repository_dependencies_path,
- commit_message='Uploaded repository_depepndencies.xml.' )
- def test_0020_generate_repository_dependencies_for_emboss_6( self ):
+ filename='repository_dependencies.xml',
+ filepath=repository_dependencies_path,
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded repository_dependencies.xml.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
+ def test_0035_generate_repository_dependencies_for_emboss_6( self ):
'''Generate a repository_dependencies.xml file specifying emboss_datatypes and upload it to the emboss_6 repository.'''
emboss_6_repository = test_db_util.get_repository_by_name_and_owner( emboss_6_repository_name, common.test_user_1_name )
repository_dependencies_path = self.generate_temp_path( 'test_0030', additional_paths=[ 'emboss' ] )
self.upload_file( emboss_6_repository,
- 'repository_dependencies.xml',
- filepath=repository_dependencies_path,
- commit_message='Uploaded repository_depepndencies.xml.' )
- def test_0025_generate_repository_dependency_on_emboss_5( self ):
+ filename='repository_dependencies.xml',
+ filepath=repository_dependencies_path,
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded repository_dependencies.xml.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
+ def test_0040_generate_repository_dependency_on_emboss_5( self ):
'''Create and upload repository_dependencies.xml for the emboss_5_0030 repository.'''
emboss_repository = test_db_util.get_repository_by_name_and_owner( emboss_repository_name, common.test_user_1_name )
emboss_5_repository = test_db_util.get_repository_by_name_and_owner( emboss_5_repository_name, common.test_user_1_name )
@@ -85,10 +142,15 @@
self.get_filename( 'repository_dependencies.xml', filepath=repository_dependencies_path ),
dependency_description='Emboss requires the Emboss 5 repository.' )
self.upload_file( emboss_repository,
- 'repository_dependencies.xml',
- filepath=repository_dependencies_path,
- commit_message='Uploaded dependency configuration specifying emboss_5' )
- def test_0030_generate_repository_dependency_on_emboss_6( self ):
+ filename='repository_dependencies.xml',
+ filepath=repository_dependencies_path,
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded repository_dependencies.xml.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
+ def test_0045_generate_repository_dependency_on_emboss_6( self ):
'''Create and upload repository_dependencies.xml for the emboss_6_0030 repository.'''
emboss_repository = test_db_util.get_repository_by_name_and_owner( emboss_repository_name, common.test_user_1_name )
emboss_6_repository = test_db_util.get_repository_by_name_and_owner( emboss_6_repository_name, common.test_user_1_name )
@@ -97,10 +159,15 @@
self.get_filename( 'repository_dependencies.xml', filepath=repository_dependencies_path ),
dependency_description='Emboss requires the Emboss 6 repository.' )
self.upload_file( emboss_repository,
- 'repository_dependencies.xml',
- filepath=repository_dependencies_path,
- commit_message='Uploaded dependency configuration specifying emboss_6' )
- def test_0035_verify_repository_dependency_revisions( self ):
+ filename='repository_dependencies.xml',
+ filepath=repository_dependencies_path,
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded repository_dependencies.xml.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
+ def test_0050_verify_repository_dependency_revisions( self ):
'''Verify that different metadata revisions of the emboss repository have different repository dependencies.'''
repository = test_db_util.get_repository_by_name_and_owner( emboss_repository_name, common.test_user_1_name )
repository_metadata = [ ( metadata.metadata, metadata.changeset_revision ) for metadata in self.get_repository_metadata( repository ) ]
@@ -120,7 +187,7 @@
self.display_manage_repository_page( repository,
changeset_revision=changeset_revision,
strings_displayed=strings_displayed )
- def test_0040_verify_repository_metadata( self ):
+ def test_0055_verify_repository_metadata( self ):
'''Verify that resetting the metadata does not change it.'''
emboss_repository = test_db_util.get_repository_by_name_and_owner( emboss_repository_name, common.test_user_1_name )
emboss_5_repository = test_db_util.get_repository_by_name_and_owner( emboss_5_repository_name, common.test_user_1_name )
diff -r dde899d789b698f29269e521d2abc8a1f1373921 -r 3ffe47cea647ae61a02d74bbff68e0e07bff70e4 test/tool_shed/functional/test_0040_repository_circular_dependencies.py
--- a/test/tool_shed/functional/test_0040_repository_circular_dependencies.py
+++ b/test/tool_shed/functional/test_0040_repository_circular_dependencies.py
@@ -37,9 +37,14 @@
categories=[ 'test_0040_repository_circular_dependencies' ],
strings_displayed=[] )
self.upload_file( repository,
- 'freebayes/freebayes.tar',
+ filename='freebayes/freebayes.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded the tool tarball.',
strings_displayed=[],
- commit_message='Uploaded freebayes.tar.' )
+ strings_not_displayed=[] )
def test_0015_create_filtering_repository( self ):
'''Create and populate filtering_0040.'''
self.logout()
@@ -51,9 +56,14 @@
categories=[ 'test_0040_repository_circular_dependencies' ],
strings_displayed=[] )
self.upload_file( repository,
- 'filtering/filtering_1.1.0.tar',
+ filename='filtering/filtering_1.1.0.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded the tool tarball for filtering 1.1.0.',
strings_displayed=[],
- commit_message='Uploaded filtering.tar.' )
+ strings_not_displayed=[] )
def test_0020_create_dependency_on_freebayes( self ):
'''Upload a repository_dependencies.xml file that specifies the current revision of freebayes to the filtering_0040 repository.'''
# The dependency structure should look like:
@@ -67,9 +77,14 @@
self.get_filename( 'repository_dependencies.xml', filepath=repository_dependencies_path ),
dependency_description='Filtering 1.1.0 depends on the freebayes repository.' )
self.upload_file( filtering_repository,
- 'repository_dependencies.xml',
+ filename='repository_dependencies.xml',
filepath=repository_dependencies_path,
- commit_message='Uploaded dependency on freebayes' )
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded dependency on freebayes.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
def test_0025_create_dependency_on_filtering( self ):
'''Upload a repository_dependencies.xml file that specifies the current revision of filtering to the freebayes_0040 repository.'''
# The dependency structure should look like:
@@ -83,9 +98,14 @@
self.get_filename( 'repository_dependencies.xml', filepath=repository_dependencies_path ),
dependency_description='Freebayes depends on the filtering repository.' )
self.upload_file( freebayes_repository,
- 'repository_dependencies.xml',
+ filename='repository_dependencies.xml',
filepath=repository_dependencies_path,
- commit_message='Uploaded dependency on filtering' )
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded dependency on filtering.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
def test_0030_verify_repository_dependencies( self ):
'''Verify that each repository can depend on the other without causing an infinite loop.'''
filtering_repository = test_db_util.get_repository_by_name_and_owner( filtering_repository_name, common.test_user_1_name )
diff -r dde899d789b698f29269e521d2abc8a1f1373921 -r 3ffe47cea647ae61a02d74bbff68e0e07bff70e4 test/tool_shed/functional/test_0050_circular_dependencies_4_levels.py
--- a/test/tool_shed/functional/test_0050_circular_dependencies_4_levels.py
+++ b/test/tool_shed/functional/test_0050_circular_dependencies_4_levels.py
@@ -58,9 +58,14 @@
category_id=self.security.encode_id( category.id ),
strings_displayed=[] )
self.upload_file( repository,
- 'convert_chars/convert_chars.tar',
+ filename='convert_chars/convert_chars.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded convert_chars tarball.',
strings_displayed=[],
- commit_message='Uploaded convert_chars.tar.' )
+ strings_not_displayed=[] )
def test_0010_create_column_repository( self ):
'''Create and populate convert_chars_0050.'''
category = self.create_category( name=category_name, description=category_description )
@@ -71,9 +76,14 @@
category_id=self.security.encode_id( category.id ),
strings_displayed=[] )
self.upload_file( repository,
- 'column_maker/column_maker.tar',
+ filename='column_maker/column_maker.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded column_maker tarball.',
strings_displayed=[],
- commit_message='Uploaded column_maker.tar.' )
+ strings_not_displayed=[] )
def test_0015_create_emboss_datatypes_repository( self ):
'''Create and populate emboss_datatypes_0050.'''
category = self.create_category( name=category_name, description=category_description )
@@ -86,9 +96,14 @@
category_id=self.security.encode_id( category.id ),
strings_displayed=[] )
self.upload_file( repository,
- 'emboss/datatypes/datatypes_conf.xml',
+ filename='emboss/datatypes/datatypes_conf.xml',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded datatypes_conf.xml.',
strings_displayed=[],
- commit_message='Uploaded datatypes_conf.xml.' )
+ strings_not_displayed=[] )
def test_0020_create_emboss_repository( self ):
'''Create and populate emboss_0050.'''
category = self.create_category( name=category_name, description=category_description )
@@ -99,18 +114,28 @@
category_id=self.security.encode_id( category.id ),
strings_displayed=[] )
self.upload_file( repository,
- 'emboss/emboss.tar',
+ filename='emboss/emboss.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded emboss tarball.',
strings_displayed=[],
- commit_message='Uploaded tool tarball.' )
+ strings_not_displayed=[] )
datatypes_repository = test_db_util.get_repository_by_name_and_owner( emboss_datatypes_repository_name, common.test_user_1_name )
repository_dependencies_path = self.generate_temp_path( 'test_0050', additional_paths=[ 'emboss' ] )
self.generate_repository_dependency_xml( [ datatypes_repository ],
self.get_filename( 'repository_dependencies.xml', filepath=repository_dependencies_path ),
dependency_description='Emboss depends on the emboss_datatypes repository.' )
self.upload_file( repository,
- 'repository_dependencies.xml',
+ filename='repository_dependencies.xml',
filepath=repository_dependencies_path,
- commit_message='Uploaded dependency on emboss_datatypes.' )
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded dependency on emboss_datatypes.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
def test_0025_create_filtering_repository( self ):
'''Create and populate filtering_0050.'''
category = self.create_category( name=category_name, description=category_description )
@@ -121,18 +146,28 @@
category_id=self.security.encode_id( category.id ),
strings_displayed=[] )
self.upload_file( filtering_repository,
- 'filtering/filtering_1.1.0.tar',
+ filename='filtering/filtering_1.1.0.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded filtering 1.1.0 tarball.',
strings_displayed=[],
- commit_message='Uploaded filtering.tar.' )
+ strings_not_displayed=[] )
emboss_repository = test_db_util.get_repository_by_name_and_owner( emboss_repository_name, common.test_user_1_name )
repository_dependencies_path = self.generate_temp_path( 'test_0050', additional_paths=[ 'filtering' ] )
self.generate_repository_dependency_xml( [ emboss_repository ],
self.get_filename( 'repository_dependencies.xml', filepath=repository_dependencies_path ),
dependency_description='Filtering depends on the emboss repository.' )
self.upload_file( filtering_repository,
- 'repository_dependencies.xml',
+ filename='repository_dependencies.xml',
filepath=repository_dependencies_path,
- commit_message='Uploaded dependency on emboss.' )
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded dependency on emboss.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
def test_0030_create_freebayes_repository( self ):
'''Create and populate freebayes_0050.'''
category = self.create_category( name=category_name, description=category_description )
@@ -143,9 +178,14 @@
category_id=self.security.encode_id( category.id ),
strings_displayed=[] )
self.upload_file( repository,
- 'freebayes/freebayes.tar',
+ filename='freebayes/freebayes.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded freebayes tarball.',
strings_displayed=[],
- commit_message='Uploaded freebayes.tar.' )
+ strings_not_displayed=[] )
def test_0035_create_bismark_repository( self ):
'''Create and populate bismark_0050.'''
category = self.create_category( name=category_name, description=category_description )
@@ -156,10 +196,14 @@
category_id=self.security.encode_id( category.id ),
strings_displayed=[] )
self.upload_file( repository,
- 'bismark/bismark.tar',
+ filename='bismark/bismark.tar',
+ filepath=None,
+ valid_tools_only=False,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded bismark tarball.',
strings_displayed=[],
- valid_tools_only=False,
- commit_message='Uploaded bismark.tar.' )
+ strings_not_displayed=[] )
def test_0040_create_and_upload_dependency_definitions( self ):
column_repository = test_db_util.get_repository_by_name_and_owner( column_repository_name, common.test_user_1_name )
convert_repository = test_db_util.get_repository_by_name_and_owner( convert_repository_name, common.test_user_1_name )
diff -r dde899d789b698f29269e521d2abc8a1f1373921 -r 3ffe47cea647ae61a02d74bbff68e0e07bff70e4 test/tool_shed/functional/test_0060_workflows.py
--- a/test/tool_shed/functional/test_0060_workflows.py
+++ b/test/tool_shed/functional/test_0060_workflows.py
@@ -6,6 +6,8 @@
repository_long_description="Long description of Galaxy's filtering tool for test 0060"
workflow_filename = 'Workflow_for_0060_filter_workflow_repository.ga'
workflow_name = 'Workflow for 0060_filter_workflow_repository'
+category_name = 'Test 0060 Workflow Features'
+category_description = 'Test 0060 for workflow features'
class TestToolShedWorkflowFeatures( ShedTwillTestCase ):
'''Test valid and invalid workflows.'''
@@ -26,13 +28,14 @@
self.create_category( name='Test 0060 Workflow Features', description='Test 0060 - Workflow Features' )
def test_0010_create_repository( self ):
"""Create and populate the filtering repository"""
+ category = self.create_category( name=category_name, description=category_description )
self.logout()
self.login( email=common.test_user_1_email, username=common.test_user_1_name )
self.get_or_create_repository( name=repository_name,
description=repository_description,
long_description=repository_long_description,
owner=common.test_user_1_name,
- categories=[ 'Test 0060 Workflow Features' ],
+ category_id=self.security.encode_id( category.id ),
strings_displayed=[] )
def test_0015_upload_workflow( self ):
'''Upload a workflow with a missing tool, and verify that the tool specified is marked as missing.'''
@@ -44,18 +47,27 @@
os.makedirs( workflow_filepath )
file( os.path.join( workflow_filepath, workflow_filename ), 'w+' ).write( workflow )
self.upload_file( repository,
- workflow_filename,
+ filename=workflow_filename,
filepath=workflow_filepath,
- commit_message='Uploaded filtering workflow.' )
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded filtering workflow.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
self.load_workflow_image_in_tool_shed( repository, workflow_name, strings_displayed=[ '#EBBCB2' ] )
def test_0020_upload_tool( self ):
'''Upload the missing tool for the workflow in the previous step, and verify that the error is no longer present.'''
repository = test_db_util.get_repository_by_name_and_owner( repository_name, common.test_user_1_name )
self.upload_file( repository,
- 'filtering/filtering_2.2.0.tar',
- commit_message="Uploaded filtering 2.2.0",
- remove_repo_files_not_in_tar='No' )
-# raise Exception( self.get_repository_tip( repository ) )
+ filename='filtering/filtering_2.2.0.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded filtering 2.2.0.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
self.load_workflow_image_in_tool_shed( repository, workflow_name, strings_not_displayed=[ '#EBBCB2' ] )
def test_0025_verify_repository_metadata( self ):
'''Verify that resetting the metadata does not change it.'''
diff -r dde899d789b698f29269e521d2abc8a1f1373921 -r 3ffe47cea647ae61a02d74bbff68e0e07bff70e4 test/tool_shed/functional/test_0070_invalid_tool.py
--- a/test/tool_shed/functional/test_0070_invalid_tool.py
+++ b/test/tool_shed/functional/test_0070_invalid_tool.py
@@ -5,6 +5,7 @@
repository_description = "Galaxy's bismark wrapper"
repository_long_description = "Long description of Galaxy's bismark wrapper"
category_name = 'Test 0070 Invalid Tool Revisions'
+category_description = 'Tests for a repository with invalid tool revisions.'
class TestBismarkRepository( ShedTwillTestCase ):
'''Testing bismark with valid and invalid tool entries.'''
@@ -22,7 +23,7 @@
admin_user_private_role = test_db_util.get_private_role( admin_user )
def test_0005_create_category_and_repository( self ):
"""Create a category for this test suite, then create and populate a bismark repository. It should contain at least one each valid and invalid tool."""
- category = self.create_category( name=category_name, description='Tests for a repository with invalid tool revisions.' )
+ category = self.create_category( name=category_name, description=category_description )
self.logout()
self.login( email=common.test_user_1_email, username=common.test_user_1_name )
repository = self.get_or_create_repository( name=repository_name,
@@ -32,18 +33,25 @@
category_id=self.security.encode_id( category.id ),
strings_displayed=[] )
self.upload_file( repository,
- 'bismark/bismark.tar',
+ filename='bismark/bismark.tar',
+ filepath=None,
valid_tools_only=False,
- strings_displayed=[],
- commit_message='Uploaded the tool tarball.' )
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded bismark tarball.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
self.display_manage_repository_page( repository, strings_displayed=[ 'Invalid tools' ] )
invalid_revision = self.get_repository_tip( repository )
self.upload_file( repository,
- 'bismark/bismark_methylation_extractor.xml',
- valid_tools_only=False,
- strings_displayed=[],
- remove_repo_files_not_in_tar='No',
- commit_message='Uploaded an updated tool xml.' )
+ filename='bismark/bismark_methylation_extractor.xml',
+ filepath=None,
+ valid_tools_only=False,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded an updated tool xml.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
valid_revision = self.get_repository_tip( repository )
test_db_util.refresh( repository )
self.check_repository_tools_for_changeset_revision( repository, valid_revision )
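Note on the flag change in the hunk above: the older call passed remove_repo_files_not_in_tar='No', while the new call passes the boolean False. A short self-contained snippet of the general Python behaviour that makes the explicit boolean the safer spelling (whether the original helper string-parsed the value is not visible in this diff):

# Any non-empty string is truthy, so a plain truth test on 'No' behaves like True
# unless the callee explicitly parses the string.
flag = 'No'
if flag:
    print( 'flag treated as enabled' )      # this branch runs
print( bool( 'No' ), bool( False ) )        # prints: True False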
diff -r dde899d789b698f29269e521d2abc8a1f1373921 -r 3ffe47cea647ae61a02d74bbff68e0e07bff70e4 test/tool_shed/functional/test_0080_advanced_circular_dependencies.py
--- a/test/tool_shed/functional/test_0080_advanced_circular_dependencies.py
+++ b/test/tool_shed/functional/test_0080_advanced_circular_dependencies.py
@@ -26,8 +26,8 @@
admin_user = test_db_util.get_user( common.admin_email )
assert admin_user is not None, 'Problem retrieving user with email %s from the database' % admin_email
admin_user_private_role = test_db_util.get_private_role( admin_user )
- def test_0005_initiate_category_repositories( self ):
- """Create a category for this test suite and add repositories to it."""
+ def test_0005_create_column_repository( self ):
+ """Create and populate the column_maker repository."""
category = self.create_category( name=category_name, description=category_description )
self.logout()
self.login( email=common.test_user_1_email, username=common.test_user_1_name )
@@ -38,9 +38,21 @@
category_id=self.security.encode_id( category.id ),
strings_displayed=[] )
self.upload_file( repository,
- 'column_maker/column_maker.tar',
+ filename='column_maker/column_maker.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded column_maker tarball.',
strings_displayed=[],
- commit_message='Uploaded column_maker.tar.' )
+ strings_not_displayed=[] )
+ def test_0005_create_convert_repository( self ):
+ """Create and populate the convert_chars repository."""
+ self.logout()
+ self.login( email=common.admin_email, username=common.admin_username )
+ category = self.create_category( name=category_name, description=category_description )
+ self.logout()
+ self.login( email=common.test_user_1_email, username=common.test_user_1_name )
repository = self.get_or_create_repository( name=convert_repository_name,
description=convert_repository_description,
long_description=convert_repository_long_description,
@@ -48,9 +60,14 @@
category_id=self.security.encode_id( category.id ),
strings_displayed=[] )
self.upload_file( repository,
- 'convert_chars/convert_chars.tar',
+ filename='convert_chars/convert_chars.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded convert_chars tarball.',
strings_displayed=[],
- commit_message='Uploaded convert_chars.tar.' )
+ strings_not_displayed=[] )
def test_0020_create_repository_dependencies( self ):
'''Upload a repository_dependencies.xml file that specifies the current revision of convert_chars to the column_maker repository.'''
convert_repository = test_db_util.get_repository_by_name_and_owner( convert_repository_name, common.test_user_1_name )
@@ -60,9 +77,14 @@
self.get_filename( 'repository_dependencies.xml', filepath=repository_dependencies_path ),
dependency_description='Column maker depends on the convert repository.' )
self.upload_file( column_repository,
- 'repository_dependencies.xml',
- filepath=repository_dependencies_path,
- commit_message='Uploaded dependency on convert' )
+ filename='repository_dependencies.xml',
+ filepath=repository_dependencies_path,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded dependency on convert_chars.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
def test_0025_create_dependency_on_filtering( self ):
'''Upload a repository_dependencies.xml file that specifies the current revision of column_maker to the convert_chars repository.'''
convert_repository = test_db_util.get_repository_by_name_and_owner( convert_repository_name, common.test_user_1_name )
@@ -72,9 +94,14 @@
self.get_filename( 'repository_dependencies.xml', filepath=repository_dependencies_path ),
dependency_description='Convert chars depends on the column_maker repository.' )
self.upload_file( convert_repository,
- 'repository_dependencies.xml',
- filepath=repository_dependencies_path,
- commit_message='Uploaded dependency on column' )
+ filename='repository_dependencies.xml',
+ filepath=repository_dependencies_path,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded dependency on column_maker.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
def test_0030_verify_repository_dependencies( self ):
'''Verify that each repository can depend on the other without causing an infinite loop.'''
convert_repository = test_db_util.get_repository_by_name_and_owner( convert_repository_name, common.test_user_1_name )
diff -r dde899d789b698f29269e521d2abc8a1f1373921 -r 3ffe47cea647ae61a02d74bbff68e0e07bff70e4 test/tool_shed/functional/test_0090_tool_search.py
--- a/test/tool_shed/functional/test_0090_tool_search.py
+++ b/test/tool_shed/functional/test_0090_tool_search.py
@@ -54,9 +54,14 @@
category_id=self.security.encode_id( category.id ),
strings_displayed=[] )
self.upload_file( repository,
- 'bwa/bwa_base.tar',
+ filename='bwa/bwa_base.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded BWA tarball.',
strings_displayed=[],
- commit_message='Uploaded bwa_base.tar.' )
+ strings_not_displayed=[] )
def test_0010_create_bwa_color_repository( self ):
'''Create and populate bwa_color_0090.'''
category = self.create_category( name=category_name, description=category_description )
@@ -69,9 +74,14 @@
category_id=self.security.encode_id( category.id ),
strings_displayed=[] )
self.upload_file( repository,
- 'bwa/bwa_color.tar',
+ filename='bwa/bwa_color.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded BWA color tarball.',
strings_displayed=[],
- commit_message='Uploaded bwa_color.tar.' )
+ strings_not_displayed=[] )
def test_0015_create_emboss_datatypes_repository( self ):
'''Create and populate emboss_datatypes_0090.'''
category = self.create_category( name=category_name, description=category_description )
@@ -84,9 +94,14 @@
category_id=self.security.encode_id( category.id ),
strings_displayed=[] )
self.upload_file( repository,
- 'emboss/datatypes/datatypes_conf.xml',
+ filename='emboss/datatypes/datatypes_conf.xml',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded datatypes_conf.xml.',
strings_displayed=[],
- commit_message='Uploaded datatypes_conf.xml.' )
+ strings_not_displayed=[] )
def test_0020_create_emboss_repository( self ):
'''Create and populate emboss_0090.'''
category = self.create_category( name=category_name, description=category_description )
@@ -97,18 +112,28 @@
category_id=self.security.encode_id( category.id ),
strings_displayed=[] )
self.upload_file( repository,
- 'emboss/emboss.tar',
+ filename='emboss/emboss.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded emboss tarball.',
strings_displayed=[],
- commit_message='Uploaded tool tarball.' )
+ strings_not_displayed=[] )
datatypes_repository = test_db_util.get_repository_by_name_and_owner( emboss_datatypes_repository_name, common.test_user_1_name )
repository_dependencies_path = self.generate_temp_path( 'test_0090', additional_paths=[ 'emboss' ] )
self.generate_repository_dependency_xml( [ datatypes_repository ],
self.get_filename( 'repository_dependencies.xml', filepath=repository_dependencies_path ),
dependency_description='Emboss depends on the emboss_datatypes repository.' )
self.upload_file( repository,
- 'repository_dependencies.xml',
- filepath=repository_dependencies_path,
- commit_message='Uploaded dependency on emboss_datatypes.' )
+ filename='repository_dependencies.xml',
+ filepath=repository_dependencies_path,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded dependency on emboss_datatypes.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
def test_0025_create_filtering_repository( self ):
'''Create and populate filtering_0090.'''
category = self.create_category( name=category_name, description=category_description )
@@ -119,18 +144,28 @@
category_id=self.security.encode_id( category.id ),
strings_displayed=[] )
self.upload_file( filtering_repository,
- 'filtering/filtering_1.1.0.tar',
+ filename='filtering/filtering_1.1.0.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded filtering 1.1.0 tarball.',
strings_displayed=[],
- commit_message='Uploaded filtering.tar.' )
+ strings_not_displayed=[] )
emboss_repository = test_db_util.get_repository_by_name_and_owner( emboss_repository_name, common.test_user_1_name )
repository_dependencies_path = self.generate_temp_path( 'test_0090', additional_paths=[ 'filtering' ] )
self.generate_repository_dependency_xml( [ emboss_repository ],
self.get_filename( 'repository_dependencies.xml', filepath=repository_dependencies_path ),
dependency_description='Filtering depends on the emboss repository.' )
self.upload_file( filtering_repository,
- 'repository_dependencies.xml',
- filepath=repository_dependencies_path,
- commit_message='Uploaded dependency on emboss.' )
+ filename='repository_dependencies.xml',
+ filepath=repository_dependencies_path,
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded dependency on emboss.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
def test_0030_create_freebayes_repository( self ):
'''Create and populate freebayes_0090.'''
category = self.create_category( name=category_name, description=category_description )
@@ -141,9 +176,14 @@
category_id=self.security.encode_id( category.id ),
strings_displayed=[] )
self.upload_file( repository,
- 'freebayes/freebayes.tar',
+ filename='freebayes/freebayes.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded freebayes tarball.',
strings_displayed=[],
- commit_message='Uploaded freebayes.tar.' )
+ strings_not_displayed=[] )
def test_0035_create_and_upload_dependency_definitions( self ):
'''Create and upload repository dependency definitions.'''
bwa_color_repository = test_db_util.get_repository_by_name_and_owner( bwa_color_repository_name, common.test_user_1_name )
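The tests above repeatedly call generate_repository_dependency_xml to produce the repository_dependencies.xml file that then gets uploaded. A hypothetical sketch of what such a helper writes, with the attribute names inferred from the invalid-value checks in test_0100 and test_0110 (the exact element and attribute layout used by the real helper is an assumption, not confirmed by this diff):

# Hypothetical stand-in for generate_repository_dependency_xml; not the real helper.
def write_repository_dependencies_xml( path, toolshed, name, owner, changeset_revision, description='' ):
    template = ( '<?xml version="1.0"?>\n'
                 '<repositories description="%s">\n'
                 '    <repository toolshed="%s" name="%s" owner="%s" changeset_revision="%s" />\n'
                 '</repositories>\n' )
    open( path, 'w' ).write( template % ( description, toolshed, name, owner, changeset_revision ) )

write_repository_dependencies_xml( 'repository_dependencies.xml',
                                    'http://localhost:9009',        # example tool shed URL
                                    'emboss_datatypes_0090',
                                    'user1',                        # example owner
                                    '0123456789ab',                 # example changeset hash
                                    description='Emboss depends on the emboss_datatypes repository.' )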
diff -r dde899d789b698f29269e521d2abc8a1f1373921 -r 3ffe47cea647ae61a02d74bbff68e0e07bff70e4 test/tool_shed/functional/test_0100_complex_repository_dependencies.py
--- a/test/tool_shed/functional/test_0100_complex_repository_dependencies.py
+++ b/test/tool_shed/functional/test_0100_complex_repository_dependencies.py
@@ -38,9 +38,14 @@
category_id=self.security.encode_id( category.id ),
strings_displayed=[] )
self.upload_file( repository,
- 'bwa/complex/tool_dependencies.xml',
+ filename='bwa/complex/tool_dependencies.xml',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded tool_dependencies.xml.',
strings_displayed=[ 'Name, version and type from a tool requirement tag does not match' ],
- commit_message='Uploaded tool_dependencies.xml.' )
+ strings_not_displayed=[] )
self.display_manage_repository_page( repository, strings_displayed=[ 'Tool dependencies', 'may not be', 'in this repository' ] )
def test_0010_create_bwa_base_repository( self ):
'''Create and populate bwa_base_0100.'''
@@ -55,9 +60,14 @@
strings_displayed=[] )
tool_repository = test_db_util.get_repository_by_name_and_owner( bwa_tool_repository_name, common.test_user_1_name )
self.upload_file( repository,
- 'bwa/complex/bwa_base.tar',
+ filename='bwa/complex/bwa_base.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded bwa_base.tar with tool wrapper XML, but without tool dependency XML.',
strings_displayed=[],
- commit_message='Uploaded bwa_base.tar with tool wrapper XML, but without tool dependency XML.' )
+ strings_not_displayed=[] )
def test_0015_generate_complex_repository_dependency_invalid_shed_url( self ):
'''Generate and upload a complex repository definition that specifies an invalid tool shed URL.'''
dependency_path = self.generate_temp_path( 'test_0100', additional_paths=[ 'complex', 'invalid' ] )
@@ -71,34 +81,40 @@
self.generate_invalid_dependency_xml( xml_filename, url, name, owner, changeset_revision, complex=True, package='bwa', version='0.5.9' )
strings_displayed = [ 'Invalid tool shed <b>%s</b> defined' % url ]
self.upload_file( repository,
- 'tool_dependencies.xml',
+ filename='tool_dependencies.xml',
+ filepath=dependency_path,
valid_tools_only=False,
- filepath=dependency_path,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
commit_message='Uploaded dependency on bwa_tool_0100 with invalid url.',
- strings_displayed=strings_displayed )
+ strings_displayed=strings_displayed,
+ strings_not_displayed=[] )
def test_0020_generate_complex_repository_dependency_invalid_repository_name( self ):
'''Generate and upload a complex repository definition that specifies an invalid repository name.'''
dependency_path = self.generate_temp_path( 'test_0100', additional_paths=[ 'complex', 'invalid' ] )
xml_filename = self.get_filename( 'tool_dependencies.xml', filepath=dependency_path )
- repository = test_db_util.get_repository_by_name_and_owner( bwa_base_repository_name, common.test_user_1_name )
+ base_repository = test_db_util.get_repository_by_name_and_owner( bwa_base_repository_name, common.test_user_1_name )
tool_repository = test_db_util.get_repository_by_name_and_owner( bwa_tool_repository_name, common.test_user_1_name )
url = self.url
name = 'invalid_repository!?'
owner = tool_repository.user.username
changeset_revision = self.get_repository_tip( tool_repository )
self.generate_invalid_dependency_xml( xml_filename, url, name, owner, changeset_revision, complex=True, package='bwa', version='0.5.9' )
- strings_displayed = 'Invalid repository name <b>%s</b> defined.' % name
- self.upload_file( repository,
- 'tool_dependencies.xml',
+ strings_displayed = [ 'Invalid repository name <b>%s</b> defined.' % name ]
+ self.upload_file( base_repository,
+ filename='tool_dependencies.xml',
+ filepath=dependency_path,
valid_tools_only=False,
- filepath=dependency_path,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
commit_message='Uploaded dependency on bwa_tool_0100 with invalid repository name.',
- strings_displayed=[ strings_displayed ] )
+ strings_displayed=strings_displayed,
+ strings_not_displayed=[] )
def test_0025_generate_complex_repository_dependency_invalid_owner_name( self ):
'''Generate and upload a complex repository definition that specifies an invalid owner.'''
dependency_path = self.generate_temp_path( 'test_0100', additional_paths=[ 'complex', 'invalid' ] )
xml_filename = self.get_filename( 'tool_dependencies.xml', filepath=dependency_path )
- repository = test_db_util.get_repository_by_name_and_owner( bwa_base_repository_name, common.test_user_1_name )
+ base_repository = test_db_util.get_repository_by_name_and_owner( bwa_base_repository_name, common.test_user_1_name )
tool_repository = test_db_util.get_repository_by_name_and_owner( bwa_tool_repository_name, common.test_user_1_name )
url = self.url
name = tool_repository.name
@@ -106,30 +122,36 @@
changeset_revision = self.get_repository_tip( tool_repository )
self.generate_invalid_dependency_xml( xml_filename, url, name, owner, changeset_revision, complex=True, package='bwa', version='0.5.9' )
strings_displayed = [ 'Invalid owner <b>%s</b> defined' % owner ]
- self.upload_file( repository,
- 'tool_dependencies.xml',
+ self.upload_file( base_repository,
+ filename='tool_dependencies.xml',
+ filepath=dependency_path,
valid_tools_only=False,
- filepath=dependency_path,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
commit_message='Uploaded dependency on bwa_tool_0100 with invalid owner.',
- strings_displayed=strings_displayed )
+ strings_displayed=strings_displayed,
+ strings_not_displayed=[] )
def test_0030_generate_complex_repository_dependency_invalid_changeset_revision( self ):
'''Generate and upload a complex repository definition that specifies an invalid changeset revision.'''
dependency_path = self.generate_temp_path( 'test_0100', additional_paths=[ 'complex', 'invalid' ] )
xml_filename = self.get_filename( 'tool_dependencies.xml', filepath=dependency_path )
- repository = test_db_util.get_repository_by_name_and_owner( bwa_base_repository_name, common.test_user_1_name )
+ base_repository = test_db_util.get_repository_by_name_and_owner( bwa_base_repository_name, common.test_user_1_name )
tool_repository = test_db_util.get_repository_by_name_and_owner( bwa_tool_repository_name, common.test_user_1_name )
url = self.url
name = tool_repository.name
owner = tool_repository.user.username
changeset_revision = '1234abcd'
self.generate_invalid_dependency_xml( xml_filename, url, name, owner, changeset_revision, complex=True, package='bwa', version='0.5.9' )
- strings_displayed = 'Invalid changeset revision <b>%s</b> defined.' % changeset_revision
- self.upload_file( repository,
- 'tool_dependencies.xml',
+ strings_displayed = [ 'Invalid changeset revision <b>%s</b> defined.' % changeset_revision ]
+ self.upload_file( base_repository,
+ filename='tool_dependencies.xml',
+ filepath=dependency_path,
valid_tools_only=False,
- filepath=dependency_path,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
commit_message='Uploaded dependency on bwa_tool_0100 with invalid changeset revision.',
- strings_displayed=[ strings_displayed ] )
+ strings_displayed=strings_displayed,
+ strings_not_displayed=[] )
def test_0035_generate_complex_repository_dependency( self ):
'''Generate and upload a valid tool_dependencies.xml file that specifies bwa_tool_repository_0100.'''
base_repository = test_db_util.get_repository_by_name_and_owner( bwa_base_repository_name, common.test_user_1_name )
@@ -142,10 +164,14 @@
changeset_revision = self.get_repository_tip( tool_repository )
self.generate_repository_dependency_xml( [ tool_repository ], xml_filename, complex=True, package='bwa', version='0.5.9' )
self.upload_file( base_repository,
- 'tool_dependencies.xml',
+ filename='tool_dependencies.xml',
+ filepath=dependency_path,
valid_tools_only=True,
- filepath=dependency_path,
- commit_message='Uploaded valid complex dependency on bwa_tool_0100.' )
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded valid complex dependency on bwa_tool_0100.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
self.check_repository_dependency( base_repository, tool_repository )
self.display_manage_repository_page( base_repository, strings_displayed=[ 'bwa', '0.5.9', 'package' ] )
def test_0040_generate_tool_dependency( self ):
@@ -160,10 +186,14 @@
file( xml_filename, 'w' ).write( file( old_tool_dependency, 'r' )
.read().replace( '__PATH__', self.get_filename( 'bwa/complex' ) ) )
self.upload_file( tool_repository,
- xml_filename,
+ filename=xml_filename,
filepath=new_tool_dependency_path,
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded new tool_dependencies.xml.',
strings_displayed=[],
- commit_message='Uploaded new tool_dependencies.xml.' )
+ strings_not_displayed=[] )
# Verify that the dependency display has been updated as a result of the new tool_dependencies.xml file.
self.display_manage_repository_page( base_repository,
strings_displayed=[ self.get_repository_tip( tool_repository ), 'bwa', '0.5.9', 'package' ],
diff -r dde899d789b698f29269e521d2abc8a1f1373921 -r 3ffe47cea647ae61a02d74bbff68e0e07bff70e4 test/tool_shed/functional/test_0110_invalid_simple_repository_dependencies.py
--- a/test/tool_shed/functional/test_0110_invalid_simple_repository_dependencies.py
+++ b/test/tool_shed/functional/test_0110_invalid_simple_repository_dependencies.py
@@ -40,7 +40,15 @@
owner=common.test_user_1_name,
category_id=self.security.encode_id( category.id ),
strings_displayed=[] )
- self.upload_file( repository, 'emboss/datatypes/datatypes_conf.xml', commit_message='Uploaded datatypes_conf.xml.' )
+ self.upload_file( repository,
+ filename='emboss/datatypes/datatypes_conf.xml',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded datatypes_conf.xml.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
def test_0015_verify_datatypes_in_datatypes_repository( self ):
'''Verify that the emboss_datatypes repository contains datatype entries.'''
repository = test_db_util.get_repository_by_name_and_owner( datatypes_repository_name, common.test_user_1_name )
@@ -54,7 +62,15 @@
owner=common.test_user_1_name,
category_id=self.security.encode_id( category.id ),
strings_displayed=[] )
- self.upload_file( repository, 'emboss/emboss.tar', commit_message='Uploaded emboss_5.tar' )
+ self.upload_file( repository,
+ filename='emboss/emboss.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded emboss tool tarball.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
def test_0025_generate_repository_dependency_with_invalid_url( self ):
'''Generate a repository dependency for emboss 5 with an invalid URL.'''
dependency_path = self.generate_temp_path( 'test_0110', additional_paths=[ 'simple' ] )
@@ -68,11 +84,14 @@
self.generate_invalid_dependency_xml( xml_filename, url, name, owner, changeset_revision, complex=False, description='This is invalid.' )
strings_displayed = [ 'Invalid tool shed <b>%s</b> defined for repository <b>%s</b>' % ( url, repository.name ) ]
self.upload_file( emboss_repository,
- 'repository_dependencies.xml',
+ filename='repository_dependencies.xml',
+ filepath=dependency_path,
valid_tools_only=False,
- filepath=dependency_path,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
commit_message='Uploaded dependency on emboss_datatypes_0110 with invalid url.',
- strings_displayed=strings_displayed )
+ strings_displayed=strings_displayed,
+ strings_not_displayed=[] )
def test_0030_generate_repository_dependency_with_invalid_name( self ):
'''Generate a repository dependency for emboss 5 with an invalid name.'''
dependency_path = self.generate_temp_path( 'test_0110', additional_paths=[ 'simple' ] )
@@ -86,11 +105,14 @@
self.generate_invalid_dependency_xml( xml_filename, url, name, owner, changeset_revision, complex=False, description='This is invalid.' )
strings_displayed = [ 'Invalid repository name <b>%s</b> defined.' % name ]
self.upload_file( emboss_repository,
- 'repository_dependencies.xml',
+ filename='repository_dependencies.xml',
+ filepath=dependency_path,
valid_tools_only=False,
- filepath=dependency_path,
- commit_message='Uploaded dependency on emboss_datatypes_0110 with invalid url.',
- strings_displayed=strings_displayed )
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded dependency on emboss_datatypes_0110 with invalid name.',
+ strings_displayed=strings_displayed,
+ strings_not_displayed=[] )
def test_0035_generate_repository_dependency_with_invalid_owner( self ):
'''Generate a repository dependency for emboss 5 with an invalid owner.'''
dependency_path = self.generate_temp_path( 'test_0110', additional_paths=[ 'simple' ] )
@@ -104,11 +126,14 @@
self.generate_invalid_dependency_xml( xml_filename, url, name, owner, changeset_revision, complex=False, description='This is invalid.' )
strings_displayed = [ 'Invalid owner <b>%s</b> defined for repository <b>%s</b>' % ( owner, repository.name ) ]
self.upload_file( emboss_repository,
- 'repository_dependencies.xml',
+ filename='repository_dependencies.xml',
+ filepath=dependency_path,
valid_tools_only=False,
- filepath=dependency_path,
- commit_message='Uploaded dependency on emboss_datatypes_0110 with invalid url.',
- strings_displayed=strings_displayed )
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded dependency on emboss_datatypes_0110 with invalid owner.',
+ strings_displayed=strings_displayed,
+ strings_not_displayed=[] )
def test_0040_generate_repository_dependency_with_invalid_changeset_revision( self ):
'''Generate a repository dependency for emboss 5 with an invalid changeset revision.'''
dependency_path = self.generate_temp_path( 'test_0110', additional_paths=[ 'simple', 'invalid' ] )
@@ -122,8 +147,11 @@
self.generate_invalid_dependency_xml( xml_filename, url, name, owner, changeset_revision, complex=False, description='This is invalid.' )
strings_displayed = [ 'Invalid changeset revision <b>%s</b> defined.' % changeset_revision ]
self.upload_file( emboss_repository,
- 'repository_dependencies.xml',
+ filename='repository_dependencies.xml',
+ filepath=dependency_path,
valid_tools_only=False,
- filepath=dependency_path,
- commit_message='Uploaded dependency on emboss_datatypes_0110 with invalid url.',
- strings_displayed=strings_displayed )
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded dependency on emboss_datatypes_0110 with invalid changeset revision.',
+ strings_displayed=strings_displayed,
+ strings_not_displayed=[] )
diff -r dde899d789b698f29269e521d2abc8a1f1373921 -r 3ffe47cea647ae61a02d74bbff68e0e07bff70e4 test/tool_shed/functional/test_0300_reset_all_metadata.py
--- a/test/tool_shed/functional/test_0300_reset_all_metadata.py
+++ b/test/tool_shed/functional/test_0300_reset_all_metadata.py
@@ -23,15 +23,12 @@
admin_user = test_db_util.get_user( common.admin_email )
assert admin_user is not None, 'Problem retrieving user with email %s from the database' % admin_email
admin_user_private_role = test_db_util.get_private_role( admin_user )
- def test_0005_create_repositories_and_categories( self ):
+ def test_0005_create_filtering_repository( self ):
+ '''Create and populate the filtering_0000 repository.'''
+ self.logout()
+ self.login( email=common.admin_email, username=common.admin_username )
category_0000 = self.create_category( name='Test 0000 Basic Repository Features 1', description='Test 0000 Basic Repository Features 1' )
category_0001 = self.create_category( name='Test 0000 Basic Repository Features 2', description='Test 0000 Basic Repository Features 2' )
- category_0010 = self.create_category( name='Test 0010 Repository With Tool Dependencies', description='Tests for a repository with tool dependencies.' )
- category_0020 = self.create_category( name='Test 0020 Basic Repository Dependencies', description='Testing basic repository dependency features.' )
- category_0030 = self.create_category( name='Test 0030 Repository Dependency Revisions', description='Testing repository dependencies by revision.' )
- category_0040 = self.create_category( name='test_0040_repository_circular_dependencies', description='Testing handling of circular repository dependencies.' )
- category_0050 = self.create_category( name='test_0050_repository_n_level_circular_dependencies', description='Testing handling of circular repository dependencies to n levels.' )
- category_0060 = self.create_category( name='Test 0060 Workflow Features', description='Test 0060 - Workflow Features' )
self.logout()
self.login( email=common.test_user_1_email, username=common.test_user_1_name )
repository = self.get_or_create_repository( name='filtering_0000',
@@ -40,34 +37,119 @@
owner=common.test_user_1_name,
category_id=self.security.encode_id( category_0000.id ) )
if self.repository_is_new( repository ):
- self.upload_file( repository, 'filtering/filtering_1.1.0.tar', commit_message="Uploaded filtering 1.1.0" )
- self.upload_file( repository, 'filtering/filtering_2.2.0.tar', commit_message="Uploaded filtering 2.2.0" )
+ self.upload_file( repository,
+ filename='filtering/filtering_1.1.0.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded filtering 1.1.0 tarball.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
+ self.upload_file( repository,
+ filename='filtering/filtering_2.2.0.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded filtering 2.2.0 tarball.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
+ def test_0010_create_freebayes_repository( self ):
+ '''Create and populate the freebayes_0010 repository.'''
+ self.logout()
+ self.login( email=common.admin_email, username=common.admin_username )
+ category_0010 = self.create_category( name='Test 0010 Repository With Tool Dependencies', description='Tests for a repository with tool dependencies.' )
+ self.logout()
+ self.login( email=common.test_user_1_email, username=common.test_user_1_name )
repository = self.get_or_create_repository( name='freebayes_0010',
description="Galaxy's freebayes tool",
long_description="Long description of Galaxy's freebayes tool",
owner=common.test_user_1_name,
- category_id=self.security.encode_id( category_0000.id ),
+ category_id=self.security.encode_id( category_0010.id ),
strings_displayed=[] )
if self.repository_is_new( repository ):
- self.upload_file( repository, 'freebayes/freebayes.xml', valid_tools_only=False, commit_message='Uploaded.', strings_displayed=[] )
- self.upload_file( repository, 'freebayes/tool_data_table_conf.xml.sample', valid_tools_only=False, commit_message='Uploaded.', strings_displayed=[] )
- self.upload_file( repository, 'freebayes/sam_fa_indices.loc.sample', valid_tools_only=False, commit_message='Uploaded.', strings_displayed=[] )
- self.upload_file( repository, 'freebayes/tool_dependencies.xml', commit_message='Uploaded.' )
+ self.upload_file( repository,
+ filename='freebayes/freebayes.xml',
+ filepath=None,
+ valid_tools_only=False,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded freebayes.xml.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
+ self.upload_file( repository,
+ filename='freebayes/tool_data_table_conf.xml.sample',
+ filepath=None,
+ valid_tools_only=False,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded tool_data_table_conf.xml.sample',
+ strings_displayed=[],
+ strings_not_displayed=[] )
+ self.upload_file( repository,
+ filename='freebayes/sam_fa_indices.loc.sample',
+ filepath=None,
+ valid_tools_only=False,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded sam_fa_indices.loc.sample',
+ strings_displayed=[],
+ strings_not_displayed=[] )
+ self.upload_file( repository,
+ filename='freebayes/tool_dependencies.xml',
+ filepath=None,
+ valid_tools_only=False,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded tool_dependencies.xml',
+ strings_displayed=[],
+ strings_not_displayed=[] )
+ def test_0015_create_emboss_repository( self ):
+ '''Create and populate the emboss_0020 repository.'''
+ self.logout()
+ self.login( email=common.admin_email, username=common.admin_username )
+ category_0020 = self.create_category( name='Test 0020 Basic Repository Dependencies', description='Testing basic repository dependency features.' )
+ self.logout()
+ self.login( email=common.test_user_1_email, username=common.test_user_1_name )
repository = self.get_or_create_repository( name='emboss_datatypes_0020',
description="Galaxy applicable data formats used by Emboss tools.",
long_description="Galaxy applicable data formats used by Emboss tools. This repository contains no tools.",
owner=common.test_user_1_name,
- category_id=self.security.encode_id( category_0010.id ),
+ category_id=self.security.encode_id( category_0020.id ),
strings_displayed=[] )
if self.repository_is_new( repository ):
- self.upload_file( repository, 'emboss/datatypes/datatypes_conf.xml', commit_message='Uploaded datatypes_conf.xml.' )
+ self.upload_file( repository,
+ filename='emboss/datatypes/datatypes_conf.xml',
+ filepath=None,
+ valid_tools_only=False,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded datatypes_conf.xml.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
repository = self.get_or_create_repository( name='emboss_0020',
description='Galaxy wrappers for Emboss version 5.0.0 tools',
long_description='Galaxy wrappers for Emboss version 5.0.0 tools',
owner=common.test_user_1_name,
category_id=self.security.encode_id( category_0020.id ),
strings_displayed=[] )
- self.upload_file( repository, 'emboss/emboss.tar', commit_message='Uploaded emboss_5.tar' )
+ self.upload_file( repository,
+ filename='emboss/emboss.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded emboss.tar',
+ strings_displayed=[],
+ strings_not_displayed=[] )
+ def test_0020_create_emboss_datatypes_repository( self ):
+ '''Create and populate the emboss_0030 repository.'''
+ self.logout()
+ self.login( email=common.admin_email, username=common.admin_username )
+ category_0030 = self.create_category( name='Test 0030 Repository Dependency Revisions', description='Testing repository dependencies by revision.' )
+ self.logout()
+ self.login( email=common.test_user_1_email, username=common.test_user_1_name )
datatypes_repository = self.get_or_create_repository( name='emboss_datatypes_0030',
description=datatypes_repository_description,
long_description=datatypes_repository_long_description,
@@ -75,55 +157,114 @@
category_id=self.security.encode_id( category_0030.id ),
strings_displayed=[] )
if self.repository_is_new( datatypes_repository ):
- self.upload_file( datatypes_repository, 'emboss/datatypes/datatypes_conf.xml', commit_message='Uploaded datatypes_conf.xml.' )
+ self.upload_file( datatypes_repository,
+ filename='emboss/datatypes/datatypes_conf.xml',
+ filepath=None,
+ valid_tools_only=False,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded datatypes_conf.xml.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
emboss_5_repository = self.get_or_create_repository( name='emboss_5_0030',
description=emboss_repository_description,
long_description=emboss_repository_long_description,
owner=common.test_user_1_name,
category_id=self.security.encode_id( category_0030.id ),
strings_displayed=[] )
- self.upload_file( emboss_5_repository, 'emboss/emboss.tar', commit_message='Uploaded emboss.tar' )
+ self.upload_file( emboss_5_repository,
+ filename='emboss/emboss.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded emboss.tar',
+ strings_displayed=[],
+ strings_not_displayed=[] )
repository_dependencies_path = self.generate_temp_path( 'test_0330', additional_paths=[ 'emboss', '5' ] )
self.generate_repository_dependency_xml( [ datatypes_repository ],
self.get_filename( 'repository_dependencies.xml', filepath=repository_dependencies_path ) )
self.upload_file( emboss_5_repository,
- 'repository_dependencies.xml',
+ filename='repository_dependencies.xml',
filepath=repository_dependencies_path,
- commit_message='Uploaded repository_dependencies.xml' )
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded repository_dependencies.xml',
+ strings_displayed=[],
+ strings_not_displayed=[] )
emboss_6_repository = self.get_or_create_repository( name='emboss_6_0030',
description=emboss_repository_description,
long_description=emboss_repository_long_description,
owner=common.test_user_1_name,
category_id=self.security.encode_id( category_0030.id ),
strings_displayed=[] )
- self.upload_file( emboss_6_repository, 'emboss/emboss.tar', commit_message='Uploaded emboss.tar' )
+ self.upload_file( emboss_6_repository,
+ filename='emboss/emboss.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded emboss.tar',
+ strings_displayed=[],
+ strings_not_displayed=[] )
repository_dependencies_path = self.generate_temp_path( 'test_0330', additional_paths=[ 'emboss', '6' ] )
self.generate_repository_dependency_xml( [ datatypes_repository ],
self.get_filename( 'repository_dependencies.xml', filepath=repository_dependencies_path ) )
self.upload_file( emboss_6_repository,
- 'repository_dependencies.xml',
+ filename='repository_dependencies.xml',
filepath=repository_dependencies_path,
- commit_message='Uploaded repository_dependencies.xml' )
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded repository_dependencies.xml',
+ strings_displayed=[],
+ strings_not_displayed=[] )
emboss_repository = self.get_or_create_repository( name='emboss_0030',
description=emboss_repository_description,
long_description=emboss_repository_long_description,
owner=common.test_user_1_name,
category_id=self.security.encode_id( category_0030.id ),
strings_displayed=[] )
- self.upload_file( emboss_repository, 'emboss/emboss.tar', commit_message='Uploaded emboss.tar' )
+ self.upload_file( emboss_repository,
+ filename='emboss/emboss.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded emboss.tar',
+ strings_displayed=[],
+ strings_not_displayed=[] )
repository_dependencies_path = self.generate_temp_path( 'test_0330', additional_paths=[ 'emboss', '5' ] )
self.generate_repository_dependency_xml( [ emboss_5_repository ],
self.get_filename( 'repository_dependencies.xml', filepath=repository_dependencies_path ) )
self.upload_file( emboss_repository,
- 'repository_dependencies.xml',
+ filename='repository_dependencies.xml',
filepath=repository_dependencies_path,
- commit_message='Uploaded repository_dependencies.xml' )
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded repository_dependencies.xml',
+ strings_displayed=[],
+ strings_not_displayed=[] )
self.generate_repository_dependency_xml( [ emboss_6_repository ],
self.get_filename( 'repository_dependencies.xml', filepath=repository_dependencies_path ) )
self.upload_file( emboss_repository,
- 'repository_dependencies.xml',
+ filename='repository_dependencies.xml',
filepath=repository_dependencies_path,
- commit_message='Uploaded repository_dependencies.xml' )
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded repository_dependencies.xml',
+ strings_displayed=[],
+ strings_not_displayed=[] )
+ def test_0025_create_freebayes_repository( self ):
+ '''Create and populate the freebayes_0040 repository.'''
+ self.logout()
+ self.login( email=common.admin_email, username=common.admin_username )
+ category_0040 = self.create_category( name='test_0040_repository_circular_dependencies', description='Testing handling of circular repository dependencies.' )
+ self.logout()
+ self.login( email=common.test_user_1_email, username=common.test_user_1_name )
repository = self.get_or_create_repository( name='freebayes_0040',
description="Galaxy's freebayes tool",
long_description="Long description of Galaxy's freebayes tool",
@@ -131,14 +272,30 @@
category_id=self.security.encode_id( category_0040.id ),
strings_displayed=[] )
if self.repository_is_new( repository ):
- self.upload_file( repository, 'freebayes/freebayes.tar', commit_message='Uploaded freebayes.tar.' )
+ self.upload_file( repository,
+ filename='freebayes/freebayes.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded freebayes tarball.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
repository = self.get_or_create_repository( name='filtering_0040',
description="Galaxy's filtering tool",
long_description="Long description of Galaxy's filtering tool",
owner=common.test_user_1_name,
category_id=self.security.encode_id( category_0040.id ),
strings_displayed=[] )
- self.upload_file( repository, 'filtering/filtering_1.1.0.tar', commit_message='Uploaded filtering.tar.' )
+ self.upload_file( repository,
+ filename='filtering/filtering_1.1.0.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded filtering 1.1.0 tarball.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
freebayes_repository = test_db_util.get_repository_by_name_and_owner( 'freebayes_0040', common.test_user_1_name )
filtering_repository = test_db_util.get_repository_by_name_and_owner( 'filtering_0040', common.test_user_1_name )
repository_dependencies_path = self.generate_temp_path( 'test_0340', additional_paths=[ 'dependencies' ] )
@@ -146,16 +303,33 @@
self.get_filename( 'repository_dependencies.xml', filepath=repository_dependencies_path ),
dependency_description='Filtering 1.1.0 depends on the freebayes repository.' )
self.upload_file( filtering_repository,
- 'repository_dependencies.xml',
+ filename='repository_dependencies.xml',
filepath=repository_dependencies_path,
- commit_message='Uploaded dependency on freebayes' )
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded repository_dependencies.xml specifying freebayes',
+ strings_displayed=[],
+ strings_not_displayed=[] )
self.generate_repository_dependency_xml( [ filtering_repository ],
self.get_filename( 'repository_dependencies.xml', filepath=repository_dependencies_path ),
dependency_description='Freebayes depends on the filtering repository.' )
self.upload_file( freebayes_repository,
- 'repository_dependencies.xml',
+ filename='repository_dependencies.xml',
filepath=repository_dependencies_path,
- commit_message='Uploaded dependency on filtering' )
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded repository_dependencies.xml specifying filtering',
+ strings_displayed=[],
+ strings_not_displayed=[] )
+ def test_0030_create_emboss_repository( self ):
+ '''Create and populate the emboss_0050 repository.'''
+ self.logout()
+ self.login( email=common.admin_email, username=common.admin_username )
+ category_0050 = self.create_category( name='test_0050_repository_n_level_circular_dependencies', description='Testing handling of circular repository dependencies to n levels.' )
+ self.logout()
+ self.login( email=common.test_user_1_email, username=common.test_user_1_name )
datatypes_repository = self.get_or_create_repository( name='emboss_datatypes_0050',
description="Datatypes for emboss",
long_description="Long description of Emboss' datatypes",
@@ -181,41 +355,100 @@
owner=common.test_user_1_name,
category_id=self.security.encode_id( category_0050.id ),
strings_displayed=[] )
- self.upload_file( datatypes_repository, 'emboss/datatypes/datatypes_conf.xml', commit_message='Uploaded datatypes_conf.xml.' )
- self.upload_file( emboss_repository, 'emboss/emboss.tar', commit_message='Uploaded tool tarball.' )
- self.upload_file( freebayes_repository, 'freebayes/freebayes.tar', commit_message='Uploaded freebayes.tar.' )
- self.upload_file( filtering_repository, 'filtering/filtering_1.1.0.tar', commit_message='Uploaded filtering.tar.' )
+ self.upload_file( datatypes_repository,
+ filename='emboss/datatypes/datatypes_conf.xml',
+ filepath=None,
+ valid_tools_only=False,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded datatypes_conf.xml.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
+ self.upload_file( emboss_repository,
+ filename='emboss/emboss.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded emboss.tar',
+ strings_displayed=[],
+ strings_not_displayed=[] )
+ self.upload_file( freebayes_repository,
+ filename='freebayes/freebayes.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded freebayes tarball.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
+ self.upload_file( filtering_repository,
+ filename='filtering/filtering_1.1.0.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded filtering 1.1.0 tarball.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
repository_dependencies_path = self.generate_temp_path( 'test_0350', additional_paths=[ 'emboss' ] )
self.generate_repository_dependency_xml( [ datatypes_repository ],
self.get_filename( 'repository_dependencies.xml', filepath=repository_dependencies_path ),
dependency_description='Emboss depends on the emboss_datatypes repository.' )
self.upload_file( emboss_repository,
- 'repository_dependencies.xml',
+ filename='repository_dependencies.xml',
filepath=repository_dependencies_path,
- commit_message='Uploaded dependency on emboss_datatypes.' )
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded repository_dependencies.xml',
+ strings_displayed=[],
+ strings_not_displayed=[] )
repository_dependencies_path = self.generate_temp_path( 'test_0350', additional_paths=[ 'filtering' ] )
self.generate_repository_dependency_xml( [ emboss_repository ],
self.get_filename( 'repository_dependencies.xml', filepath=repository_dependencies_path ),
dependency_description='Filtering depends on the emboss repository.' )
self.upload_file( filtering_repository,
- 'repository_dependencies.xml',
+ filename='repository_dependencies.xml',
filepath=repository_dependencies_path,
- commit_message='Uploaded dependency on emboss.' )
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded repository_dependencies.xml',
+ strings_displayed=[],
+ strings_not_displayed=[] )
repository_dependencies_path = self.generate_temp_path( 'test_0350', additional_paths=[ 'freebayes' ] )
self.generate_repository_dependency_xml( [ filtering_repository ],
self.get_filename( 'repository_dependencies.xml', filepath=repository_dependencies_path ),
dependency_description='Emboss depends on the filtering repository.' )
self.upload_file( emboss_repository,
- 'repository_dependencies.xml',
+ filename='repository_dependencies.xml',
filepath=repository_dependencies_path,
- commit_message='Uploaded dependency on filtering.' )
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded repository_dependencies.xml',
+ strings_displayed=[],
+ strings_not_displayed=[] )
self.generate_repository_dependency_xml( [ datatypes_repository, emboss_repository, filtering_repository, freebayes_repository ],
self.get_filename( 'repository_dependencies.xml', filepath=repository_dependencies_path ),
dependency_description='Freebayes depends on the filtering repository.' )
self.upload_file( freebayes_repository,
- 'repository_dependencies.xml',
+ filename='repository_dependencies.xml',
filepath=repository_dependencies_path,
- commit_message='Uploaded dependency on filtering.' )
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded repository_dependencies.xml',
+ strings_displayed=[],
+ strings_not_displayed=[] )
+ def test_0035_create_filtering_repository( self ):
+ '''Create and populate the filtering_0060 repository.'''
+ self.logout()
+ self.login( email=common.admin_email, username=common.admin_username )
+ category_0060 = self.create_category( name='Test 0060 Workflow Features', description='Test 0060 - Workflow Features' )
+ self.logout()
+ self.login( email=common.test_user_1_email, username=common.test_user_1_name )
workflow_repository = self.get_or_create_repository( name='filtering_0060',
description="Galaxy's filtering tool",
long_description="Long description of Galaxy's filtering tool",
@@ -226,16 +459,28 @@
workflow = file( self.get_filename( 'filtering_workflow/Workflow_for_0060_filter_workflow_repository.ga' ), 'r' ).read()
workflow = workflow.replace( '__TEST_TOOL_SHED_URL__', self.url.replace( 'http://', '' ) )
workflow_filepath = self.generate_temp_path( 'test_0360', additional_paths=[ 'filtering_workflow' ] )
- os.makedirs( workflow_filepath )
+ if not os.path.exists( workflow_filepath ):
+ os.makedirs( workflow_filepath )
file( os.path.join( workflow_filepath, workflow_filename ), 'w+' ).write( workflow )
self.upload_file( workflow_repository,
- workflow_filename,
+ filename=workflow_filename,
filepath=workflow_filepath,
- commit_message='Uploaded filtering workflow.' )
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded filtering workflow.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
self.upload_file( workflow_repository,
- 'filtering/filtering_2.2.0.tar',
- commit_message='Uploaded filtering tool.' )
- def test_0010_reset_metadata_on_all_repositories( self ):
+ filename='filtering/filtering_2.2.0.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded filtering 2.2.0 tarball.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
+ def test_0040_reset_metadata_on_all_repositories( self ):
'''Reset metadata on all repositories, then verify that it has not changed.'''
self.logout()
self.login( email=common.admin_email, username=common.admin_username )
diff -r dde899d789b698f29269e521d2abc8a1f1373921 -r 3ffe47cea647ae61a02d74bbff68e0e07bff70e4 test/tool_shed/functional/test_0400_repository_component_reviews.py
--- a/test/tool_shed/functional/test_0400_repository_component_reviews.py
+++ b/test/tool_shed/functional/test_0400_repository_component_reviews.py
@@ -100,7 +100,15 @@
owner=common.test_user_1_name,
category_id=self.security.encode_id( category.id ),
strings_displayed=strings_displayed )
- self.upload_file( repository, 'filtering/filtering_1.1.0.tar', commit_message="Uploaded filtering 1.1.0" )
+ self.upload_file( repository,
+ filename='filtering/filtering_1.1.0.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded filtering 1.1.0 tarball.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
def test_0020_review_initial_revision_data_types( self ):
'''Review the datatypes component for the current tip revision.'''
"""
@@ -396,7 +404,15 @@
self.login( email=common.test_user_1_email, username=common.test_user_1_name )
repository = test_db_util.get_repository_by_name_and_owner( repository_name, common.test_user_1_name )
user = test_db_util.get_user( common.test_user_2_email )
- self.upload_file( repository, 'readme.txt', commit_message="Uploaded readme.txt" )
+ self.upload_file( repository,
+ filename='readme.txt',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded readme.txt.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
def test_0095_review_new_changeset_readme_component( self ):
'''Update the filtering repository's readme component review to reflect the presence of the readme file.'''
"""
@@ -452,10 +468,15 @@
"""
repository = test_db_util.get_repository_by_name_and_owner( repository_name, common.test_user_1_name )
user = test_db_util.get_user( common.test_user_2_email )
- self.upload_file( repository,
- 'filtering/filtering_test_data.tar',
- commit_message="Uploaded test data.",
- remove_repo_files_not_in_tar='No' )
+ self.upload_file( repository,
+ filename='filtering/filtering_test_data.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded filtering test data tarball.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
def test_0110_review_new_changeset_functional_tests( self ):
'''Update the filtering repository's readme component review to reflect the presence of the readme file.'''
"""
@@ -509,9 +530,14 @@
repository = test_db_util.get_repository_by_name_and_owner( repository_name, common.test_user_1_name )
user = test_db_util.get_user( common.test_user_2_email )
self.upload_file( repository,
- 'filtering/filtering_2.2.0.tar',
- commit_message="Uploaded filtering 2.2.0",
- remove_repo_files_not_in_tar='No' )
+ filename='filtering/filtering_2.2.0.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded filtering 2.2.0 tarball.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
def test_0125_review_new_changeset_functional_tests( self ):
'''Update the filtering repository's review to apply to the new changeset with filtering 2.2.0.'''
"""
diff -r dde899d789b698f29269e521d2abc8a1f1373921 -r 3ffe47cea647ae61a02d74bbff68e0e07bff70e4 test/tool_shed/functional/test_1000_install_basic_repository.py
--- a/test/tool_shed/functional/test_1000_install_basic_repository.py
+++ b/test/tool_shed/functional/test_1000_install_basic_repository.py
@@ -32,10 +32,42 @@
owner=common.test_user_1_name,
category_id=self.security.encode_id( category.id ) )
if self.repository_is_new( repository ):
- self.upload_file( repository, 'filtering/filtering_1.1.0.tar', commit_message="Uploaded filtering 1.1.0" )
- self.upload_file( repository, 'filtering/filtering_0000.txt', commit_message="Uploaded readme for 1.1.0", remove_repo_files_not_in_tar='No' )
- self.upload_file( repository, 'filtering/filtering_2.2.0.tar', commit_message="Uploaded filtering 2.2.0", remove_repo_files_not_in_tar='No' )
- self.upload_file( repository, 'readme.txt', commit_message="Uploaded readme for 2.2.0", remove_repo_files_not_in_tar='No' )
+ self.upload_file( repository,
+ filename='filtering/filtering_1.1.0.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded filtering 1.1.0 tarball.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
+ self.upload_file( repository,
+ filename='filtering/filtering_0000.txt',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded readme for 1.1.0',
+ strings_displayed=[],
+ strings_not_displayed=[] )
+ self.upload_file( repository,
+ filename='filtering/filtering_2.2.0.tar',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=True,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded filtering 2.2.0 tarball.',
+ strings_displayed=[],
+ strings_not_displayed=[] )
+ self.upload_file( repository,
+ filename='readme.txt',
+ filepath=None,
+ valid_tools_only=True,
+ uncompress_file=False,
+ remove_repo_files_not_in_tar=False,
+ commit_message='Uploaded readme for 2.2.0',
+ strings_displayed=[],
+ strings_not_displayed=[] )
def test_0010_browse_tool_sheds( self ):
"""Browse the available tool sheds in this Galaxy instance."""
self.galaxy_logout()
This diff is so big that we needed to truncate the remainder.
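All of the calls above follow the same keyword-only convention. As a rough guide to what they assume, here is a minimal, hypothetical stand-in for the upload_file() helper; the parameter names are taken from the keyword arguments visible in this diff, while the defaults and body are illustrative only (the real helper lives in the tool shed functional test framework):

# Hypothetical stand-in for the upload_file() helper targeted by this refactor.
# Parameter names mirror the keyword arguments used in the diff above; defaults
# and the body are illustrative only.
def upload_file( repository, filename, filepath=None, valid_tools_only=True,
                 uncompress_file=True, remove_repo_files_not_in_tar=False,
                 commit_message='', strings_displayed=None, strings_not_displayed=None ):
    # The real helper drives the tool shed upload form via the test framework;
    # this stand-in only echoes the call so the sketch stays runnable.
    strings_displayed = strings_displayed or []
    strings_not_displayed = strings_not_displayed or []
    print( "upload %s (filepath=%s, uncompress=%s): %s"
           % ( filename, filepath, uncompress_file, commit_message ) )

# Example call in the same style as the updated tests (repository omitted here).
upload_file( None,
             filename='filtering/filtering_1.1.0.tar',
             filepath=None,
             valid_tools_only=True,
             uncompress_file=True,
             remove_repo_files_not_in_tar=False,
             commit_message='Uploaded filtering 1.1.0 tarball.',
             strings_displayed=[],
             strings_not_displayed=[] )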
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: greg: Fix yet another import in the Galaxy update manager.
13 Feb '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/dde899d789b6/
changeset: dde899d789b6
user: greg
date: 2013-02-13 22:35:32
summary: Fix yet another import in the Galaxy update manager.
affected #: 1 file
diff -r 29c1b1b78ea55bbcff28cd33338e2a765efb2e08 -r dde899d789b698f29269e521d2abc8a1f1373921 lib/galaxy/tool_shed/update_manager.py
--- a/lib/galaxy/tool_shed/update_manager.py
+++ b/lib/galaxy/tool_shed/update_manager.py
@@ -4,6 +4,7 @@
import threading, urllib2, logging
from galaxy.util import string_as_bool
import galaxy.util.shed_util as shed_util
+import galaxy.util.shed_util_common as suc
from galaxy.model.orm import and_
log = logging.getLogger( __name__ )
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
13 Feb '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/29c1b1b78ea5/
changeset: 29c1b1b78ea5
user: greg
date: 2013-02-13 20:39:44
summary: Enhance the process for setting metadata on tool shed repositories to better accommodate recently introduced support for the repository dependency definition and tool dependency definition Galaxy utilities.
affected #: 1 file
diff -r b6241599d1d0bbb4b9ba670bde9fbb421ef6c396 -r 29c1b1b78ea55bbcff28cd33338e2a765efb2e08 lib/galaxy/util/shed_util_common.py
--- a/lib/galaxy/util/shed_util_common.py
+++ b/lib/galaxy/util/shed_util_common.py
@@ -631,38 +631,51 @@
ancestor_tools = ancestor_metadata_dict.get( 'tools', [] )
ancestor_guids = [ tool_dict[ 'guid' ] for tool_dict in ancestor_tools ]
ancestor_guids.sort()
+ ancestor_readme_files = ancestor_metadata_dict.get( 'readme_files', [] )
ancestor_repository_dependencies_dict = ancestor_metadata_dict.get( 'repository_dependencies', {} )
ancestor_repository_dependencies = ancestor_repository_dependencies_dict.get( 'repository_dependencies', [] )
- ancestor_tool_dependencies = ancestor_metadata_dict.get( 'tool_dependencies', [] )
+ ancestor_tool_dependencies = ancestor_metadata_dict.get( 'tool_dependencies', {} )
ancestor_workflows = ancestor_metadata_dict.get( 'workflows', [] )
current_datatypes = current_metadata_dict.get( 'datatypes', [] )
current_tools = current_metadata_dict.get( 'tools', [] )
current_guids = [ tool_dict[ 'guid' ] for tool_dict in current_tools ]
current_guids.sort()
+ current_readme_files = current_metadata_dict.get( 'readme_files', [] )
current_repository_dependencies_dict = current_metadata_dict.get( 'repository_dependencies', {} )
current_repository_dependencies = current_repository_dependencies_dict.get( 'repository_dependencies', [] )
- current_tool_dependencies = current_metadata_dict.get( 'tool_dependencies', [] )
+ current_tool_dependencies = current_metadata_dict.get( 'tool_dependencies', {} )
current_workflows = current_metadata_dict.get( 'workflows', [] )
# Handle case where no metadata exists for either changeset.
no_datatypes = not ancestor_datatypes and not current_datatypes
+ no_readme_files = not ancestor_readme_files and not current_readme_files
no_repository_dependencies = not ancestor_repository_dependencies and not current_repository_dependencies
- # Note: we currently don't need to check tool_dependencies since we're checking for guids - tool_dependencies always require tools (currently).
+ # Tool dependencies can define orphan dependencies in the tool shed.
no_tool_dependencies = not ancestor_tool_dependencies and not current_tool_dependencies
no_tools = not ancestor_guids and not current_guids
no_workflows = not ancestor_workflows and not current_workflows
- if no_datatypes and no_repository_dependencies and no_tool_dependencies and no_tools and no_workflows:
+ if no_datatypes and no_readme_files and no_repository_dependencies and no_tool_dependencies and no_tools and no_workflows:
return 'no metadata'
+ # Uncomment the following if we decide that README files should affect how installable repository revisions are defined. See the NOTE in the
+ # compare_readme_files() method.
+ # readme_file_comparison = compare_readme_files( ancestor_readme_files, current_readme_files )
repository_dependency_comparison = compare_repository_dependencies( ancestor_repository_dependencies, current_repository_dependencies )
+ tool_dependency_comparison = compare_tool_dependencies( ancestor_tool_dependencies, current_tool_dependencies )
workflow_comparison = compare_workflows( ancestor_workflows, current_workflows )
datatype_comparison = compare_datatypes( ancestor_datatypes, current_datatypes )
# Handle case where all metadata is the same.
- if ancestor_guids == current_guids and repository_dependency_comparison == 'equal' and workflow_comparison == 'equal' and datatype_comparison == 'equal':
+ if ancestor_guids == current_guids and \
+ repository_dependency_comparison == 'equal' and \
+ tool_dependency_comparison == 'equal' and \
+ workflow_comparison == 'equal' and \
+ datatype_comparison == 'equal':
return 'equal'
# Handle case where ancestor metadata is a subset of current metadata.
+ # readme_file_is_subset = readme_file_comparison in [ 'equal', 'subset' ]
repository_dependency_is_subset = repository_dependency_comparison in [ 'equal', 'subset' ]
+ tool_dependency_is_subset = tool_dependency_comparison in [ 'equal', 'subset' ]
workflow_dependency_is_subset = workflow_comparison in [ 'equal', 'subset' ]
datatype_is_subset = datatype_comparison in [ 'equal', 'subset' ]
- if repository_dependency_is_subset and workflow_dependency_is_subset and datatype_is_subset:
+ if repository_dependency_is_subset and tool_dependency_is_subset and workflow_dependency_is_subset and datatype_is_subset:
is_subset = True
for guid in ancestor_guids:
if guid not in current_guids:
@@ -694,6 +707,25 @@
else:
return 'subset'
return 'not equal and not subset'
+def compare_readme_files( ancestor_readme_files, current_readme_files ):
+ """Determine if ancestor_readme_files is equal to or a subset of current_readme_files."""
+ # NOTE: Although repository README files are considered a Galaxy utility similar to tools, repository dependency definition files, etc.,
+ # we don't define installable repository revisions based on changes to README files. To understand why, consider the following scenario:
+ # 1. Upload the filtering tool to a new repository - this will result in installable revision 0.
+ # 2. Upload a README file to the repository - this will move the installable revision from revision 0 to revision 1.
+ # 3. Delete the README file from the repository - this will move the installable revision from revision 1 to revision 2.
+ # The above scenario is the current behavior, and that is why this method is not currently called. This method exists only in case we decide
+ # to change this current behavior.
+ # The list of readme files looks something like: ["database/community_files/000/repo_2/readme.txt"]
+ if len( ancestor_readme_files ) <= len( current_readme_files ):
+ for ancestor_readme_file in ancestor_readme_files:
+ if ancestor_readme_file not in current_readme_files:
+ return 'not equal and not subset'
+ if len( ancestor_readme_files ) == len( current_readme_files ):
+ return 'equal'
+ else:
+ return 'subset'
+ return 'not equal and not subset'
def compare_repository_dependencies( ancestor_repository_dependencies, current_repository_dependencies ):
"""Determine if ancestor_repository_dependencies is the same as or a subset of current_repository_dependencies."""
# The list of repository_dependencies looks something like: [["http://localhost:9009", "emboss_datatypes", "test", "ab03a2a5f407"]].
@@ -717,6 +749,25 @@
else:
return 'subset'
return 'not equal and not subset'
+def compare_tool_dependencies( ancestor_tool_dependencies, current_tool_dependencies ):
+ """Determine if ancestor_tool_dependencies is the same as or a subset of current_tool_dependencies."""
+ # The tool_dependencies dictionary looks something like:
+ # {'bwa/0.5.9': {'readme': 'some string', 'version': '0.5.9', 'type': 'package', 'name': 'bwa'}}
+ if len( ancestor_tool_dependencies ) <= len( current_tool_dependencies ):
+ for ancestor_td_key, ancestor_requirements_dict in ancestor_tool_dependencies.items():
+ if ancestor_td_key in current_tool_dependencies:
+ # The only values that could have changed between the 2 dictionaries are the "readme" or "type" values. Changing the readme value
+ # makes no difference. Changing the type will change the installation process, but for now we'll assume it was a typo, so new metadata
+ # shouldn't be generated.
+ continue
+ else:
+ return 'not equal and not subset'
+ # At this point we know that ancestor_tool_dependencies is at least a subset of current_tool_dependencies.
+ if len( ancestor_tool_dependencies ) == len( current_tool_dependencies ):
+ return 'equal'
+ else:
+ return 'subset'
+ return 'not equal and not subset'
def compare_workflows( ancestor_workflows, current_workflows ):
"""Determine if ancestor_workflows is the same as current_workflows or if ancestor_workflows is a subset of current_workflows."""
if len( ancestor_workflows ) <= len( current_workflows ):
@@ -730,9 +781,7 @@
found_in_current = False
for current_workflow_tup in current_workflows:
current_workflow_dict = current_workflow_tup[1]
- # Assume that if the name and number of steps are equal,
- # then the workflows are the same. Of course, this may
- # not be true...
+ # Assume that if the name and number of steps are equal, then the workflows are the same. Of course, this may not be true...
if current_workflow_dict[ 'name' ] == ancestor_workflow_name and len( current_workflow_dict[ 'steps' ] ) == num_ancestor_workflow_steps:
found_in_current = True
break
@@ -1070,7 +1119,11 @@
else:
original_repository_metadata = None
readme_file_names = get_readme_file_names( repository.name )
- metadata_dict = { 'shed_config_filename' : shed_config_dict.get( 'config_filename' ) }
+ if app.name == 'galaxy':
+ # Shed related tool panel configs are only relevant to Galaxy.
+ metadata_dict = { 'shed_config_filename' : shed_config_dict.get( 'config_filename' ) }
+ else:
+ metadata_dict = {}
readme_files = []
invalid_file_tups = []
invalid_tool_configs = []
@@ -2862,42 +2915,158 @@
containers_dict[ 'tool_dependencies' ] = root_container
containers_dict[ 'missing_tool_dependencies' ] = None
return containers_dict
-def new_repository_dependency_metadata_required( trans, repository, metadata_dict ):
+def new_datatypes_metadata_required( trans, repository_metadata, metadata_dict ):
"""
- Compare the last saved metadata for each repository dependency in the repository with the new metadata in metadata_dict to determine if a new
- repository_metadata table record is required or if the last saved metadata record can be updated instead.
+ Compare the last saved metadata for each datatype in the repository with the new metadata in metadata_dict to determine if a new
+ repository_metadata table record is required or if the last saved metadata record can be updated for datatypes instead.
"""
- if 'repository_dependencies' in metadata_dict:
- repository_metadata = get_latest_repository_metadata( trans, repository.id )
+ # Datatypes are stored in metadata as a list of dictionaries that looks like:
+ # [{'dtype': 'galaxy.datatypes.data:Text', 'subclass': 'True', 'extension': 'acedb'}]
+ if 'datatypes' in metadata_dict:
+ current_datatypes = metadata_dict[ 'datatypes' ]
if repository_metadata:
metadata = repository_metadata.metadata
if metadata:
- if 'repository_dependencies' in metadata:
- saved_repository_dependencies = metadata[ 'repository_dependencies' ][ 'repository_dependencies' ]
- new_repository_dependencies = metadata_dict[ 'repository_dependencies' ][ 'repository_dependencies' ]
+ if 'datatypes' in metadata:
+ ancestor_datatypes = metadata[ 'datatypes' ]
# The saved metadata must be a subset of the new metadata.
- for new_repository_dependency_metadata in new_repository_dependencies:
- if new_repository_dependency_metadata not in saved_repository_dependencies:
- return True
- for saved_repository_dependency_metadata in saved_repository_dependencies:
- if saved_repository_dependency_metadata not in new_repository_dependencies:
- return True
+ datatype_comparison = compare_datatypes( ancestor_datatypes, current_datatypes )
+ if datatype_comparison == 'not equal and not subset':
+ return True
+ else:
+ return False
+ else:
+ # The new metadata includes datatypes, but the stored metadata does not, so we can update the stored metadata.
+ return False
else:
- # We have repository metadata that does not include metadata for any repository dependencies in the
- # repository, so we can update the existing repository metadata.
+ # There is no stored metadata, so we can update the metadata column in the repository_metadata table.
return False
else:
- # There is no saved repository metadata, so we need to create a new repository_metadata table record.
+ # There is no stored repository metadata, so we need to create a new repository_metadata table record.
return True
- # The received metadata_dict includes no metadata for repository dependencies, so a new repository_metadata table record is not needed.
+ # The received metadata_dict includes no metadata for datatypes, so a new repository_metadata table record is not needed.
return False
-def new_tool_metadata_required( trans, repository, metadata_dict ):
+def new_metadata_required_for_utilities( trans, repository, new_tip_metadata_dict ):
+ """
+ Galaxy utilities currently consist of datatypes, repository_dependency definitions, tools, tool_dependency definitions and exported
+ Galaxy workflows. This method compares the last stored repository_metadata record associated with the received repository against the
+ contents of the received new_tip_metadata_dict and returns True or False for the union set of Galaxy utilities contained in both metadata
+ dictionaries. The metadata contained in new_tip_metadata_dict may not be a subset of that contained in the last stored repository_metadata
+ record associated with the received repository because one or more Galaxy utilities may have been deleted from the repository in the new tip.
+ """
+ repository_metadata = get_latest_repository_metadata( trans, repository.id )
+ datatypes_required = new_datatypes_metadata_required( trans, repository_metadata, new_tip_metadata_dict )
+ # Uncomment the following if we decide that README files should affect how installable repository revisions are defined. See the NOTE in the
+ # compare_readme_files() method.
+ # readme_files_required = new_readme_files_metadata_required( trans, repository_metadata, new_tip_metadata_dict )
+ repository_dependencies_required = new_repository_dependency_metadata_required( trans, repository_metadata, new_tip_metadata_dict )
+ tools_required = new_tool_metadata_required( trans, repository_metadata, new_tip_metadata_dict )
+ tool_dependencies_required = new_tool_dependency_metadata_required( trans, repository_metadata, new_tip_metadata_dict )
+ workflows_required = new_workflow_metadata_required( trans, repository_metadata, new_tip_metadata_dict )
+ if datatypes_required or repository_dependencies_required or tools_required or tool_dependencies_required or workflows_required:
+ return True
+ return False
+def new_readme_files_metadata_required( trans, repository_metadata, metadata_dict ):
+ """
+ Compare the last saved metadata for each readme file in the repository with the new metadata in metadata_dict to determine if a new
+ repository_metadata table record is required or if the last saved metadata record can be updated for readme files instead.
+ """
+ # Repository README files are kind of a special case because they have no effect on reproducibility. We'll simply inspect the file names to
+ # determine if any that exist in the saved metadata are eliminated from the new metadata in the received metadata_dict.
+ if 'readme_files' in metadata_dict:
+ current_readme_files = metadata_dict[ 'readme_files' ]
+ if repository_metadata:
+ metadata = repository_metadata.metadata
+ if metadata:
+ if 'readme_files' in metadata:
+ ancestor_readme_files = metadata[ 'readme_files' ]
+ # The saved metadata must be a subset of the new metadata.
+ readme_file_comparison = compare_readme_files( ancestor_readme_files, current_readme_files )
+ if readme_file_comparison == 'not equal and not subset':
+ return True
+ else:
+ return False
+ else:
+ # The new metadata includes readme_files, but the stored metadata does not, so we can update the stored metadata.
+ return False
+ else:
+ # There is no stored metadata, so we can update the metadata column in the repository_metadata table.
+ return False
+ else:
+ # There is no stored repository metadata, so we need to create a new repository_metadata table record.
+ return True
+ # The received metadata_dict includes no metadata for readme_files, so a new repository_metadata table record is not needed.
+ return False
+def new_repository_dependency_metadata_required( trans, repository_metadata, metadata_dict ):
+ """
+ Compare the last saved metadata for each repository dependency in the repository with the new metadata in metadata_dict to determine if a new
+ repository_metadata table record is required or if the last saved metadata record can be updated for repository_dependencies instead.
+ """
+ if repository_metadata:
+ metadata = repository_metadata.metadata
+ if 'repository_dependencies' in metadata:
+ saved_repository_dependencies = metadata[ 'repository_dependencies' ][ 'repository_dependencies' ]
+ new_repository_dependencies_metadata = metadata_dict.get( 'repository_dependencies', None )
+ if new_repository_dependencies_metadata:
+ new_repository_dependencies = metadata_dict[ 'repository_dependencies' ][ 'repository_dependencies' ]
+ # The saved metadata must be a subset of the new metadata.
+ for new_repository_dependency_metadata in new_repository_dependencies:
+ if new_repository_dependency_metadata not in saved_repository_dependencies:
+ return True
+ for saved_repository_dependency_metadata in saved_repository_dependencies:
+ if saved_repository_dependency_metadata not in new_repository_dependencies:
+ return True
+ else:
+ # The repository_dependencies.xml file must have been deleted, so create a new repository_metadata record so we always have
+ # access to the deleted file.
+ return True
+ else:
+ if 'repository_dependencies' in metadata_dict:
+ # There is no saved repository metadata, so we need to create a new repository_metadata record.
+ return True
+ else:
+ # The received metadata_dict includes no metadata for repository dependencies, so a new repository_metadata record is not needed.
+ return False
+def new_tool_dependency_metadata_required( trans, repository_metadata, metadata_dict ):
+ """
+ Compare the last saved metadata for each tool dependency in the repository with the new metadata in metadata_dict to determine if a new
+ repository_metadata table record is required or if the last saved metadata record can be updated for tool_dependencies instead.
+ """
+ if repository_metadata:
+ metadata = repository_metadata.metadata
+ if metadata:
+ if 'tool_dependencies' in metadata:
+ saved_tool_dependencies = metadata[ 'tool_dependencies' ]
+ new_tool_dependencies = metadata_dict.get( 'tool_dependencies', None )
+ if new_tool_dependencies:
+ # The saved metadata must be a subset of the new metadata.
+ for new_repository_dependency_metadata in new_tool_dependencies:
+ if new_repository_dependency_metadata not in saved_tool_dependencies:
+ return True
+ for saved_repository_dependency_metadata in saved_tool_dependencies:
+ if saved_repository_dependency_metadata not in new_tool_dependencies:
+ return True
+ else:
+ # The tool_dependencies.xml file must have been deleted, so create a new repository_metadata record so we always have
+ # access to the deleted file.
+ return True
+ else:
+ # We have repository metadata that does not include metadata for any tool dependencies in the repository, so we can update
+ # the existing repository metadata.
+ return False
+ else:
+ if 'tool_dependencies' in metadata_dict:
+ # There is no saved repository metadata, so we need to create a new repository_metadata record.
+ return True
+ else:
+ # The received metadata_dict includes no metadata for tool dependencies, so a new repository_metadata record is not needed.
+ return False
+def new_tool_metadata_required( trans, repository_metadata, metadata_dict ):
"""
Compare the last saved metadata for each tool in the repository with the new metadata in metadata_dict to determine if a new repository_metadata
table record is required, or if the last saved metadata record can be updated instead.
"""
if 'tools' in metadata_dict:
- repository_metadata = get_latest_repository_metadata( trans, repository.id )
if repository_metadata:
metadata = repository_metadata.metadata
if metadata:
@@ -2921,22 +3090,23 @@
for new_tool_metadata_dict in metadata_dict[ 'tools' ]:
if new_tool_metadata_dict[ 'id' ] not in saved_tool_ids:
return True
+ else:
+ # The new metadata includes tools, but the stored metadata does not, so we can update the stored metadata.
+ return False
else:
- # We have repository metadata that does not include metadata for any tools in the
- # repository, so we can update the existing repository metadata.
+ # There is no stored metadata, so we can update the metadata column in the repository_metadata table.
return False
else:
- # There is no saved repository metadata, so we need to create a new repository_metadata table record.
+ # There is no stored repository metadata, so we need to create a new repository_metadata table record.
return True
# The received metadata_dict includes no metadata for tools, so a new repository_metadata table record is not needed.
return False
-def new_workflow_metadata_required( trans, repository, metadata_dict ):
+def new_workflow_metadata_required( trans, repository_metadata, metadata_dict ):
"""
Currently everything about an exported workflow except the name is hard-coded, so there's no real way to differentiate versions of
exported workflows. If this changes at some future time, this method should be enhanced accordingly.
"""
if 'workflows' in metadata_dict:
- repository_metadata = get_latest_repository_metadata( trans, repository.id )
if repository_metadata:
# The repository has metadata, so update the workflows value - no new record is needed.
return False
@@ -3206,9 +3376,9 @@
persist=False )
# We'll only display error messages for the repository tip (it may be better to display error messages for each installable changeset revision).
if current_changeset_revision == repository.tip( trans.app ):
- invalid_file_tups.extend( invalid_tups )
+ invalid_file_tups.extend( invalid_tups )
if current_metadata_dict:
- if not metadata_changeset_revision and not metadata_dict:
+ if metadata_changeset_revision is None and metadata_dict is None:
# We're at the first change set in the change log.
metadata_changeset_revision = current_changeset_revision
metadata_dict = current_metadata_dict
@@ -3344,17 +3514,15 @@
updating_installed_repository=False,
persist=False )
if metadata_dict:
- downloadable = is_downloadable( metadata_dict )
repository_metadata = None
- if new_repository_dependency_metadata_required( trans, repository, metadata_dict ) or \
- new_tool_metadata_required( trans, repository, metadata_dict ) or \
- new_workflow_metadata_required( trans, repository, metadata_dict ):
+ if new_metadata_required_for_utilities( trans, repository, metadata_dict ):
# Create a new repository_metadata table row.
repository_metadata = create_or_update_repository_metadata( trans, encoded_id, repository, repository.tip( trans.app ), metadata_dict )
# If this is the first record stored for this repository, see if we need to send any email alerts.
if len( repository.downloadable_revisions ) == 1:
handle_email_alerts( trans, repository, content_alert_str='', new_repo_alert=True, admin_only=False )
else:
+ # Update the latest stored repository metadata with the contents and attributes of metadata_dict.
repository_metadata = get_latest_repository_metadata( trans, repository.id )
if repository_metadata:
downloadable = is_downloadable( metadata_dict )
@@ -3365,7 +3533,7 @@
trans.sa_session.add( repository_metadata )
trans.sa_session.flush()
else:
- # There are no tools in the repository, and we're setting metadata on the repository tip.
+ # There are no metadata records associated with the repository.
repository_metadata = create_or_update_repository_metadata( trans, encoded_id, repository, repository.tip( trans.app ), metadata_dict )
if 'tools' in metadata_dict and repository_metadata and status != 'error':
# Set tool versions on the new downloadable change set. The order of the list of changesets is critical, so we use the repo's changelog.
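The new compare_tool_dependencies() helper is what lets tool dependency changes participate in the 'equal'/'subset' decision above. Below is a minimal standalone sketch of that logic, exercised with sample dictionaries; the bwa/0.5.9 entry mirrors the example in the commit's own comment, while the samtools/0.1.18 entry is invented purely for the demonstration:

def compare_tool_dependencies( ancestor_tool_dependencies, current_tool_dependencies ):
    # Standalone copy of the comparison added in this commit: only the set of
    # dependency keys matters; changes to the 'readme' or 'type' values are ignored.
    if len( ancestor_tool_dependencies ) <= len( current_tool_dependencies ):
        for ancestor_td_key in ancestor_tool_dependencies:
            if ancestor_td_key not in current_tool_dependencies:
                return 'not equal and not subset'
        if len( ancestor_tool_dependencies ) == len( current_tool_dependencies ):
            return 'equal'
        return 'subset'
    return 'not equal and not subset'

ancestor = { 'bwa/0.5.9': { 'name': 'bwa', 'version': '0.5.9', 'type': 'package', 'readme': 'some string' } }
# The samtools entry below is invented for the demonstration only.
current = dict( ancestor )
current[ 'samtools/0.1.18' ] = { 'name': 'samtools', 'version': '0.1.18', 'type': 'package' }

print( compare_tool_dependencies( ancestor, ancestor ) )  # -> equal
print( compare_tool_dependencies( ancestor, current ) )   # -> subset
print( compare_tool_dependencies( current, ancestor ) )   # -> not equal and not subset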
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/499ce7fb3821/
changeset: 499ce7fb3821
branch: bugfixes
user: dannon
date: 2013-02-13 19:28:47
summary: Close branch bugfixes from PR#120
affected #: 0 files
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/b6241599d1d0/
changeset: b6241599d1d0
user: jgoecks
date: 2013-02-13 19:19:55
summary: Python 2.5 compatibility fix.
affected #: 1 file
diff -r a6710d64e2d8081e07050696e377c5f864dd1f9f -r b6241599d1d0bbb4b9ba670bde9fbb421ef6c396 lib/galaxy/tools/data/__init__.py
--- a/lib/galaxy/tools/data/__init__.py
+++ b/lib/galaxy/tools/data/__init__.py
@@ -244,7 +244,7 @@
separator_char = (lambda c: '<TAB>' if c == '\t' else c)(self.separator)
rval = []
- for i, line in enumerate( reader, start=1 ):
+ for i, line in enumerate( reader ):
if line.lstrip().startswith( self.comment_char ):
continue
line = line.rstrip( "\n\r" )
@@ -255,7 +255,7 @@
else:
log.warn( "Line %i in tool data table '%s' is invalid (HINT: "
"'%s' characters must be used to separate fields):\n%s"
- % ( i, self.name, separator_char, line ) )
+ % ( ( i + 1 ), self.name, separator_char, line ) )
return rval
def get_entry( self, query_attr, query_val, return_attr ):
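For context, enumerate() only gained its start parameter in Python 2.6, so enumerate( reader, start=1 ) raises a TypeError on Python 2.5; the fix counts from zero and adds one when reporting the line number. A small runnable sketch of the same pattern, using made-up sample lines:

# enumerate() gained its start parameter in Python 2.6; on Python 2.5 only the
# one-argument form is available, so the human-readable line number is i + 1.
lines = [ 'chr1\t0\t100', 'this line has no tab' ]
for i, line in enumerate( lines ):            # works on Python 2.5 and later
    if '\t' not in line:
        print( "Line %i is invalid" % ( i + 1 ) )
# On Python 2.6+ the report could count from one directly:
#     for i, line in enumerate( lines, start=1 ): ...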
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
2 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/c8b7dfb1e76d/
changeset: c8b7dfb1e76d
branch: bugfixes
user: Bjoern Gruening
date: 2013-02-09 16:48:20
summary: Display the toolshed tools in the workflow search.
affected #: 1 file
diff -r 506484344db3a370f8ae24096041d38557d1967e -r c8b7dfb1e76d45532a339592f72c57918fbe6ba6 templates/webapps/galaxy/workflow/editor.mako
--- a/templates/webapps/galaxy/workflow/editor.mako
+++ b/templates/webapps/galaxy/workflow/editor.mako
@@ -93,12 +93,12 @@
$(".toolSectionWrapper").find(".toolTitle").hide();
if ( data.length != 0 ) {
// Map tool ids to element ids and join them.
- var s = $.map( data, function( n, i ) { return "#link-" + n; } ).join( ", " );
+ var s = $.map( data, function( n, i ) { return "link-" + n; } );
// First pass to show matching tools and their parents.
- $(s).each( function() {
+ $(s).each( function(index,id) {
// Add class to denote match.
- $(this).parent().addClass("search_match");
- $(this).parent().show().parent().parent().show().parent().show();
+ $("[id='"+id+"']").parent().addClass("search_match");
+ $("[id='"+id+"']").parent().show().parent().parent().show().parent().show();
});
// Hide labels that have no visible children.
$(".toolPanelLabel").each( function() {
https://bitbucket.org/galaxy/galaxy-central/commits/a6710d64e2d8/
changeset: a6710d64e2d8
user: dannon
date: 2013-02-13 19:15:00
summary: Merged in BjoernGruening/galaxy-central-bgruening/bugfixes (pull request #120)
Display the toolshed tools in the workflow search.
affected #: 1 file
diff -r 0ea9f5196fde94701cf2ec8404e9e33e53c51d0e -r a6710d64e2d8081e07050696e377c5f864dd1f9f templates/webapps/galaxy/workflow/editor.mako
--- a/templates/webapps/galaxy/workflow/editor.mako
+++ b/templates/webapps/galaxy/workflow/editor.mako
@@ -93,12 +93,12 @@
$(".toolSectionWrapper").find(".toolTitle").hide();
if ( data.length != 0 ) {
// Map tool ids to element ids and join them.
- var s = $.map( data, function( n, i ) { return "#link-" + n; } ).join( ", " );
+ var s = $.map( data, function( n, i ) { return "link-" + n; } );
// First pass to show matching tools and their parents.
- $(s).each( function() {
+ $(s).each( function(index,id) {
// Add class to denote match.
- $(this).parent().addClass("search_match");
- $(this).parent().show().parent().parent().show().parent().show();
+ $("[id='"+id+"']").parent().addClass("search_match");
+ $("[id='"+id+"']").parent().show().parent().parent().show().parent().show();
});
// Hide labels that have no visible children.
$(".toolPanelLabel").each( function() {
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.