galaxy-commits
Threads by month
- ----- 2024 -----
- December
- November
- October
- September
- August
- July
- June
- May
- April
- March
- February
- January
- ----- 2023 -----
- December
- November
- October
- September
- August
- July
- June
- May
- April
- March
- February
- January
- ----- 2022 -----
- December
- November
- October
- September
- August
- July
- June
- May
- April
- March
- February
- January
- ----- 2021 -----
- December
- November
- October
- September
- August
- July
- June
- May
- April
- March
- February
- January
- ----- 2020 -----
- December
- November
- October
- September
- August
- July
- June
- May
- April
- March
- February
- January
- ----- 2019 -----
- December
- November
- October
- September
- August
- July
- June
- May
- April
- March
- February
- January
- ----- 2018 -----
- December
- November
- October
- September
- August
- July
- June
- May
- April
- March
- February
- January
- ----- 2017 -----
- December
- November
- October
- September
- August
- July
- June
- May
- April
- March
- February
- January
- ----- 2016 -----
- December
- November
- October
- September
- August
- July
- June
- May
- April
- March
- February
- January
- ----- 2015 -----
- December
- November
- October
- September
- August
- July
- June
- May
- April
- March
- February
- January
- ----- 2014 -----
- December
- November
- October
- September
- August
- July
- June
- May
- April
- March
- February
- January
- ----- 2013 -----
- December
- November
- October
- September
- August
- July
- June
- May
- April
- March
- February
- January
- ----- 2012 -----
- December
- November
- October
- September
- August
- July
- June
- May
- April
- March
- February
- January
- ----- 2011 -----
- December
- November
- October
- September
- August
- July
- June
- May
- April
- March
- February
- January
- ----- 2010 -----
- December
- November
- October
- September
- August
- July
- June
- May
January 2012
- 1 participants
- 95 discussions
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/8a440d85e13b/
changeset: 8a440d85e13b
user: dan
date: 2012-01-26 17:09:59
summary: Update samtools flag stat test data.
affected #: 1 file
diff -r 8d486ca45a224bf20cca622b0c0f06f72c98ef3c -r 8a440d85e13b83697220a2da4bb6f84d58027b60 test-data/samtools_flagstat_out1.txt
--- a/test-data/samtools_flagstat_out1.txt
+++ b/test-data/samtools_flagstat_out1.txt
@@ -1,12 +1,11 @@
-10 in total
-0 QC failure
-0 duplicates
-10 mapped (100.00%)
-0 paired in sequencing
-0 read1
-0 read2
-0 properly paired (nan%)
-0 with itself and mate mapped
-0 singletons (nan%)
-0 with mate mapped to a different chr
-0 with mate mapped to a different chr (mapQ>=5)
+10 + 0 in total (QC-passed reads + QC-failed reads)
+0 + 0 duplicates
+10 + 0 mapped (100.00%:nan%)
+0 + 0 paired in sequencing
+0 + 0 read1
+0 + 0 read2
+0 + 0 properly paired (nan%:nan%)
+0 + 0 with itself and mate mapped
+0 + 0 singletons (nan%:nan%)
+0 + 0 with mate mapped to a different chr
+0 + 0 with mate mapped to a different chr (mapQ>=5)
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
1
0
commit/galaxy-central: jgoecks: Trackster: fix bugs to (a) clear reference track when changing chromosomes and (b) to only show differences if reference data is available.
by Bitbucket 26 Jan '12
by Bitbucket 26 Jan '12
26 Jan '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/8d486ca45a22/
changeset: 8d486ca45a22
user: jgoecks
date: 2012-01-26 15:02:33
summary: Trackster: fix bugs to (a) clear reference track when changing chromosomes and (b) to only show differences if reference data is available.
affected #: 1 file
diff -r a741ad5ebd910818153c0a4fdb4331bd76da834f -r 8d486ca45a224bf20cca622b0c0f06f72c98ef3c static/scripts/trackster.js
--- a/static/scripts/trackster.js
+++ b/static/scripts/trackster.js
@@ -1549,6 +1549,7 @@
drawable.init();
}
}
+ view.reference_track.init();
}
if (low !== undefined && high !== undefined) {
view.low = Math.max(low, 0);
@@ -3876,8 +3877,9 @@
extend(ReferenceTrack.prototype, Drawable.prototype, TiledTrack.prototype, {
build_header_div: function() {},
init: function() {
+ this.data_manager.clear();
// Enable by default because there should always be data when drawing track.
- this.enabled = true;
+ this.enabled = true;
},
can_draw: Drawable.prototype.can_draw,
/**
@@ -5444,9 +5446,15 @@
// TODO: this can be made much more efficient by computing the complete sequence
// to draw and then drawing it.
for (var c = 0, str_len = seq.length; c < str_len; c++) {
- if (this.prefs.show_differences && this.ref_seq) {
- var ref_char = this.ref_seq[seq_start - tile_low + c];
- if (!ref_char || ref_char.toLowerCase() === seq[c].toLowerCase()) {
+ if (this.prefs.show_differences) {
+ if (this.ref_seq) {
+ var ref_char = this.ref_seq[seq_start - tile_low + c];
+ if (!ref_char || ref_char.toLowerCase() === seq[c].toLowerCase()) {
+ continue;
+ }
+ }
+ else {
+ // No reference so cannot show differences.
continue;
}
}
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
1
0
commit/galaxy-central: dan: Add Perform Logistic Regression with vif, Compute partial R square, and Assign weighted-average to tool_conf.xml.main.
by Bitbucket 25 Jan '12
by Bitbucket 25 Jan '12
25 Jan '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/a741ad5ebd91/
changeset: a741ad5ebd91
user: dan
date: 2012-01-25 18:56:09
summary: Add Perform Logistic Regression with vif, Compute partial R square, and Assign weighted-average to tool_conf.xml.main.
affected #: 1 file
diff -r 26920e20157fccbccb7219b48db3e8e57a8e310c -r a741ad5ebd910818153c0a4fdb4331bd76da834f tool_conf.xml.main
--- a/tool_conf.xml.main
+++ b/tool_conf.xml.main
@@ -148,6 +148,7 @@
<section name="Regional Variation" id="regVar"><tool file="regVariation/windowSplitter.xml" /><tool file="regVariation/featureCounter.xml" />
+ <tool file="regVariation/WeightedAverage.xml" /><tool file="regVariation/quality_filter.xml" /><tool file="regVariation/maf_cpg_filter.xml" /><tool file="regVariation/getIndels_2way.xml" />
@@ -160,8 +161,10 @@
</section><section name="Multiple regression" id="multReg"><tool file="regVariation/linear_regression.xml" />
+ <tool file="regVariation/logistic_regression_vif.xml" /><tool file="regVariation/best_regression_subsets.xml" /><tool file="regVariation/rcve.xml" />
+ <tool file="regVariation/partialR_square.xml" /></section><section name="Multivariate Analysis" id="multVar"><tool file="multivariate_stats/pca.xml" />
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
1
0
commit/galaxy-central: jgoecks: Improve language in 'Import workflow' dialog.
by Bitbucket 25 Jan '12
by Bitbucket 25 Jan '12
25 Jan '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/26920e20157f/
changeset: 26920e20157f
user: jgoecks
date: 2012-01-25 18:20:50
summary: Improve language in 'Import workflow' dialog.
affected #: 1 file
diff -r c60760713d26558d6733367709433d8eefbd64ca -r 26920e20157fccbccb7219b48db3e8e57a8e310c templates/workflow/import.mako
--- a/templates/workflow/import.mako
+++ b/templates/workflow/import.mako
@@ -26,24 +26,24 @@
${render_msg( message, status )}
%endif
<div class="toolForm">
- <div class="toolFormTitle">Import an exported Galaxy workflow file</div>
+ <div class="toolFormTitle">Import Galaxy workflow</div><div class="toolFormBody"><form name="import_workflow" id="import_workflow" action="${h.url_for( controller='workflow', action='import_workflow' )}" enctype="multipart/form-data" method="POST"><div class="form-row">
- <label>URL for exported Galaxy workflow:</label>
+ <label>Galaxy workflow URL:</label><input type="text" name="url" value="${url}" size="40"><div class="toolParamHelp" style="clear: both;">
- If the workflow is accessible via an URL, enter the URL above and click the <b>Import</b> button.
+ If the workflow is accessible via a URL, enter the URL above and click <b>Import</b>.
</div><div style="clear: both"></div></div><div class="form-row">
- <label>Exported Galaxy workflow file:</label>
+ <label>Galaxy workflow file:</label><div class="form-row-input"><input type="file" name="file_data"/></div><div class="toolParamHelp" style="clear: both;">
- If the workflow is stored locally in a file, browse and select it and then click the <b>Import</b> button.
+ If the workflow is in a file on your computer, choose it and then click <b>Import</b>.
</div><div style="clear: both"></div></div>
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
1
0
commit/galaxy-central: dannon: Fix for Input Dataset datatype filtering regression introduced in 6329:ab90893a7cf5.
by Bitbucket 25 Jan '12
by Bitbucket 25 Jan '12
25 Jan '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/c60760713d26/
changeset: c60760713d26
user: dannon
date: 2012-01-25 18:12:05
summary: Fix for Input Dataset datatype filtering regression introduced in 6329:ab90893a7cf5.
Input Dataset steps will now again properly filter appropriate input datasets based on subsequent tool input types.
affected #: 2 files
diff -r f9da9ca5ccae7ead780d0331102cda204e38d646 -r c60760713d26558d6733367709433d8eefbd64ca lib/galaxy/tools/parameters/basic.py
--- a/lib/galaxy/tools/parameters/basic.py
+++ b/lib/galaxy/tools/parameters/basic.py
@@ -486,7 +486,7 @@
return value
def get_initial_value( self, trans, context ):
return None
-
+
class HiddenToolParameter( ToolParameter ):
"""
Parameter that takes one of two values.
@@ -816,7 +816,7 @@
for t, v, s in options:
if v in value:
rval.append( t )
- return "\n".join( rval ) + suffix
+ return "\n".join( rval ) + suffix
def get_dependencies( self ):
"""
Get the *names* of the other params this param depends on.
@@ -1326,7 +1326,7 @@
Nate's next pass at the dataset security stuff will dramatically alter this anyway.
"""
- def __init__( self, tool, elem ):
+ def __init__( self, tool, elem, trans=None):
ToolParameter.__init__( self, tool, elem )
# Add metadata validator
if not string_as_bool( elem.get( 'no_validation', False ) ):
@@ -1337,11 +1337,16 @@
for extension in self.extensions:
extension = extension.strip()
if tool is None:
- #This occurs for things such as unit tests
- import galaxy.datatypes.registry
- datatypes_registry = galaxy.datatypes.registry.Registry()
- datatypes_registry.load_datatypes()
- formats.append( datatypes_registry.get_datatype_by_extension( extension.lower() ).__class__ )
+ if trans:
+ # Must account for "Input Dataset" types, which while not a tool still need access to the real registry.
+ # A handle to the transaction (and thus app) will be given by the module.
+ formats.append( trans.app.datatypes_registry.get_datatype_by_extension( extension.lower() ).__class__ )
+ else:
+ #This occurs for things such as unit tests
+ import galaxy.datatypes.registry
+ datatypes_registry = galaxy.datatypes.registry.Registry()
+ datatypes_registry.load_datatypes()
+ formats.append( datatypes_registry.get_datatype_by_extension( extension.lower() ).__class__ )
else:
formats.append( tool.app.datatypes_registry.get_datatype_by_extension( extension.lower() ).__class__ )
self.formats = tuple( formats )
diff -r f9da9ca5ccae7ead780d0331102cda204e38d646 -r c60760713d26558d6733367709433d8eefbd64ca lib/galaxy/workflow/modules.py
--- a/lib/galaxy/workflow/modules.py
+++ b/lib/galaxy/workflow/modules.py
@@ -134,7 +134,7 @@
def get_runtime_inputs( self, filter_set=['data'] ):
label = self.state.get( "name", "Input Dataset" )
- return dict( input=DataToolParameter( None, Element( "param", name="input", label=label, multiple=True, type="data", format=', '.join(filter_set) ) ) )
+ return dict( input=DataToolParameter( None, Element( "param", name="input", label=label, multiple=True, type="data", format=', '.join(filter_set) ), self.trans ) )
def get_runtime_state( self ):
state = DefaultToolState()
state.inputs = dict( input=None )
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
1
0
commit/galaxy-central: dan: Make additional files output by SamToFastq when using read groups be fastqsanger format instead of just fastq.
by Bitbucket 25 Jan '12
by Bitbucket 25 Jan '12
25 Jan '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/f9da9ca5ccae/
changeset: f9da9ca5ccae
user: dan
date: 2012-01-25 18:11:27
summary: Make additional files output by SamToFastq when using read groups be fastqsanger format instead of just fastq.
affected #: 1 file
diff -r fd6f22c072f3b2acbde3311943af004f1094a4fa -r f9da9ca5ccae7ead780d0331102cda204e38d646 tools/picard/picard_SamToFastq_wrapper.py
--- a/tools/picard/picard_SamToFastq_wrapper.py
+++ b/tools/picard/picard_SamToFastq_wrapper.py
@@ -80,13 +80,13 @@
shutil.move( os.path.join( tmp_dir, filename ), fastq_1_name )
fastq_1_name = None
else:
- shutil.move( os.path.join( tmp_dir, filename ), os.path.join( options.new_files_path, 'primary_%s_%s - 1_visible_fastq' % ( file_id_1, filename[:-len( '_1.fastq' )] ) ) )
+ shutil.move( os.path.join( tmp_dir, filename ), os.path.join( options.new_files_path, 'primary_%s_%s - 1_visible_fastqsanger' % ( file_id_1, filename[:-len( '_1.fastq' )] ) ) )
elif filename.endswith( '_2.fastq' ):
if fastq_2_name:
shutil.move( os.path.join( tmp_dir, filename ), fastq_2_name )
fastq_2_name = None
else:
- shutil.move( os.path.join( tmp_dir, filename ), os.path.join( options.new_files_path, 'primary_%s_%s - 2_visible_fastq' % ( file_id_2, filename[:-len( '_2.fastq' )] ) ) )
+ shutil.move( os.path.join( tmp_dir, filename ), os.path.join( options.new_files_path, 'primary_%s_%s - 2_visible_fastqsanger' % ( file_id_2, filename[:-len( '_2.fastq' )] ) ) )
cleanup_before_exit( tmp_dir )
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
1
0
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/fd6f22c072f3/
changeset: fd6f22c072f3
user: greg
date: 2012-01-25 17:17:04
summary: In addition to deactivating / reactivating installed tool shed repositories, add the ability to uninstall / reinstall installed tool shed repositories. Uninstalling and reinstalling will affect the repository on disk as well as the tool config files in which the tool shed repostory's tools are defined. Reinstalling a tool shed repository will currently load it's tools into the same location in the tool panel in which the tools were originally located prior to uninstalling the repository. Installing / uninstalling / deactivating / activating work for tools sheds that were installed either manually be a Galaxy admin or automatically installed by the Galaxy installation manager for those repositories that contain tools tha tused to be in the Galaxy distribution but have since been moved to the main Galaxy tool shed.
affected #: 7 files
diff -r 9d392eaca847bf26a8383babacc3c94e5a1bab9e -r fd6f22c072f3b2acbde3311943af004f1094a4fa lib/galaxy/model/__init__.py
--- a/lib/galaxy/model/__init__.py
+++ b/lib/galaxy/model/__init__.py
@@ -2663,7 +2663,8 @@
class ToolShedRepository( object ):
def __init__( self, id=None, create_time=None, tool_shed=None, name=None, description=None, owner=None, installed_changeset_revision=None,
- changeset_revision=None, metadata=None, includes_datatypes=False, update_available=False, deleted=False ):
+ changeset_revision=None, metadata=None, includes_datatypes=False, update_available=False, deleted=False, uninstalled=False,
+ dist_to_shed=False ):
self.id = id
self.create_time = create_time
self.tool_shed = tool_shed
@@ -2676,6 +2677,14 @@
self.includes_datatypes = includes_datatypes
self.update_available = update_available
self.deleted = deleted
+ self.uninstalled = uninstalled
+ self.dist_to_shed = dist_to_shed
+ @property
+ def includes_tools( self ):
+ return 'tools' in self.metadata
+ @property
+ def includes_workflows( self ):
+ return 'workflows' in self.metadata
class ToolIdGuidMap( object ):
def __init__( self, id=None, create_time=None, tool_id=None, tool_version=None, tool_shed=None, repository_owner=None, repository_name=None, guid=None ):
diff -r 9d392eaca847bf26a8383babacc3c94e5a1bab9e -r fd6f22c072f3b2acbde3311943af004f1094a4fa lib/galaxy/model/mapping.py
--- a/lib/galaxy/model/mapping.py
+++ b/lib/galaxy/model/mapping.py
@@ -378,7 +378,9 @@
Column( "metadata", JSONType, nullable=True ),
Column( "includes_datatypes", Boolean, index=True, default=False ),
Column( "update_available", Boolean, default=False ),
- Column( "deleted", Boolean, index=True, default=False ) )
+ Column( "deleted", Boolean, index=True, default=False ),
+ Column( "uninstalled", Boolean, default=False ),
+ Column( "dist_to_shed", Boolean, default=False ) )
ToolIdGuidMap.table = Table( "tool_id_guid_map", metadata,
Column( "id", Integer, primary_key=True ),
diff -r 9d392eaca847bf26a8383babacc3c94e5a1bab9e -r fd6f22c072f3b2acbde3311943af004f1094a4fa lib/galaxy/model/migrate/versions/0090_add_tool_shed_repository_table_columns.py
--- /dev/null
+++ b/lib/galaxy/model/migrate/versions/0090_add_tool_shed_repository_table_columns.py
@@ -0,0 +1,64 @@
+"""
+Migration script to add the uninstalled and dist_to_shed columns to the tool_shed_repository table.
+"""
+
+from sqlalchemy import *
+from sqlalchemy.orm import *
+from migrate import *
+from migrate.changeset import *
+
+import datetime
+now = datetime.datetime.utcnow
+# Need our custom types, but don't import anything else from model
+from galaxy.model.custom_types import *
+
+import sys, logging
+log = logging.getLogger( __name__ )
+log.setLevel(logging.DEBUG)
+handler = logging.StreamHandler( sys.stdout )
+format = "%(name)s %(levelname)s %(asctime)s %(message)s"
+formatter = logging.Formatter( format )
+handler.setFormatter( formatter )
+log.addHandler( handler )
+
+metadata = MetaData( migrate_engine )
+db_session = scoped_session( sessionmaker( bind=migrate_engine, autoflush=False, autocommit=True ) )
+
+def upgrade():
+ print __doc__
+ metadata.reflect()
+ ToolShedRepository_table = Table( "tool_shed_repository", metadata, autoload=True )
+ c = Column( "uninstalled", Boolean, default=False )
+ try:
+ c.create( ToolShedRepository_table )
+ assert c is ToolShedRepository_table.c.uninstalled
+ if migrate_engine.name == 'mysql' or migrate_engine.name == 'sqlite':
+ default_false = "0"
+ elif migrate_engine.name == 'postgres':
+ default_false = "false"
+ db_session.execute( "UPDATE tool_shed_repository SET uninstalled=%s" % default_false )
+ except Exception, e:
+ print "Adding uninstalled column to the tool_shed_repository table failed: %s" % str( e )
+ c = Column( "dist_to_shed", Boolean, default=False )
+ try:
+ c.create( ToolShedRepository_table )
+ assert c is ToolShedRepository_table.c.dist_to_shed
+ if migrate_engine.name == 'mysql' or migrate_engine.name == 'sqlite':
+ default_false = "0"
+ elif migrate_engine.name == 'postgres':
+ default_false = "false"
+ db_session.execute( "UPDATE tool_shed_repository SET dist_to_shed=%s" % default_false )
+ except Exception, e:
+ print "Adding dist_to_shed column to the tool_shed_repository table failed: %s" % str( e )
+
+def downgrade():
+ metadata.reflect()
+ ToolShedRepository_table = Table( "tool_shed_repository", metadata, autoload=True )
+ try:
+ ToolShedRepository_table.c.uninstalled.drop()
+ except Exception, e:
+ print "Dropping column uninstalled from the tool_shed_repository table failed: %s" % str( e )
+ try:
+ ToolShedRepository_table.c.dist_to_shed.drop()
+ except Exception, e:
+ print "Dropping column dist_to_shed from the tool_shed_repository table failed: %s" % str( e )
diff -r 9d392eaca847bf26a8383babacc3c94e5a1bab9e -r fd6f22c072f3b2acbde3311943af004f1094a4fa lib/galaxy/tool_shed/install_manager.py
--- a/lib/galaxy/tool_shed/install_manager.py
+++ b/lib/galaxy/tool_shed/install_manager.py
@@ -94,9 +94,9 @@
tool_shed=self.tool_shed,
tool_section=tool_section,
shed_tool_conf=self.install_tool_config,
- new_install=True )
- # Add a new record to the tool_id_guid_map table for each
- # tool in the repository if one doesn't already exist.
+ new_install=True,
+ dist_to_shed=True )
+ # Add a new record to the tool_id_guid_map table for each tool in the repository if one doesn't already exist.
if 'tools' in metadata_dict:
tools_mapped = 0
for tool_dict in metadata_dict[ 'tools' ]:
diff -r 9d392eaca847bf26a8383babacc3c94e5a1bab9e -r fd6f22c072f3b2acbde3311943af004f1094a4fa lib/galaxy/util/shed_util.py
--- a/lib/galaxy/util/shed_util.py
+++ b/lib/galaxy/util/shed_util.py
@@ -79,8 +79,11 @@
tool_dicts=tool_dicts,
converter_path=converter_path,
display_path=display_path )
-def create_or_update_tool_shed_repository( app, name, description, changeset_revision, repository_clone_url, metadata_dict, owner='' ):
- # This method is used by the InstallManager, which does not have access to trans.
+def create_or_update_tool_shed_repository( app, name, description, changeset_revision, repository_clone_url, metadata_dict, owner='', dist_to_shed=False ):
+ # This method is used by the InstallManager, which does not have access to trans. The
+ # received value for dist_to_shed will be True if the InstallManager is installing a repository
+ # that contains tools or datatypes that used to be in the Galaxy distribution, but have been
+ # moved to the main Galaxy tool shed.
sa_session = app.model.context.current
tmp_url = clean_repository_clone_url( repository_clone_url )
tool_shed = tmp_url.split( 'repos' )[ 0 ].rstrip( '/' )
@@ -94,6 +97,7 @@
tool_shed_repository.metadata = metadata_dict
tool_shed_repository.includes_datatypes = includes_datatypes
tool_shed_repository.deleted = False
+ tool_shed_repository.uninstalled = False
else:
tool_shed_repository = app.model.ToolShedRepository( tool_shed=tool_shed,
name=name,
@@ -102,7 +106,8 @@
installed_changeset_revision=changeset_revision,
changeset_revision=changeset_revision,
metadata=metadata_dict,
- includes_datatypes=includes_datatypes )
+ includes_datatypes=includes_datatypes,
+ dist_to_shed=dist_to_shed )
sa_session.add( tool_shed_repository )
sa_session.flush()
def generate_datatypes_metadata( datatypes_config, metadata_dict ):
@@ -584,16 +589,20 @@
app.datatypes_registry.load_datatypes( root_dir=app.config.root, config=datatypes_config, imported_modules=imported_modules, deactivate=deactivate )
return converter_path, display_path
def load_repository_contents( app, repository_name, description, owner, changeset_revision, tool_path, repository_clone_url, relative_install_dir,
- current_working_dir, tmp_name, tool_shed=None, tool_section=None, shed_tool_conf=None, new_install=True ):
- # This method is used by the InstallManager, which does not have access to trans.
- # Generate the metadata for the installed tool shed repository. It is imperative that
- # the installed repository is updated to the desired changeset_revision before metadata
- # is set because the process for setting metadata uses the repository files on disk. This
- # method is called when new tools have been installed (in which case values should be received
- # for tool_section and shed_tool_conf, and new_install should be left at it's default value)
- # and when updates have been pulled to previously installed repositories (in which case the
- # default value None is set for tool_section and shed_tool_conf, and the value of new_install
- # is passed as False).
+ current_working_dir, tmp_name, tool_shed=None, tool_section=None, shed_tool_conf=None, new_install=True, dist_to_shed=False ):
+ """
+ This method is used by the InstallManager (which does not have access to trans), to generate the metadata
+ for the installed tool shed repository, among other things. It is imperative that the installed repository
+ is updated to the desired changeset_revision before metadata is set because the process for setting metadata
+ uses the repository files on disk. This method is called when new tools have been installed (in which case
+ values should be received for tool_section and shed_tool_conf, and new_install should be left at it's default
+ value) and when updates have been pulled to previously installed repositories (in which case the default value
+ None is set for tool_section and shed_tool_conf, and the value of new_install is passed as False). When a new
+ install is being done by the InstallManager (and not a user manually installing a repository from the Admin
+ perspective), the value of dist_to_shed will be set to True, enabling determinatin of which installed repositories
+ resulted from the InstallManager installing a repository that contains tools that used to be in the Galaxy
+ distribution but are now in the main Galaxy tool shed.
+ """
if tool_section:
section_id = tool_section.id
section_version = tool_section.version
@@ -604,6 +613,17 @@
section_name = ''
tool_section_dict = dict( id=section_id, version=section_version, name=section_name )
metadata_dict = generate_metadata( app.toolbox, relative_install_dir, repository_clone_url, tool_section_dict=tool_section_dict )
+ # Add a new record to the tool_shed_repository table if one doesn't already exist. If one exists but is marked
+ # deleted, undelete it. It is imperative that this happens before the call to alter_tool_panel() below because
+ # tools will not be properly loaded if the repository is marked deleted.
+ log.debug( "Adding new row (or updating an existing row) for repository '%s' in the tool_shed_repository table." % repository_name )
+ create_or_update_tool_shed_repository( app,
+ repository_name,
+ description,
+ changeset_revision,
+ repository_clone_url,
+ metadata_dict,
+ dist_to_shed=dist_to_shed )
if 'tools' in metadata_dict:
repository_tools_tups = get_repository_tools_tups( app, metadata_dict )
if repository_tools_tups:
@@ -615,20 +635,22 @@
# Handle tools that use fabric scripts to install dependencies.
handle_tool_dependencies( current_working_dir, relative_install_dir, repository_tools_tups )
if new_install:
- load_altered_part_of_tool_panel( app=app,
- repository_name=repository_name,
- repository_clone_url=repository_clone_url,
- changeset_revision=changeset_revision,
- repository_tools_tups=repository_tools_tups,
- tool_section=tool_section,
- shed_tool_conf=shed_tool_conf,
- tool_path=tool_path,
- owner=owner,
- deactivate=False )
- else:
- if app.toolbox_search.enabled:
- # If search support for tools is enabled, index the new installed tools.
- app.toolbox_search = ToolBoxSearch( app.toolbox )
+ alter_tool_panel( app=app,
+ repository_name=repository_name,
+ repository_clone_url=repository_clone_url,
+ changeset_revision=changeset_revision,
+ repository_tools_tups=repository_tools_tups,
+ tool_section=tool_section,
+ shed_tool_conf=shed_tool_conf,
+ tool_path=tool_path,
+ owner=owner,
+ new_install=new_install,
+ deactivate=False,
+ uninstall=False )
+ elif app.toolbox_search.enabled:
+ # If search support for tools is enabled, index the new installed tools. In the
+ # condition above, this happens in the alter_tool_panel() method.
+ app.toolbox_search = ToolBoxSearch( app.toolbox )
# Remove the temporary file
try:
os.unlink( tmp_name )
@@ -653,16 +675,15 @@
if display_path:
# Load proprietary datatype display applications
app.datatypes_registry.load_display_applications( installed_repository_dict=repository_dict )
- # Add a new record to the tool_shed_repository table if one doesn't
- # already exist. If one exists but is marked deleted, undelete it.
- log.debug( "Adding new row (or updating an existing row) for repository '%s' in the tool_shed_repository table." % repository_name )
- create_or_update_tool_shed_repository( app, repository_name, description, changeset_revision, repository_clone_url, metadata_dict )
return metadata_dict
-def load_altered_part_of_tool_panel( app, repository_name, repository_clone_url, changeset_revision, repository_tools_tups,
- tool_section, shed_tool_conf, tool_path, owner, deactivate=False ):
- # We'll be changing the contents of the shed_tool_conf file on disk, so we need to
- # make the same changes to the in-memory version of that file, which is stored in
- # the config_elems entry in the shed_tool_conf_dict associated with the file.
+def alter_tool_panel( app, repository_name, repository_clone_url, changeset_revision, repository_tools_tups, tool_section,
+ shed_tool_conf, tool_path, owner, new_install=True, deactivate=False, uninstall=False ):
+ """
+ A tool shed repository is being installed / updated / deactivated / uninstalled,
+ so handle tool panel alterations accordingly.
+ """
+ # We need to change the in-memory version of the shed_tool_conf file, which is stored in the config_elems entry
+ # in the shed_tool_conf_dict associated with the file.
for index, shed_tool_conf_dict in enumerate( app.toolbox.shed_tool_confs ):
config_filename = shed_tool_conf_dict[ 'config_filename' ]
if config_filename == shed_tool_conf:
@@ -673,29 +694,91 @@
if tail == shed_tool_conf:
config_elems = shed_tool_conf_dict[ 'config_elems' ]
break
- # Generate a new entry for the tool config.
+ # Geneate the list of ElementTree Element objects for each section or list of tools.
elem_list = generate_tool_panel_elem_list( repository_name,
repository_clone_url,
changeset_revision,
repository_tools_tups,
tool_section=tool_section,
owner=owner )
- if tool_section:
- for section_elem in elem_list:
- # Load the section into the tool panel.
- app.toolbox.load_section_tag_set( section_elem, app.toolbox.tool_panel, tool_path )
+ if deactivate:
+ # Remove appropriate entries from the shed_tool_conf file on disk. We first create an list of
+ # guids for all tools that will be removed from the tool config.
+ tool_elements_removed = 0
+ guids_to_remove = [ repository_tool_tup[1] for repository_tool_tup in repository_tools_tups ]
+ if tool_section:
+ # Remove all appropriate tool sub-elements from the section element.
+ for section_elem in elem_list:
+ section_elem_id = section_elem.get( 'id' )
+ section_elem_name = section_elem.get( 'name' )
+ section_elem_version = section_elem.get( 'version' )
+ for config_elem in config_elems:
+ config_elems_to_remove = []
+ if config_elem.tag == 'section' and \
+ config_elem.get( 'id' ) == section_elem_id and \
+ config_elem.get( 'name' ) == section_elem_name and \
+ config_elem.get( 'version' ) == section_elem_version:
+ # We've located the section element in the in-memory list of config_elems, so we can remove
+ # all of the appropriate tools sub-elements from the section.
+ tool_elems_to_remove = []
+ for tool_elem in config_elem:
+ tool_elem_guid = tool_elem.get( 'guid' )
+ if tool_elem_guid in guids_to_remove:
+ tool_elems_to_remove.append( tool_elem )
+ for tool_elem in tool_elems_to_remove:
+ # Remove all of the appropriate tool sub-elements from the section element.
+ tool_elem_guid = tool_elem.get( 'guid' )
+ config_elem.remove( tool_elem )
+ log.debug( "Removed tool with guid '%s'." % str( tool_elem_guid ) )
+ tool_elements_removed += 1
+ if len( config_elem ) < 1:
+ # Keep a list of all empty section elements so they can be removed.
+ config_elems_to_remove.append( config_elem )
+ if tool_elements_removed == len( guids_to_remove ):
+ break
+ for config_elem in config_elems_to_remove:
+ # The section element includes no tool sub-elements, so remove it.
+ config_elems.remove( config_elem )
+ log.debug( "Removed section with id '%s'." % str( section_elem_id ) )
+ if tool_elements_removed == len( guids_to_remove ):
+ break
+ else:
+ # Remove all appropriate tool elements from the root (tools are outside of any sections).
+ tool_elems_to_remove = []
+ for tool_elem in elem_list:
+ tool_elem_guid = tool_elem.get( 'guid' )
+ for config_elem in config_elems:
+ if config_elem.tag == 'tool' and config_elem.get( 'guid' ) == tool_elem_guid:
+ tool_elems_to_remove.append( tool_elem )
+ for config_elem in config_elems:
+ for tool_elem in tool_elems_to_remove:
+ try:
+ # Remove the tool element from the in-memory list of config_elems.
+ config_elem.remove( tool_elem )
+ log.debug( "Removed tool with guid '%s'." % str( tool_elem_guid ) )
+ except ValueError:
+ # The tool_elem is not a sub-element of the current config_elem.
+ pass
else:
- # Load the tools into the tool panel outside of any sections.
- for tool_elem in elem_list:
- guid = tool_elem.get( 'guid' )
- app.toolbox.load_tool_tag_set( tool_elem, app.toolbox.tool_panel, tool_path=tool_path, guid=guid )
- if not deactivate:
- # Append the new entry (either section or list of tools) to the shed_tool_config file,
- # and add the xml element to the in-memory list of config_elems.
- for elem_entry in elem_list:
- config_elems.append( elem_entry )
- shed_tool_conf_dict[ 'config_elems' ] = config_elems
- app.toolbox.shed_tool_confs[ index ] = shed_tool_conf_dict
+ # Generate a new entry for the tool config.
+ if tool_section:
+ for section_elem in elem_list:
+ # Load the section into the tool panel.
+ app.toolbox.load_section_tag_set( section_elem, app.toolbox.tool_panel, tool_path )
+ else:
+ # Load the tools into the tool panel outside of any sections.
+ for tool_elem in elem_list:
+ guid = tool_elem.get( 'guid' )
+ app.toolbox.load_tool_tag_set( tool_elem, app.toolbox.tool_panel, tool_path=tool_path, guid=guid )
+ if new_install:
+ # Append the new entry (either section or list of tools) to the shed_tool_config file,
+ # and add the xml element to the in-memory list of config_elems.
+ for elem_entry in elem_list:
+ config_elems.append( elem_entry )
+ shed_tool_conf_dict[ 'config_elems' ] = config_elems
+ app.toolbox.shed_tool_confs[ index ] = shed_tool_conf_dict
+ if uninstall or not deactivate:
+ # Persist the altered in-memory version of the tool config.
config_elems_to_xml_file( app, shed_tool_conf_dict )
if app.toolbox_search.enabled:
# If search support for tools is enabled, index the new installed tools.
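The in-memory config_elems pruning in alter_tool_panel above can be sketched with the stdlib ElementTree API. Element and attribute names follow the shed tool config layout shown in the diff, but `remove_tools_by_guid` is a hypothetical helper for illustration, not Galaxy's API:

```python
import xml.etree.ElementTree as ET

def remove_tools_by_guid(root, guids_to_remove):
    # Prune <tool> sub-elements whose guid is slated for removal from every
    # <section> under root, then drop any section left empty -- the same
    # shape as the config_elems pruning performed during uninstall above.
    emptied = []
    for section in root.findall('section'):
        doomed = [t for t in section.findall('tool')
                  if t.get('guid') in guids_to_remove]
        for tool in doomed:
            section.remove(tool)
        if len(section) == 0:
            emptied.append(section)
    for section in emptied:
        root.remove(section)

conf = ET.fromstring(
    '<toolbox tool_path="../shed_tools">'
    '<section id="filters" name="Filters" version="">'
    '<tool guid="toolshed/repos/test/filter/Filter1/1.0.1"/>'
    '</section>'
    '</toolbox>')
remove_tools_by_guid(conf, {'toolshed/repos/test/filter/Filter1/1.0.1'})
```

After the call the now-empty section element is gone along with its tool entry, mirroring the "Removed section with id" debug path.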
diff -r 9d392eaca847bf26a8383babacc3c94e5a1bab9e -r fd6f22c072f3b2acbde3311943af004f1094a4fa lib/galaxy/web/controllers/admin_toolshed.py
--- a/lib/galaxy/web/controllers/admin_toolshed.py
+++ b/lib/galaxy/web/controllers/admin_toolshed.py
@@ -280,7 +280,8 @@
tool_shed=tool_shed,
tool_section=tool_section,
shed_tool_conf=shed_tool_conf,
- new_install=True )
+ new_install=True,
+ dist_to_shed=False )
installed_repository_names.append( name )
else:
tmp_stderr = open( tmp_name, 'rb' )
@@ -332,57 +333,81 @@
@web.expose
@web.require_admin
def deactivate_or_uninstall_repository( self, trans, **kwd ):
- # TODO: handle elimination of workflows loaded into the tool panel that are in the repository being deactivated.
params = util.Params( kwd )
message = util.restore_text( params.get( 'message', '' ) )
status = params.get( 'status', 'done' )
remove_from_disk = params.get( 'remove_from_disk', '' )
remove_from_disk_checked = CheckboxField.is_checked( remove_from_disk )
repository = get_repository( trans, kwd[ 'id' ] )
+ shed_tool_conf, tool_path, relative_install_dir = self.__get_tool_path_and_relative_install_dir( trans, repository )
if params.get( 'deactivate_or_uninstall_repository_button', False ):
- # Deatcivate repository tools.
- shed_tool_conf, tool_path, relative_install_dir = self.__get_tool_path_and_relative_install_dir( trans, repository )
metadata = repository.metadata
- repository_tools_tups = get_repository_tools_tups( trans.app, metadata )
- # Generate the list of tool panel keys derived from the tools included in the repository.
- repository_tool_panel_keys = []
- if repository_tools_tups:
- repository_tool_panel_keys = [ 'tool_%s' % repository_tools_tup[ 1 ] for repository_tools_tup in repository_tools_tups ]
- tool_panel_section = metadata[ 'tool_panel_section' ]
- section_id = tool_panel_section[ 'id' ]
- if section_id in [ '' ]:
- # If the repository includes tools, they were loaded into the tool panel outside of any sections.
- tool_section = None
- self.__deactivate( trans, repository_tool_panel_keys )
- else:
- # If the repository includes tools, they were loaded into the tool panel inside a section.
- section_key = 'section_%s' % str( section_id )
- if section_key in trans.app.toolbox.tool_panel:
- tool_section = trans.app.toolbox.tool_panel[ section_key ]
- self.__deactivate( trans, repository_tool_panel_keys, tool_section=tool_section, section_key=section_key )
+ if repository.includes_tools:
+ # Deactivate repository tools.
+ repository_tools_tups = get_repository_tools_tups( trans.app, metadata )
+ # Generate the list of tool panel keys derived from the tools included in the repository.
+ repository_tool_panel_keys = []
+ if repository_tools_tups:
+ repository_tool_panel_keys = [ 'tool_%s' % repository_tools_tup[ 1 ] for repository_tools_tup in repository_tools_tups ]
+ tool_panel_section = metadata[ 'tool_panel_section' ]
+ section_id = tool_panel_section[ 'id' ]
+ if section_id in [ '' ]:
+ # If the repository includes tools, they were loaded into the tool panel outside of any sections.
+ tool_section = None
+ self.__remove_tools_from_tool_panel( trans, repository_tool_panel_keys )
else:
- # The tool panel section could not be found, so handle deactivating tools
- # as if they were loaded into the tool panel outside of any sections.
- tool_section = None
- self.__deactivate( trans, repository_tool_panel_keys )
+ # If the repository includes tools, they were loaded into the tool panel inside a section.
+ section_key = 'section_%s' % str( section_id )
+ if section_key in trans.app.toolbox.tool_panel:
+ tool_section = trans.app.toolbox.tool_panel[ section_key ]
+ self.__remove_tools_from_tool_panel( trans, repository_tool_panel_keys, tool_section=tool_section, section_key=section_key )
+ else:
+ # The tool panel section could not be found, so handle deactivating tools
+ # as if they were loaded into the tool panel outside of any sections.
+ tool_section = None
+ self.__remove_tools_from_tool_panel( trans, repository_tool_panel_keys )
+ repository_clone_url = self.__generate_clone_url( trans, repository )
+ # The repository is either being deactivated or uninstalled, so handle tool panel alterations accordingly.
+ # If the repository is being uninstalled, the appropriate tools or tool sections will be removed from the
+ # appropriate tool config file on disk.
+ alter_tool_panel( app=trans.app,
+ repository_name=repository.name,
+ repository_clone_url=repository_clone_url,
+ changeset_revision=repository.installed_changeset_revision,
+ repository_tools_tups=repository_tools_tups,
+ tool_section=tool_section,
+ shed_tool_conf=shed_tool_conf,
+ tool_path=tool_path,
+ owner=repository.owner,
+ new_install=False,
+ deactivate=True,
+ uninstall=remove_from_disk_checked )
+ if repository.includes_datatypes:
+ # Deactivate proprietary datatypes.
+ load_datatype_items( trans.app, repository, relative_install_dir, deactivate=True )
+ if remove_from_disk_checked:
+ # Remove the repository from disk.
+ try:
+ shutil.rmtree( relative_install_dir )
+ log.debug( "Removed repository installation directory: %s" % str( relative_install_dir ) )
+ except Exception, e:
+ log.debug( "Error removing repository installation directory %s: %s" % ( str( relative_install_dir ), str( e ) ) )
+ # If the repository was installed by the InstallManager, remove
+ # all appropriate rows from the tool_id_guid_map database table.
+ if repository.dist_to_shed:
+ count = 0
+ for tool_id_guid_map in trans.sa_session.query( trans.model.ToolIdGuidMap ) \
+ .filter( and_( trans.model.ToolIdGuidMap.table.c.tool_shed==repository.tool_shed,
+ trans.model.ToolIdGuidMap.table.c.repository_owner==repository.owner,
+ trans.model.ToolIdGuidMap.table.c.repository_name==repository.name ) ):
+ trans.sa_session.delete( tool_id_guid_map )
+ count += 1
+ log.debug( "Removed %d rows from the tool_id_guid_map database table." % count )
+ repository.uninstalled = True
repository.deleted = True
trans.sa_session.add( repository )
trans.sa_session.flush()
- repository_clone_url = self.__generate_clone_url( trans, repository )
- load_altered_part_of_tool_panel( app=trans.app,
- repository_name=repository.name,
- repository_clone_url=repository_clone_url,
- changeset_revision=repository.installed_changeset_revision,
- repository_tools_tups=repository_tools_tups,
- tool_section=tool_section,
- shed_tool_conf=shed_tool_conf,
- tool_path=tool_path,
- owner=repository.owner,
- deactivate=True )
- # Deactivate proprietary datatypes.
- load_datatype_items( trans.app, repository, relative_install_dir, deactivate=True )
if remove_from_disk_checked:
- # TODO: Remove repository from disk and alter the shed tool config.
message = 'The repository named <b>%s</b> has been uninstalled.' % repository.name
else:
message = 'The repository named <b>%s</b> has been deactivated.' % repository.name
@@ -397,10 +422,10 @@
remove_from_disk_check_box=remove_from_disk_check_box,
message=message,
status=status )
- def __deactivate( self, trans, repository_tool_panel_keys, tool_section=None, section_key=None ):
- # Delete tools loaded into the tool panel. If the tool_panel is received,
- # the tools were loaded into the tool panel outside of any sections. If
- # the tool_section is received, all tools were loaded into that section.
+ def __remove_tools_from_tool_panel( self, trans, repository_tool_panel_keys, tool_section=None, section_key=None ):
+ # Delete tools loaded into the tool panel by altering the in-memory tool_panel dictionary appropriately. If the
+ # tool_section is not received, the tools were loaded into the tool_panel dictionary outside of any sections.
+ # Otherwise all tools were loaded into the received tool_section.
if tool_section:
# The tool_section.elems dictionary looks something like:
# {'tool_gvk.bx.psu.edu:9009/repos/test/filter/Filter1/1.0.1': <galaxy.tools.Tool instance at 0x10769ae60>}
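The panel pruning described in that comment can be sketched with plain dicts standing in for trans.app.toolbox.tool_panel and ToolSection.elems (an assumption -- the real structures are richer objects, and `remove_from_tool_panel` is a hypothetical helper):

```python
def remove_from_tool_panel(tool_panel, panel_keys, section_elems=None):
    # Delete tool entries from the in-memory panel. When section_elems is
    # given, the tools live inside that section's elems dict; otherwise
    # they sit at the top level of the panel, keyed 'tool_<guid>'.
    target = section_elems if section_elems is not None else tool_panel
    for key in panel_keys:
        target.pop(key, None)

panel = {
    'section_filters': {'elems': {'tool_guid-1': 'Filter tool'}},
    'tool_guid-2': 'Top-level tool',
}
remove_from_tool_panel(panel, ['tool_guid-1'],
                       section_elems=panel['section_filters']['elems'])
remove_from_tool_panel(panel, ['tool_guid-2'])
```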
@@ -418,61 +443,146 @@
@web.expose
@web.require_admin
def activate_or_reinstall_repository( self, trans, **kwd ):
- # TODO: handle re-introduction of workflows loaded into the tool panel that are in the repository being activated.
repository = get_repository( trans, kwd[ 'id' ] )
- repository.deleted = False
- trans.sa_session.add( repository )
- trans.sa_session.flush()
shed_tool_conf, tool_path, relative_install_dir = self.__get_tool_path_and_relative_install_dir( trans, repository )
- metadata = repository.metadata
- repository_tools_tups = get_repository_tools_tups( trans.app, metadata )
- tool_panel_section = metadata[ 'tool_panel_section' ]
- section_id = tool_panel_section[ 'id' ]
- if section_id in [ '' ]:
- # If the repository includes tools, reload them into the tool panel outside of any sections.
- self.__activate( trans, repository, repository_tools_tups, tool_section=None, section_key=None )
+ uninstalled = repository.uninstalled
+ if uninstalled:
+ if repository.dist_to_shed:
+ href = '<a href="http://wiki.g2.bx.psu.edu/Tool%20Shed#Migrating_tools_from_the_Galaxy_distr…" '
+ href += 'target="_blank">in this section of the Galaxy Tool Shed wiki</a>'
+ message = "The <b>%s</b> repository should be reinstalled using the approach described %s. " % ( repository.name, href )
+ message += "If <b>enable_tool_shed_install = True</b> and the contents of the file configured for the "
+ message += "<b>tool_shed_install_config_file</b> setting in your universe_wsgi.ini file enable installation of the "
+ message += "<b>%s</b> repository, then restarting your Galaxy server will reinstall the repository." % repository.name
+ new_kwd = {}
+ new_kwd[ 'sort' ] = 'name'
+ new_kwd[ 'f-deleted' ] = 'True'
+ new_kwd[ 'message' ] = message
+ new_kwd[ 'status' ] = 'error'
+ return trans.response.send_redirect( web.url_for( controller='admin_toolshed',
+ action='browse_repositories',
+ **new_kwd ) )
+ else:
+ current_working_dir = os.getcwd()
+ repository_clone_url = self.__generate_clone_url( trans, repository )
+ clone_dir = os.path.join( tool_path, self.__generate_tool_path( repository_clone_url, repository.installed_changeset_revision ) )
+ relative_install_dir = os.path.join( clone_dir, repository.name )
+ returncode, tmp_name = clone_repository( repository.name, clone_dir, current_working_dir, repository_clone_url )
+ if returncode == 0:
+ returncode, tmp_name = update_repository( current_working_dir, relative_install_dir, repository.installed_changeset_revision )
+ if returncode == 0:
+ # Get the location in the tool panel in which the tool was originally loaded.
+ metadata = repository.metadata
+ tool_panel_section = metadata[ 'tool_panel_section' ]
+ original_section_id = tool_panel_section[ 'id' ]
+ original_section_name = tool_panel_section[ 'name' ]
+ if original_section_id in [ '' ]:
+ tool_section = None
+ else:
+ section_key = 'section_%s' % str( original_section_id )
+ if section_key in trans.app.toolbox.tool_panel:
+ tool_section = trans.app.toolbox.tool_panel[ section_key ]
+ else:
+ # The section in which the tool was originally loaded used to be in the tool panel, but no longer is.
+ elem = Element( 'section' )
+ elem.attrib[ 'name' ] = original_section_name
+ elem.attrib[ 'id' ] = original_section_id
+ elem.attrib[ 'version' ] = ''
+ tool_section = tools.ToolSection( elem )
+ trans.app.toolbox.tool_panel[ section_key ] = tool_section
+ metadata_dict = load_repository_contents( app=trans.app,
+ repository_name=repository.name,
+ description=repository.description,
+ owner=repository.owner,
+ changeset_revision=repository.installed_changeset_revision,
+ tool_path=tool_path,
+ repository_clone_url=repository_clone_url,
+ relative_install_dir=relative_install_dir,
+ current_working_dir=current_working_dir,
+ tmp_name=tmp_name,
+ tool_shed=repository.tool_shed,
+ tool_section=tool_section,
+ shed_tool_conf=shed_tool_conf,
+ new_install=True,
+ dist_to_shed=False )
+ repository.uninstalled = False
+ repository.deleted = False
+ trans.sa_session.add( repository )
+ trans.sa_session.flush()
else:
- # If the repository includes tools, reload them into the appropriate tool panel section.
- section_key = 'section_%s' % str( section_id )
- if section_key in trans.app.toolbox.tool_panel:
- # Load the repository tools into a section that still exists in the tool panel.
- tool_section = trans.app.toolbox.tool_panel[ section_key ]
- self.__activate( trans, repository, repository_tools_tups, tool_section=tool_section, section_key=section_key )
- else:
- # Load the repository tools into a section that no longer exists in the tool panel.
- # We first see if the section exists in the tool config, which will be the case if
- # the repository was only deactivated and not uninstalled.
- section_loaded = False
- for tcf in trans.app.config.tool_configs:
- # Only inspect tool configs that contain installed tool shed repositories.
- if tcf not in [ 'tool_conf.xml' ]:
- log.info( "Parsing the tool configuration %s" % tcf )
- tree = util.parse_xml( tcf )
- root = tree.getroot()
- tool_path = root.get( 'tool_path' )
- if tool_path is not None:
- # Tool configs that contain tools installed from tool shed repositories
- # must have a tool_path attribute.
- for elem in root:
- if elem.tag == 'section':
- id_attr = elem.get( 'id' )
- if id_attr == section_id:
- tool_section = elem
- # Load the tools.
- trans.app.toolbox.load_section_tag_set( tool_section, trans.app.toolbox.tool_panel, tool_path )
- section_loaded = True
+ # The repository was deactivated, but not uninstalled.
+ repository.deleted = False
+ trans.sa_session.add( repository )
+ trans.sa_session.flush()
+ if repository.includes_tools:
+ metadata = repository.metadata
+ repository_tools_tups = get_repository_tools_tups( trans.app, metadata )
+ guids_to_activate = [ repository_tool_tup[1] for repository_tool_tup in repository_tools_tups ]
+ tool_panel_section = metadata[ 'tool_panel_section' ]
+ original_section_id = tool_panel_section[ 'id' ]
+ if original_section_id in [ '' ]:
+ # If the repository includes tools, reload them into the tool panel outside of any sections.
+ self.__add_tools_to_tool_panel( trans, repository, repository_tools_tups, tool_section=None, section_key=None )
+ else:
+ original_section_id = tool_panel_section[ 'id' ]
+ original_section_name = tool_panel_section[ 'name' ]
+ original_section_version = tool_panel_section[ 'version' ]
+ # If the repository includes tools, reload them into the appropriate tool panel section.
+ section_key = 'section_%s' % str( original_section_id )
+ if section_key in trans.app.toolbox.tool_panel:
+ # Load the repository tools into a section that still exists in the tool panel.
+ tool_section = trans.app.toolbox.tool_panel[ section_key ]
+ self.__add_tools_to_tool_panel( trans, repository, repository_tools_tups, tool_section=tool_section, section_key=section_key )
+ else:
+ # Load the repository tools into a section that no longer exists in the tool panel. The section must
+ # still exist in the tool config since the repository was only deactivated and not uninstalled.
+ sections_to_load = []
+ tool_elems_found = 0
+ # Only inspect tool configs that contain installed tool shed repositories.
+ for shed_tool_conf_dict in trans.app.toolbox.shed_tool_confs:
+ config_filename = shed_tool_conf_dict[ 'config_filename' ]
+ log.info( "Parsing the tool configuration %s" % config_filename )
+ tree = util.parse_xml( config_filename )
+ root = tree.getroot()
+ tool_path = root.get( 'tool_path' )
+ if tool_path is not None:
+ # Tool configs that contain tools installed from tool shed repositories
+ # must have a tool_path attribute.
+ for elem in root:
+ if elem.tag == 'section' and \
+ elem.get( 'id' ) == original_section_id and \
+ elem.get( 'name' ) == original_section_name and \
+ elem.get( 'version' ) == original_section_version:
+ # We've found the section, but we have to make sure it contains the
+ # correct tool tag set. This is necessary because the shed tool configs
+ # can include multiple sections of the same id, name and version, each
+ # containing one or more tool tag sets.
+ for tool_elem in elem:
+ if tool_elem.get( 'guid' ) in guids_to_activate:
+ tool_elems_found += 1
+ if elem not in sections_to_load:
+ sections_to_load.append( elem )
+ if tool_elems_found == len( guids_to_activate ):
+ break
+ if tool_elems_found == len( guids_to_activate ):
break
- if section_loaded:
- break
- # Load proprietary datatypes.
- load_datatype_items( trans.app, repository, relative_install_dir )
- message = 'The repository named <b>%s</b> has been activated.' % repository.name
+ if tool_elems_found == len( guids_to_activate ):
+ break
+ for elem in sections_to_load:
+ trans.app.toolbox.load_section_tag_set( elem, trans.app.toolbox.tool_panel, tool_path )
+ if repository.includes_datatypes:
+ # Load proprietary datatypes.
+ load_datatype_items( trans.app, repository, relative_install_dir )
+ if uninstalled:
+ message = 'The <b>%s</b> repository has been reinstalled.' % repository.name
+ else:
+ message = 'The <b>%s</b> repository has been activated.' % repository.name
status = 'done'
return trans.response.send_redirect( web.url_for( controller='admin_toolshed',
action='browse_repositories',
message=message,
status=status ) )
- def __activate( self, trans, repository, repository_tools_tups, tool_section=None, section_key=None ):
+ def __add_tools_to_tool_panel( self, trans, repository, repository_tools_tups, tool_section=None, section_key=None ):
# Load tools.
if tool_section:
elems = tool_section.elems
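The reactivation scan above -- find the section elements matching the original id/name/version that actually contain the wanted tool tag sets -- can be sketched as follows. The loop structure mirrors the diff; `sections_to_reactivate` is an illustrative name, not a Galaxy function:

```python
import xml.etree.ElementTree as ET

def sections_to_reactivate(root, section_id, section_name,
                           section_version, guids):
    # Collect <section> elements matching the original id/name/version that
    # hold at least one tool tag whose guid should return to the panel.
    # Shed tool configs can contain several sections with the same id, name
    # and version, each wrapping a different tool tag set, so every
    # matching section is inspected until all guids are accounted for.
    sections, found = [], 0
    for elem in root:
        if (elem.tag == 'section'
                and elem.get('id') == section_id
                and elem.get('name') == section_name
                and elem.get('version') == section_version):
            for tool_elem in elem:
                if tool_elem.get('guid') in guids:
                    found += 1
                    if elem not in sections:
                        sections.append(elem)
            if found == len(guids):
                break
    return sections

root = ET.fromstring(
    '<toolbox>'
    '<section id="filters" name="Filters" version="">'
    '<tool guid="guid-a"/></section>'
    '<section id="filters" name="Filters" version="">'
    '<tool guid="guid-b"/></section>'
    '</toolbox>')
matches = sections_to_reactivate(root, 'filters', 'Filters', '', {'guid-a'})
```

Only the first duplicate section is returned here, since it already satisfies every guid being activated.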
diff -r 9d392eaca847bf26a8383babacc3c94e5a1bab9e -r fd6f22c072f3b2acbde3311943af004f1094a4fa templates/admin/tool_shed_repository/deactivate_or_uninstall_repository.mako
--- a/templates/admin/tool_shed_repository/deactivate_or_uninstall_repository.mako
+++ b/templates/admin/tool_shed_repository/deactivate_or_uninstall_repository.mako
@@ -43,30 +43,52 @@
${repository.deleted}
</div><div class="form-row">
- ${remove_from_disk_check_box.get_html( disabled=True )}
- <label for="repository" style="display: inline;font-weight:normal;">Check to uninstall (not yet implemented) or leave blank to deactivate</label>
+ ${remove_from_disk_check_box.get_html()}
+ <label for="repository" style="display: inline;font-weight:normal;">Check to uninstall or leave blank to deactivate</label><br/><br/><label>Deactivating this repository will result in the following:</label><div class="toolParamHelp" style="clear: both;">
- 1. This repository record's deleted column in the tool_shed_repository database table will be set to True.
+ * The repository and all of its contents will remain on disk.
</div>
+ %if repository.includes_tools:
+ <div class="toolParamHelp" style="clear: both;">
+ * The repository's tools will not be loaded into the tool panel.
+ </div>
+ %endif
+ %if repository.includes_datatypes:
+ <div class="toolParamHelp" style="clear: both;">
+ * The repository's datatypes, datatype converters and display applications will be eliminated from the datatypes registry.
+ </div>
+ %endif
<div class="toolParamHelp" style="clear: both;">
- 2. The repository and all of it's contents will remain on disk.
- </div>
- <div class="toolParamHelp" style="clear: both;">
- 3. If this repository includes tools, they will not be loaded into the tool panel, but the tool config file in which they are defined will not be altered.
+ * The repository record's deleted column in the tool_shed_repository database table will be set to True.
</div><br/>
- <label>Uninstalling (not yet implemented) this repository will result in the following:</label>
+ <label>Uninstalling this repository will result in the following:</label><div class="toolParamHelp" style="clear: both;">
- 1. This repository record's deleted column in the tool_shed_repository database table will be set to True.
+ * The repository and all of its contents will be removed from disk.
</div>
+ %if repository.includes_tools:
+ <div class="toolParamHelp" style="clear: both;">
+ * The repository's tool tag sets will be removed from the tool config file in which they are defined.
+ </div>
+ %endif
+ %if repository.includes_datatypes:
+ <div class="toolParamHelp" style="clear: both;">
+ * The repository's datatypes, datatype converters and display applications will be eliminated from the datatypes registry.
+ </div>
+ %endif
<div class="toolParamHelp" style="clear: both;">
- 2. The repository and all of it's contents will be removed from disk.
+ * The repository record's deleted column in the tool_shed_repository database table will be set to True.
</div>
- <div class="toolParamHelp" style="clear: both;">
- 3. If this repository includes tools, they will be removed from the tool config file in which they are defined and they will not be loaded into the tool panel.
- </div>
+ %if repository.dist_to_shed:
+ <div class="toolParamHelp" style="clear: both;">
+ * The repository record's uninstalled column in the tool_shed_repository database table will be set to True.
+ </div>
+ <div class="toolParamHelp" style="clear: both;">
+ * All records associated with this repository will be eliminated from the tool_id_guid_map database table.
+ </div>
+ %endif
<div style="clear: both"></div></div><div class="form-row">
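The tool_id_guid_map cleanup promised in the last bullet (and performed in the controller diff above) can be sketched against a throwaway sqlite schema -- the column set here is an assumption, not the real Galaxy model, but the keying on tool shed, owner and repository name matches the filter in the diff:

```python
import sqlite3

# Build a minimal stand-in for the tool_id_guid_map table.
conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE tool_id_guid_map'
             ' (tool_shed TEXT, repository_owner TEXT,'
             '  repository_name TEXT, guid TEXT)')
conn.executemany(
    'INSERT INTO tool_id_guid_map VALUES (?, ?, ?, ?)',
    [('shed.example.org', 'test', 'filter', 'guid-1'),
     ('shed.example.org', 'test', 'filter', 'guid-2'),
     ('shed.example.org', 'other', 'align', 'guid-3')])
# Uninstall: delete every row belonging to the repository, and record how
# many rows went away for the debug log.
cur = conn.execute(
    'DELETE FROM tool_id_guid_map'
    ' WHERE tool_shed = ? AND repository_owner = ? AND repository_name = ?',
    ('shed.example.org', 'test', 'filter'))
removed = cur.rowcount
```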
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: dan: Add sort tag to Print Reads test output element.
by Bitbucket 25 Jan '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/12addb0e91d0/
changeset: 12addb0e91d0
user: dan
date: 2012-01-25 14:54:16
summary: Add sort tag to Print Reads test output element.
affected #: 1 file
diff -r 74343d5f5bda092f0fdad459c391bc0c624dc26e -r 12addb0e91d0911a0553a312497082a4aa395db6 tools/gatk/print_reads.xml
--- a/tools/gatk/print_reads.xml
+++ b/tools/gatk/print_reads.xml
@@ -373,7 +373,7 @@
<param name="sample_file_repeat" value="0" /><param name="sample_name_repeat" value="0" /><param name="gatk_param_type_selector" value="basic" />
- <output name="output_bam" file="gatk/gatk_table_recalibration/gatk_table_recalibration_out_1.bam" ftype="bam" />
+ <output name="output_bam" file="gatk/gatk_table_recalibration/gatk_table_recalibration_out_1.bam" ftype="bam" sort="True"/><output name="output_log" file="gatk/gatk_print_reads/gatk_print_reads_out_1.log.contains" compare="contains" /></test></tests>
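A short sketch of why the sort attribute helps, assuming sort="True" asks the test framework to sort output lines before comparing (plain text lines stand in for BAM records, and the helper name is hypothetical):

```python
def outputs_match_ignoring_order(generated_lines, expected_lines):
    # Read order in a BAM can vary from run to run (e.g. with
    # non-deterministic or multi-threaded walkers), so comparing sorted
    # copies keeps the test stable while still checking content.
    return sorted(generated_lines) == sorted(expected_lines)

same = outputs_match_ignoring_order(
    ['read-2\t0\tchr1', 'read-1\t0\tchr1'],
    ['read-1\t0\tchr1', 'read-2\t0\tchr1'])
```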
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/b523f22e1bfc/
changeset: b523f22e1bfc
user: dan
date: 2012-01-24 22:29:28
summary: Add GATK Print Reads walker tool.
affected #: 3 files
diff -r 82797507cbc95c5a5123bdb0a77bce7bb0e81727 -r b523f22e1bfc679ef490eb451844e33df391e6da test-data/gatk/gatk_print_reads/gatk_print_reads_out_1.log.contains
--- /dev/null
+++ b/test-data/gatk/gatk_print_reads/gatk_print_reads_out_1.log.contains
@@ -0,0 +1,5 @@
+SAMDataSource$SAMReaders - Initializing SAMRecords in serial
+TraversalEngine - [INITIALIZATION COMPLETE; TRAVERSAL STARTING]
+TraversalEngine - Location processed.reads runtime per.1M.reads completed total.runtime remaining
+Walker - [REDUCE RESULT] Traversal result is: org.broadinstitute.sting.gatk.io.stubs.SAMFileWriterStub
+TraversalEngine - 0 reads were filtered out during traversal out of 10 total (0.00%)
diff -r 82797507cbc95c5a5123bdb0a77bce7bb0e81727 -r b523f22e1bfc679ef490eb451844e33df391e6da tool_conf.xml.sample
--- a/tool_conf.xml.sample
+++ b/tool_conf.xml.sample
@@ -385,6 +385,7 @@
<section name="NGS: GATK Tools (beta)" id="gatk"><label text="Alignment Utilities" id="gatk_bam_utilities"/><tool file="gatk/depth_of_coverage.xml" />
+ <tool file="gatk/print_reads.xml" /><label text="Realignment" id="gatk_realignment" /><tool file="gatk/realigner_target_creator.xml" />
diff -r 82797507cbc95c5a5123bdb0a77bce7bb0e81727 -r b523f22e1bfc679ef490eb451844e33df391e6da tools/gatk/print_reads.xml
--- /dev/null
+++ b/tools/gatk/print_reads.xml
@@ -0,0 +1,424 @@
+<tool id="gatk_print_reads" name="Print Reads" version="0.0.1">
+ <description>from BAM files</description>
+ <requirements>
+ <requirement type="package" version="1.4">gatk</requirement>
+ </requirements>
+ <command interpreter="python">gatk_wrapper.py
+ --max_jvm_heap_fraction "1"
+ --stdout "${output_log}"
+ #for $i, $input_bam in enumerate( $reference_source.input_bams ):
+ -d "-I" "${input_bam.input_bam}" "${input_bam.input_bam.ext}" "gatk_input_${i}"
+ -d "" "${input_bam.input_bam.metadata.bam_index}" "bam_index" "gatk_input_${i}" ##hardcode galaxy ext type as bam_index
+ #end for
+ -p 'java
+ -jar "${GALAXY_DATA_INDEX_DIR}/shared/jars/gatk/GenomeAnalysisTK.jar"
+ -T "PrintReads"
+ ##--num_threads 4 ##hard coded, for now
+ --out "${output_bam}"
+ -et "NO_ET" ##ET no phone home
+ #if $reference_source.reference_source_selector != "history":
+ -R "${reference_source.ref_file.fields.path}"
+ #end if
+ --number "${number}"
+ #if $platform:
+ --platform "${platform}"
+ #end if
+ #if $read_group:
+ --readGroup "${read_group}"
+ #end if
+ #for $sample_file in $sample_file_repeat:
+ --sample_file "${sample_file.input_sample_file}"
+ #end for
+ #for $sample_name in $sample_name_repeat:
+ --sample_name "${sample_name.sample_name}"
+ #end for
+ '
+
+ ##start standard gatk options
+ #if $gatk_param_type.gatk_param_type_selector == "advanced":
+ #for $pedigree in $gatk_param_type.pedigree:
+ -p '--pedigree "${pedigree.pedigree_file}"'
+ #end for
+ #for $pedigree_string in $gatk_param_type.pedigree_string_repeat:
+ -p '--pedigreeString "${pedigree_string.pedigree_string}"'
+ #end for
+ -p '--pedigreeValidationType "${gatk_param_type.pedigree_validation_type}"'
+ #for $read_filter in $gatk_param_type.read_filter:
+ -p '--read_filter "${read_filter.read_filter_type.read_filter_type_selector}"
+ ###raise Exception( str( dir( $read_filter ) ) )
+ #for $name, $param in $read_filter.read_filter_type.iteritems():
+ #if $name not in [ "__current_case__", "read_filter_type_selector" ]:
+ #if hasattr( $param.input, 'truevalue' ):
+ ${param}
+ #else:
+ --${name} "${param}"
+ #end if
+ #end if
+ #end for
+ '
+ #end for
+ #for $interval_count, $input_intervals in enumerate( $gatk_param_type.input_interval_repeat ):
+ -d "--intervals" "${input_intervals.input_intervals}" "${input_intervals.input_intervals.ext}" "input_intervals_${interval_count}"
+ #end for
+
+ #for $interval_count, $input_intervals in enumerate( $gatk_param_type.input_exclude_interval_repeat ):
+ -d "--excludeIntervals" "${input_intervals.input_exclude_intervals}" "${input_intervals.input_exclude_intervals.ext}" "input_exclude_intervals_${interval_count}"
+ #end for
+
+ -p '--interval_set_rule "${gatk_param_type.interval_set_rule}"'
+
+ -p '--downsampling_type "${gatk_param_type.downsampling_type.downsampling_type_selector}"'
+ #if str( $gatk_param_type.downsampling_type.downsampling_type_selector ) != "NONE":
+ -p '--${gatk_param_type.downsampling_type.downsample_to_type.downsample_to_type_selector} "${gatk_param_type.downsampling_type.downsample_to_type.downsample_to_value}"'
+ #end if
+ -p '
+ --baq "${gatk_param_type.baq}"
+ --baqGapOpenPenalty "${gatk_param_type.baq_gap_open_penalty}"
+ ${gatk_param_type.use_original_qualities}
+ --defaultBaseQualities "${gatk_param_type.default_base_qualities}"
+ --validation_strictness "${gatk_param_type.validation_strictness}"
+ --interval_merging "${gatk_param_type.interval_merging}"
+ ${gatk_param_type.disable_experimental_low_memory_sharding}
+ ${gatk_param_type.non_deterministic_random_seed}
+ '
+ #for $rg_black_list_count, $rg_black_list in enumerate( $gatk_param_type.read_group_black_list_repeat ):
+ #if $rg_black_list.read_group_black_list_type.read_group_black_list_type_selector == "file":
+ -d "--read_group_black_list" "${rg_black_list.read_group_black_list_type.read_group_black_list}" "txt" "input_read_group_black_list_${rg_black_list_count}"
+ #else
+ -p '--read_group_black_list "${rg_black_list.read_group_black_list_type.read_group_black_list}"'
+ #end if
+ #end for
+ #end if
+
+ #if $reference_source.reference_source_selector == "history":
+ -d "-R" "${reference_source.ref_file}" "${reference_source.ref_file.ext}" "gatk_input"
+ #end if
+ ##end standard gatk options
+
+ </command>
+ <inputs>
+ <conditional name="reference_source">
+ <param name="reference_source_selector" type="select" label="Choose the source for the reference list">
+ <option value="cached">Locally cached</option>
+ <option value="history">History</option>
+ </param>
+ <when value="cached">
+ <repeat name="input_bams" title="Sample BAM file" min="1">
+ <param name="input_bam" type="data" format="bam" label="BAM file">
+ <validator type="unspecified_build" />
+ <validator type="metadata" check="bam_index" message="Metadata missing, click the pencil icon in the history item and use the auto-detect feature to correct this issue."/>
+ <validator type="dataset_metadata_in_data_table" table_name="gatk_picard_indexes" metadata_name="dbkey" metadata_column="dbkey" message="Sequences are not currently available for the specified build." /><!-- fixme!!! this needs to be a select -->
+ </param>
+ </repeat>
+ <param name="ref_file" type="select" label="Using reference genome">
+ <options from_data_table="gatk_picard_indexes">
+ <!-- <filter type="data_meta" key="dbkey" ref="input_bam" column="dbkey"/> does not yet work in a repeat...-->
+ </options>
+ <validator type="no_options" message="A built-in reference genome is not available for the build associated with the selected input file"/>
+ </param>
+ </when>
+ <when value="history"><!-- FIX ME!!!! -->
+ <repeat name="input_bams" title="Sample BAM file" min="1">
+ <param name="input_bam" type="data" format="bam" label="BAM file" >
+ <validator type="metadata" check="bam_index" message="Metadata missing, click the pencil icon in the history item and use the auto-detect feature to correct this issue."/>
+ </param>
+ </repeat>
+ <param name="ref_file" type="data" format="fasta" label="Using reference file" />
+ </when>
+ </conditional>
+
+ <param name="number" type="integer" value="-1" label="Print the first n reads from the file, discarding the rest" />
+ <param name="platform" type="text" value="" label="Exclude all reads with this platform from the output" />
+ <param name="read_group" type="text" value="" label="Exclude all reads with this read group from the output" />
+ <repeat name="sample_file_repeat" title="File containing a list of samples to include">
+ <param name="input_sample_file" type="data" format="text" label="Sample file" />
+ </repeat>
+ <repeat name="sample_name_repeat" title="Sample name to be included in the analysis">
+ <param name="sample_name" type="text" label="Sample name" />
+ </repeat>
+
+ <conditional name="gatk_param_type">
+ <param name="gatk_param_type_selector" type="select" label="Basic or Advanced GATK options">
+ <option value="basic" selected="True">Basic</option>
+ <option value="advanced">Advanced</option>
+ </param>
+ <when value="basic">
+ <!-- Do nothing here -->
+ </when>
+ <when value="advanced">
+ <repeat name="pedigree" title="Pedigree file">
+ <param name="pedigree_file" type="data" format="txt" label="Pedigree files for samples" />
+ </repeat>
+ <repeat name="pedigree_string_repeat" title="Pedigree string">
+ <param name="pedigree_string" type="text" value="" label="Pedigree string for samples" />
+ </repeat>
+ <param name="pedigree_validation_type" type="select" label="How strict should we be in validating the pedigree information">
+ <option value="STRICT" selected="True">STRICT</option>
+ <option value="SILENT">SILENT</option>
+ </param>
+ <repeat name="read_filter" title="Read Filter">
+ <conditional name="read_filter_type">
+ <param name="read_filter_type_selector" type="select" label="Read Filter Type">
+ <option value="BadCigar">BadCigar</option>
+ <option value="BadMate">BadMate</option>
+ <option value="DuplicateRead">DuplicateRead</option>
+ <option value="FailsVendorQualityCheck">FailsVendorQualityCheck</option>
+ <option value="MalformedRead">MalformedRead</option>
+ <option value="MappingQuality">MappingQuality</option>
+ <option value="MappingQualityUnavailable">MappingQualityUnavailable</option>
+ <option value="MappingQualityZero">MappingQualityZero</option>
+ <option value="MateSameStrand">MateSameStrand</option>
+ <option value="MaxInsertSize">MaxInsertSize</option>
+ <option value="MaxReadLength" selected="True">MaxReadLength</option>
+ <option value="MissingReadGroup">MissingReadGroup</option>
+ <option value="NoOriginalQualityScores">NoOriginalQualityScores</option>
+ <option value="NotPrimaryAlignment">NotPrimaryAlignment</option>
+ <option value="Platform454">Platform454</option>
+ <option value="Platform">Platform</option>
+ <option value="PlatformUnit">PlatformUnit</option>
+ <option value="ReadGroupBlackList">ReadGroupBlackList</option>
+ <option value="ReadName">ReadName</option>
+ <option value="ReadStrand">ReadStrand</option>
+ <option value="ReassignMappingQuality">ReassignMappingQuality</option>
+ <option value="Sample">Sample</option>
+ <option value="SingleReadGroup">SingleReadGroup</option>
+ <option value="UnmappedRead">UnmappedRead</option>
+ </param>
+ <when value="BadCigar">
+ <!-- no extra options -->
+ </when>
+ <when value="BadMate">
+ <!-- no extra options -->
+ </when>
+ <when value="DuplicateRead">
+ <!-- no extra options -->
+ </when>
+ <when value="FailsVendorQualityCheck">
+ <!-- no extra options -->
+ </when>
+ <when value="MalformedRead">
+ <!-- no extra options -->
+ </when>
+ <when value="MappingQuality">
+ <param name="min_mapping_quality_score" type="integer" value="10" label="Minimum read mapping quality required to consider a read for calling"/>
+ </when>
+ <when value="MappingQualityUnavailable">
+ <!-- no extra options -->
+ </when>
+ <when value="MappingQualityZero">
+ <!-- no extra options -->
+ </when>
+ <when value="MateSameStrand">
+ <!-- no extra options -->
+ </when>
+ <when value="MaxInsertSize">
+ <param name="maxInsertSize" type="integer" value="1000000" label="Discard reads with insert size greater than the specified value"/>
+ </when>
+ <when value="MaxReadLength">
+ <param name="maxReadLength" type="integer" value="76" label="Max Read Length"/>
+ </when>
+ <when value="MissingReadGroup">
+ <!-- no extra options -->
+ </when>
+ <when value="NoOriginalQualityScores">
+ <!-- no extra options -->
+ </when>
+ <when value="NotPrimaryAlignment">
+ <!-- no extra options -->
+ </when>
+ <when value="Platform454">
+ <!-- no extra options -->
+ </when>
+ <when value="Platform">
+ <param name="PLFilterName" type="text" value="" label="Discard reads with RG:PL attribute containing this string"/>
+ </when>
+ <when value="PlatformUnit">
+ <!-- no extra options -->
+ </when>
+ <when value="ReadGroupBlackList">
+ <!-- no extra options -->
+ </when>
+ <when value="ReadName">
+ <param name="readName" type="text" value="" label="Filter out all reads except those with this read name"/>
+ </when>
+ <when value="ReadStrand">
+ <param name="filterPositive" type="boolean" truevalue="--filterPositive" falsevalue="" label="Discard reads on the forward strand"/>
+ </when>
+ <when value="ReassignMappingQuality">
+ <param name="default_mapping_quality" type="integer" value="60" label="Default read mapping quality to assign to all reads"/>
+ </when>
+ <when value="Sample">
+ <param name="sample_to_keep" type="text" value="" label="The name of the sample(s) to keep, filtering out all others"/>
+ </when>
+ <when value="SingleReadGroup">
+ <param name="read_group_to_keep" type="text" value="" label="The name of the read group to keep, filtering out all others"/>
+ </when>
+ <when value="UnmappedRead">
+ <!-- no extra options -->
+ </when>
+ </conditional>
+ </repeat>
+ <repeat name="input_interval_repeat" title="Operate on Genomic intervals">
+ <param name="input_intervals" type="data" format="bed,gatk_interval,picard_interval_list,vcf" label="Genomic intervals" />
+ </repeat>
+ <repeat name="input_exclude_interval_repeat" title="Exclude Genomic intervals">
+ <param name="input_exclude_intervals" type="data" format="bed,gatk_interval,picard_interval_list,vcf" label="Genomic intervals" />
+ </repeat>
+
+ <param name="interval_set_rule" type="select" label="Interval set rule">
+ <option value="UNION" selected="True">UNION</option>
+ <option value="INTERSECTION">INTERSECTION</option>
+ </param>
+
+ <conditional name="downsampling_type">
+ <param name="downsampling_type_selector" type="select" label="Type of reads downsampling to employ at a given locus" help="Downsampling Type">
+ <option value="NONE" selected="True">NONE</option>
+ <option value="ALL_READS">ALL_READS</option>
+ <option value="BY_SAMPLE">BY_SAMPLE</option>
+ </param>
+ <when value="NONE">
+ <!-- no more options here -->
+ </when>
+ <when value="ALL_READS">
+ <conditional name="downsample_to_type">
+ <param name="downsample_to_type_selector" type="select" label="Downsample by fraction or by coverage" help="Downsampling method">
+ <option value="downsample_to_fraction" selected="True">Downsample by Fraction</option>
+ <option value="downsample_to_coverage">Downsample by Coverage</option>
+ </param>
+ <when value="downsample_to_fraction">
+ <param name="downsample_to_value" type="float" label="Fraction [0.0-1.0] of reads to downsample to" value="1" min="0" max="1"/>
+ </when>
+ <when value="downsample_to_coverage">
+ <param name="downsample_to_value" type="integer" label="Coverage to downsample to at any given locus" value="0"/>
+ </when>
+ </conditional>
+ </when>
+ <when value="BY_SAMPLE">
+ <conditional name="downsample_to_type">
+ <param name="downsample_to_type_selector" type="select" label="Downsample by fraction or by coverage" help="Downsampling method">
+ <option value="downsample_to_fraction" selected="True">Downsample by Fraction</option>
+ <option value="downsample_to_coverage">Downsample by Coverage</option>
+ </param>
+ <when value="downsample_to_fraction">
+ <param name="downsample_to_value" type="float" label="Fraction [0.0-1.0] of reads to downsample to" value="1" min="0" max="1"/>
+ </when>
+ <when value="downsample_to_coverage">
+ <param name="downsample_to_value" type="integer" label="Coverage to downsample to at any given locus" value="0"/>
+ </when>
+ </conditional>
+ </when>
+ </conditional>
+ <param name="baq" type="select" label="Type of BAQ calculation to apply in the engine">
+ <option value="OFF" selected="True">OFF</option>
+ <option value="CALCULATE_AS_NECESSARY">CALCULATE_AS_NECESSARY</option>
+ <option value="RECALCULATE">RECALCULATE</option>
+ </param>
+ <param name="baq_gap_open_penalty" type="float" label="BAQ gap open penalty (Phred Scaled)" value="40" help="Default value is 40. 30 is perhaps better for whole genome call sets."/>
+ <param name="use_original_qualities" type="boolean" truevalue="--useOriginalQualities" falsevalue="" label="Use the original base quality scores from the OQ tag" />
+ <param name="default_base_qualities" type="integer" label="Value to be used for all base quality scores, when some are missing" value="-1"/>
+ <param name="validation_strictness" type="select" label="How strict should we be with validation">
+ <option value="STRICT" selected="True">STRICT</option>
+ <option value="LENIENT">LENIENT</option>
+ <option value="SILENT">SILENT</option>
+ <!-- <option value="DEFAULT_STRINGENCY">DEFAULT_STRINGENCY</option> listed in docs, but not valid value...-->
+ </param>
+ <param name="interval_merging" type="select" label="Interval merging rule">
+ <option value="ALL" selected="True">ALL</option>
+ <option value="OVERLAPPING_ONLY">OVERLAPPING_ONLY</option>
+ </param>
+
+ <repeat name="read_group_black_list_repeat" title="Read group black list">
+ <conditional name="read_group_black_list_type">
+ <param name="read_group_black_list_type_selector" type="select" label="Type of reads read group black list">
+ <option value="file" selected="True">Filters in file</option>
+ <option value="text">Specify filters as a string</option>
+ </param>
+ <when value="file">
+ <param name="read_group_black_list" type="data" format="txt" label="Read group black list file" />
+ </when>
+ <when value="text">
+ <param name="read_group_black_list" type="text" value="tag:string" label="Read group black list tag:string" />
+ </when>
+ </conditional>
+ </repeat>
+
+ <param name="disable_experimental_low_memory_sharding" type="boolean" truevalue="--disable_experimental_low_memory_sharding" falsevalue="" label="Disable experimental low-memory sharding functionality." checked="False"/>
+ <param name="non_deterministic_random_seed" type="boolean" truevalue="--nonDeterministicRandomSeed" falsevalue="" label="Make the GATK behave non-deterministically; the random numbers generated will differ in every run" checked="False" />
+
+ </when>
+ </conditional>
+
+ </inputs>
+ <outputs>
+ <data format="bam" name="output_bam" label="${tool.name} on ${on_string} (BAM)" />
+ <data format="txt" name="output_log" label="${tool.name} on ${on_string} (log)" />
+ </outputs>
+ <tests>
+ <test>
+ <param name="reference_source_selector" value="history" />
+ <param name="ref_file" value="phiX.fasta" ftype="fasta" />
+ <param name="input_bam" value="gatk/gatk_table_recalibration/gatk_table_recalibration_out_1.bam" ftype="bam" />
+ <param name="number" value="-1" />
+ <param name="platform" value="" />
+ <param name="read_group" value="" />
+ <param name="sample_file_repeat" value="0" />
+ <param name="sample_name_repeat" value="0" />
+ <param name="gatk_param_type_selector" value="basic" />
+ <output name="output_bam" file="gatk/gatk_table_recalibration/gatk_table_recalibration_out_1.bam" ftype="bam" />
+ <output name="output_log" file="gatk/gatk_print_reads/gatk_print_reads_out_1.log.contains" compare="contains" />
+ </test>
+ </tests>
+ <help>
+**What it does**
+
+PrintReads can dynamically merge the contents of multiple input BAM files, resulting in merged output sorted in coordinate order.
+
+For more information on the GATK Print Reads Walker, see this `tool specific page <http://www.broadinstitute.org/gsa/gatkdocs/release/org_broadinstitute_sting_gatk_walkers_PrintReadsWalker.html>`_.
+
+To learn about best practices for variant detection using GATK, see this `overview <http://www.broadinstitute.org/gsa/wiki/index.php/Best_Practice_Variant_Detection_with_the_GATK_v3>`_.
+
+If you encounter errors, please view the `GATK FAQ <http://www.broadinstitute.org/gsa/wiki/index.php/Frequently_Asked_Questions>`_.
+
+------
+
+**Inputs**
+
+GenomeAnalysisTK: PrintReads accepts one or more BAM or SAM input files.
+
+
+**Outputs**
+
+The output is in BAM format.
+
+
+Go `here <http://www.broadinstitute.org/gsa/wiki/index.php/Input_files_for_the_GATK>`_ for details on GATK file formats.
+
+-------
+
+**Settings**::
+
+ number int -1 Print the first n reads from the file, discarding the rest
+ platform String NA Exclude all reads with this platform from the output
+ readGroup String NA Exclude all reads with this read group from the output
+ sample_file Set[File] [] File containing a list of samples (one per line). Can be specified multiple times
+ sample_name Set[String] [] Sample name to be included in the analysis. Can be specified multiple times.
+
+------
+
+**Citation**
+
+For the underlying tool, please cite `DePristo MA, Banks E, Poplin R, Garimella KV, Maguire JR, Hartl C, Philippakis AA, del Angel G, Rivas MA, Hanna M, McKenna A, Fennell TJ, Kernytsky AM, Sivachenko AY, Cibulskis K, Gabriel SB, Altshuler D, Daly MJ. A framework for variation discovery and genotyping using next-generation DNA sequencing data. Nat Genet. 2011 May;43(5):491-8. <http://www.ncbi.nlm.nih.gov/pubmed/21478889>`_
+
+If you use this tool in Galaxy, please cite Blankenberg D, et al. *In preparation.*
+
+ </help>
+</tool>
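The `<command>` template above emits two kinds of arguments to its Python wrapper: `-p` blocks of text that are passed through verbatim to the GATK command line, and `-d` quadruples describing an input dataset (flag, file path, extension, symlink name). The `gatk_wrapper.py` that consumes them is not included in this message; the sketch below is a hypothetical reconstruction of how such options could be parsed, using `optparse` in the same style as the SamToFastq wrapper. The option names and the four-field `-d` layout are taken from the template; everything else (function name, help strings) is illustrative.

```python
# Hypothetical sketch: parsing the -p (pass-through) and -d (dataset)
# options produced by the tool's <command> template. Not the real
# gatk_wrapper.py, whose interface may differ.
import optparse

def parse_wrapper_args(argv):
    parser = optparse.OptionParser()
    # Each -p value is appended verbatim to the GATK command line.
    parser.add_option('-p', '--pass_through', dest='pass_through',
                      action='append', default=[],
                      help='Text passed through to the GATK command line.')
    # Each -d takes four values: flag, file path, extension, symlink name.
    parser.add_option('-d', '--dataset', dest='datasets',
                      action='append', nargs=4, default=[],
                      help='An input dataset: flag, path, ext, link name.')
    return parser.parse_args(argv)

options, args = parse_wrapper_args([
    '-p', '--validation_strictness "STRICT"',
    '-d', '--read_group_black_list', 'rg.txt', 'txt',
          'input_read_group_black_list_0',
])
```

`optparse` (unlike `argparse`) will happily consume a value that itself begins with `--`, which is why pass-through strings like `--validation_strictness "STRICT"` can be handed over as single quoted arguments.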
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: dan: Enhance Picard SamToFastq to allow read group aware processing.
by Bitbucket 24 Jan '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/82797507cbc9/
changeset: 82797507cbc9
user: dan
date: 2012-01-24 21:03:35
summary: Enhance Picard SamToFastq to allow read group aware processing.
affected #: 2 files
diff -r 226c3216eaf8565c215032a065018789f8f78bd8 -r 82797507cbc95c5a5123bdb0a77bce7bb0e81727 tools/picard/picard_SamToFastq.xml
--- a/tools/picard/picard_SamToFastq.xml
+++ b/tools/picard/picard_SamToFastq.xml
@@ -1,10 +1,13 @@
-<tool id="picard_SamToFastq" name="SAM to FASTQ" version="1.56.0">
+<tool id="picard_SamToFastq" name="SAM to FASTQ" version="1.56.1" force_history_refresh="True">
+  <description>creates a FASTQ file</description>
+  <requirements><requirement type="package" version="1.56.0">picard</requirement></requirements>
+  <!-- Dan Blankenberg -->
- <command>java -XX:DefaultMaxRAMFraction=1 -XX:+UseParallelGC
+ <command interpreter="python">picard_SamToFastq_wrapper.py
+ -p '
+ java -XX:DefaultMaxRAMFraction=1 -XX:+UseParallelGC
-jar "${GALAXY_DATA_INDEX_DIR}/shared/jars/picard/SamToFastq.jar"
INPUT="${input_sam}"
+ VALIDATION_STRINGENCY="LENIENT"
RE_REVERSE=${re_reverse}
INCLUDE_NON_PF_READS=${include_non_pf_reads}
#if str( $clipping_attribute ):
@@ -34,13 +37,26 @@
READ2_MAX_BASES_TO_WRITE="${single_paired_end_type.read2_max_bases_to_write}"
#end if
#end if
+ '
#else:
- #raise Exception( 'Per Read Group not yet supported.' )
OUTPUT_PER_RG=true
- OUTPUT_DIR="./picard_sam_to_fastq_tmp_dir/"
+ #if str( $single_paired_end_type.single_paired_end_type_selector ) == 'paired':
+ '
+ --read_group_file_2 "${output_fastq2}"
+ --file_id_2 "${output_fastq2.id}"
+ -p '
+ #if str( $single_paired_end_type.read2_trim ):
+ READ2_TRIM="${single_paired_end_type.read2_trim}"
+ #end if
+ #if str( $single_paired_end_type.read2_max_bases_to_write ):
+ READ2_MAX_BASES_TO_WRITE="${single_paired_end_type.read2_max_bases_to_write}"
+ #end if
+ #end if
+ '
+ --read_group_file_1 "${output_fastq1}"
+ --new_files_path "${$__new_file_path__}"
+ --file_id_1 "${output_fastq1.id}"
#end if
- 2>&1
- || echo "Error running SamToFastq" >&2
</command>
<inputs>
<param name="input_sam" type="data" format="sam,bam" label="BAM/SAM file" />
@@ -48,8 +64,7 @@
<param name="read1_max_bases_to_write" type="integer" optional="True" value="" label="The maximum number of bases to write from read 1 after trimming." />
<param name="output_per_read_group_selector" type="select" label="Output per read group">
<option value="per_sam_file" selected="True">Per BAM/SAM file</option>
- <!-- <option value="per_read_group">Per Read Group</option> -->
- <validator type="expression" message="Per Read Group selection is not yet implemented">value == 'per_sam_file'</validator>
+ <option value="per_read_group">Per Read Group</option>
</param>
<conditional name="single_paired_end_type">
<param name="single_paired_end_type_selector" type="select" label="Single or Paired end">
@@ -107,6 +122,22 @@
<output name="output_fastq1" file="bwa_wrapper_in2.fastqsanger" lines_diff="64"/><!-- 16 unaligned fastq blocks not present in original sam file -->
<output name="output_fastq2" file="bwa_wrapper_in3.fastqsanger" lines_diff="64"/><!-- 16 unaligned fastq blocks not present in original sam file -->
</test>
+ <test>
+ <param name="input_sam" value="bwa_wrapper_out3.sam" ftype="sam" />
+ <param name="output_per_read_group_selector" value="per_read_group" />
+ <param name="single_paired_end_type_selector" value="paired" />
+ <param name="read1_trim" value="" />
+ <param name="read1_max_bases_to_write" value="" />
+ <param name="read2_trim" value="" />
+ <param name="read2_max_bases_to_write" value="" />
+ <param name="re_reverse" value="True" />
+ <param name="include_non_pf_reads" value="False" />
+ <param name="clipping_action" value="" />
+ <param name="clipping_attribute" value="" />
+ <param name="include_non_primary_alignments" value="False" />
+ <output name="output_fastq1" file="bwa_wrapper_in2.fastqsanger" lines_diff="64"/><!-- 16 unaligned fastq blocks not present in original sam file -->
+ <output name="output_fastq2" file="bwa_wrapper_in3.fastqsanger" lines_diff="64"/><!-- 16 unaligned fastq blocks not present in original sam file -->
+ </test>
</tests>
<help>
**What it does**
diff -r 226c3216eaf8565c215032a065018789f8f78bd8 -r 82797507cbc95c5a5123bdb0a77bce7bb0e81727 tools/picard/picard_SamToFastq_wrapper.py
--- /dev/null
+++ b/tools/picard/picard_SamToFastq_wrapper.py
@@ -0,0 +1,93 @@
+#!/usr/bin/env python
+#Dan Blankenberg
+
+"""
+A wrapper script for running the Picard SamToFastq command. Allows parsing read groups into separate files.
+"""
+
+import sys, optparse, os, tempfile, subprocess, shutil
+
+CHUNK_SIZE = 2**20 #1mb
+
+
+def cleanup_before_exit( tmp_dir ):
+ if tmp_dir and os.path.exists( tmp_dir ):
+ shutil.rmtree( tmp_dir )
+
+def open_file_from_option( filename, mode = 'rb' ):
+ if filename:
+ return open( filename, mode = mode )
+ return None
+
+def __main__():
+ #Parse Command Line
+ parser = optparse.OptionParser()
+ parser.add_option( '-p', '--pass_through', dest='pass_through_options', action='append', type="string", help='These options are passed through directly to PICARD, without any modification.' )
+ parser.add_option( '-1', '--read_group_file_1', dest='read_group_file_1', action='store', type="string", default=None, help='Read Group 1 output file, when using multiple readgroups' )
+ parser.add_option( '-2', '--read_group_file_2', dest='read_group_file_2', action='store', type="string", default=None, help='Read Group 2 output file, when using multiple readgroups and paired end' )
+ parser.add_option( '', '--stdout', dest='stdout', action='store', type="string", default=None, help='If specified, the output of stdout will be written to this file.' )
+ parser.add_option( '', '--stderr', dest='stderr', action='store', type="string", default=None, help='If specified, the output of stderr will be written to this file.' )
+ parser.add_option( '-n', '--new_files_path', dest='new_files_path', action='store', type="string", default=None, help='new_files_path')
+ parser.add_option( '-i', '--file_id_1', dest='file_id_1', action='store', type="string", default=None, help='file_id_1')
+ parser.add_option( '-f', '--file_id_2', dest='file_id_2', action='store', type="string", default=None, help='file_id_2')
+ (options, args) = parser.parse_args()
+
+ tmp_dir = tempfile.mkdtemp( prefix='tmp-picard-' )
+ if options.pass_through_options:
+ cmd = ' '.join( options.pass_through_options )
+ else:
+ cmd = ''
+ if options.new_files_path is not None:
+ print 'Creating FASTQ files by Read Group'
+ assert None not in [ options.read_group_file_1, options.new_files_path, options.file_id_1 ], 'When using read-group-aware output, you need to specify --read_group_file_1, --read_group_file_2 (when paired end), --new_files_path, and --file_id_1'
+ cmd = '%s OUTPUT_DIR="%s"' % ( cmd, tmp_dir)
+ #set up stdout and stderr output options
+ stdout = open_file_from_option( options.stdout, mode = 'wb' )
+ if stdout is None:
+ stdout = sys.stdout
+ stderr = open_file_from_option( options.stderr, mode = 'wb' )
+ #if no stderr file is specified, we'll use our own
+ if stderr is None:
+ stderr = tempfile.NamedTemporaryFile( prefix="picard-stderr-", dir=tmp_dir )
+
+ proc = subprocess.Popen( args=cmd, stdout=stdout, stderr=stderr, shell=True, cwd=tmp_dir )
+ return_code = proc.wait()
+
+ if return_code:
+ stderr_target = sys.stderr
+ else:
+ stderr_target = sys.stdout
+ stderr.flush()
+ stderr.seek(0)
+ while True:
+ chunk = stderr.read( CHUNK_SIZE )
+ if chunk:
+ stderr_target.write( chunk )
+ else:
+ break
+ stderr.close()
+ #if rg aware, put files where they belong
+ if options.new_files_path is not None:
+ fastq_1_name = options.read_group_file_1
+ fastq_2_name = options.read_group_file_2
+ file_id_1 = options.file_id_1
+ file_id_2 = options.file_id_2
+ if file_id_2 is None:
+ file_id_2 = file_id_1
+ for filename in sorted( os.listdir( tmp_dir ) ):
+ if filename.endswith( '_1.fastq' ):
+ if fastq_1_name:
+ shutil.move( os.path.join( tmp_dir, filename ), fastq_1_name )
+ fastq_1_name = None
+ else:
+ shutil.move( os.path.join( tmp_dir, filename ), os.path.join( options.new_files_path, 'primary_%s_%s - 1_visible_fastq' % ( file_id_1, filename[:-len( '_1.fastq' )] ) ) )
+ elif filename.endswith( '_2.fastq' ):
+ if fastq_2_name:
+ shutil.move( os.path.join( tmp_dir, filename ), fastq_2_name )
+ fastq_2_name = None
+ else:
+ shutil.move( os.path.join( tmp_dir, filename ), os.path.join( options.new_files_path, 'primary_%s_%s - 2_visible_fastq' % ( file_id_2, filename[:-len( '_2.fastq' )] ) ) )
+
+ cleanup_before_exit( tmp_dir )
+
+if __name__=="__main__": __main__()
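The read-group-aware branch of the wrapper above routes Picard's per-read-group `<rg>_1.fastq` / `<rg>_2.fastq` outputs: the first pair (in sorted order) fills the two declared Galaxy datasets, and every additional read group is written using Galaxy's `primary_<id>_<name>_visible_<ext>` auto-discovery naming. A minimal standalone sketch of that routing decision, separated from the `shutil.move` side effects (the function name `route_fastq_outputs` is illustrative, not part of the commit):

```python
# Sketch of the wrapper's read-group routing: map each per-read-group
# FASTQ filename to its destination name, without touching the filesystem.
def route_fastq_outputs(filenames, primary_1, primary_2, file_id_1, file_id_2):
    moves = {}
    if file_id_2 is None:
        file_id_2 = file_id_1  # single-end runs reuse the read-1 dataset id
    for filename in sorted(filenames):
        if filename.endswith('_1.fastq'):
            if primary_1:
                moves[filename] = primary_1  # first read group -> primary dataset
                primary_1 = None
            else:
                rg = filename[:-len('_1.fastq')]
                moves[filename] = 'primary_%s_%s - 1_visible_fastq' % (file_id_1, rg)
        elif filename.endswith('_2.fastq'):
            if primary_2:
                moves[filename] = primary_2
                primary_2 = None
            else:
                rg = filename[:-len('_2.fastq')]
                moves[filename] = 'primary_%s_%s - 2_visible_fastq' % (file_id_2, rg)
    return moves
```

Because the files are visited in sorted order, which read group becomes the "primary" dataset depends only on read-group names, matching the behavior of the committed loop.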