galaxy-commits

commit/galaxy-central: dan: GATK tools will now use their own .loc file for picard indexes and will also load annotations from an external file.
by Bitbucket 05 Dec '11
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/46e455a590e5/
changeset: 46e455a590e5
user: dan
date: 2011-12-05 18:27:54
summary: GATK tools will now use their own .loc file for picard indexes and will also load annotations from an external file.
affected #: 5 files
diff -r 943855e11c853ec7c8ab7a64ee07a65daefe4374 -r 46e455a590e5923db81cda0034d6192503ce29a6 tool-data/gatk_annotations.txt.sample
--- /dev/null
+++ b/tool-data/gatk_annotations.txt.sample
@@ -0,0 +1,30 @@
+#unique_id name gatk_value tools_valid_for
+AlleleBalance AlleleBalance AlleleBalance UnifiedGenotyper,VariantAnnotator,VariantRecalibrator
+AlleleBalanceBySample AlleleBalanceBySample AlleleBalanceBySample UnifiedGenotyper,VariantAnnotator,VariantRecalibrator
+BaseCounts BaseCounts BaseCounts UnifiedGenotyper,VariantAnnotator,VariantRecalibrator
+BaseQualityRankSumTest BaseQualityRankSumTest BaseQualityRankSumTest UnifiedGenotyper,VariantAnnotator,VariantRecalibrator
+ChromosomeCounts ChromosomeCounts ChromosomeCounts UnifiedGenotyper,VariantAnnotator,VariantRecalibrator
+DepthOfCoverage DepthOfCoverage DepthOfCoverage UnifiedGenotyper,VariantAnnotator,VariantRecalibrator
+DepthPerAlleleBySample DepthPerAlleleBySample DepthPerAlleleBySample UnifiedGenotyper,VariantAnnotator,VariantRecalibrator
+FisherStrand FisherStrand FisherStrand UnifiedGenotyper,VariantAnnotator,VariantRecalibrator
+GCContent GCContent GCContent UnifiedGenotyper,VariantAnnotator,VariantRecalibrator
+HaplotypeScore HaplotypeScore HaplotypeScore UnifiedGenotyper,VariantAnnotator,VariantRecalibrator
+HardyWeinberg HardyWeinberg HardyWeinberg UnifiedGenotyper,VariantAnnotator,VariantRecalibrator
+HomopolymerRun HomopolymerRun HomopolymerRun UnifiedGenotyper,VariantAnnotator,VariantRecalibrator
+InbreedingCoeff InbreedingCoeff InbreedingCoeff UnifiedGenotyper,VariantAnnotator,VariantRecalibrator
+IndelType IndelType IndelType UnifiedGenotyper,VariantAnnotator,VariantRecalibrator
+LowMQ LowMQ LowMQ UnifiedGenotyper,VariantAnnotator,VariantRecalibrator
+MVLikelihoodRatio MVLikelihoodRatio MVLikelihoodRatio UnifiedGenotyper,VariantAnnotator,VariantRecalibrator
+MappingQualityRankSumTest MappingQualityRankSumTest MappingQualityRankSumTest UnifiedGenotyper,VariantAnnotator,VariantRecalibrator
+MappingQualityZero MappingQualityZero MappingQualityZero UnifiedGenotyper,VariantAnnotator,VariantRecalibrator
+MappingQualityZeroBySample MappingQualityZeroBySample MappingQualityZeroBySample UnifiedGenotyper,VariantAnnotator,VariantRecalibrator
+MappingQualityZeroFraction MappingQualityZeroFraction MappingQualityZeroFraction UnifiedGenotyper,VariantAnnotator,VariantRecalibrator
+NBaseCount NBaseCount NBaseCount UnifiedGenotyper,VariantAnnotator,VariantRecalibrator
+QualByDepth QualByDepth QualByDepth UnifiedGenotyper,VariantAnnotator,VariantRecalibrator
+RMSMappingQuality RMSMappingQuality RMSMappingQuality UnifiedGenotyper,VariantAnnotator,VariantRecalibrator
+ReadDepthAndAllelicFractionBySample ReadDepthAndAllelicFractionBySample ReadDepthAndAllelicFractionBySample UnifiedGenotyper,VariantAnnotator,VariantRecalibrator
+ReadPosRankSumTest ReadPosRankSumTest ReadPosRankSumTest UnifiedGenotyper,VariantAnnotator,VariantRecalibrator
+SampleList SampleList SampleList UnifiedGenotyper,VariantAnnotator,VariantRecalibrator
+SnpEff SnpEff SnpEff VariantAnnotator,VariantRecalibrator
+SpanningDeletions SpanningDeletions SpanningDeletions UnifiedGenotyper,VariantAnnotator,VariantRecalibrator
+TechnologyComposition TechnologyComposition TechnologyComposition UnifiedGenotyper,VariantAnnotator,VariantRecalibrator
diff -r 943855e11c853ec7c8ab7a64ee07a65daefe4374 -r 46e455a590e5923db81cda0034d6192503ce29a6 tool-data/gatk_sorted_picard_index.loc.sample
--- /dev/null
+++ b/tool-data/gatk_sorted_picard_index.loc.sample
@@ -0,0 +1,26 @@
+#This is a sample file distributed with Galaxy that enables tools
+#to use a directory of Picard dict and associated files. You will need
+#to create these data files and then create a gatk_sorted_picard_index.loc file
+#similar to this one (store it in this directory) that points to
+#the directories in which those files are stored. The gatk_sorted_picard_index.loc
+#file has this format (longer white space is the TAB character):
+#
+#<unique_build_id><dbkey><display_name><fasta_file_path>
+#
+#So, for example, if you had hg18 indexed and stored in
+#/depot/data2/galaxy/picard/hg18/,
+#then the gatk_sorted_picard_index.loc entry would look like this:
+#
+#hg18 hg18 hg18 Pretty /depot/data2/galaxy/picard/hg18/hg18.fa
+#
+#and your /depot/data2/galaxy/picard/hg18/ directory
+#would contain the following three files:
+#hg18.fa
+#hg18.dict
+#hg18.fa.fai
+#
+#The dictionary file for each reference (ex. hg18.dict) must be
+#created via Picard (http://picard.sourceforge.net). Note that
+#the dict file does not have the .fa extension although the
+#path listed in the loc file does include it.
+#
diff -r 943855e11c853ec7c8ab7a64ee07a65daefe4374 -r 46e455a590e5923db81cda0034d6192503ce29a6 tools/gatk/unified_genotyper.xml
--- a/tools/gatk/unified_genotyper.xml
+++ b/tools/gatk/unified_genotyper.xml
@@ -113,7 +113,7 @@
--indelHaplotypeSize "${analysis_param_type.indelHaplotypeSize}"
${analysis_param_type.doContextDependentGapPenalties}
#if str( $analysis_param_type.annotation ) != "None":
- #for $annotation in str( $analysis_param_type.annotation ).split( ','):
+ #for $annotation in str( $analysis_param_type.annotation.fields.gatk_value ).split( ','):
--annotation "${annotation}"
#end for
#end if
@@ -123,7 +123,7 @@
#end for
#end if
#if str( $analysis_param_type.exclude_annotations ) != "None":
- #for $annotation in str( $analysis_param_type.exclude_annotations ).split( ','):
+ #for $annotation in str( $analysis_param_type.exclude_annotations.fields.gatk_value ).split( ','):
--excludeAnnotation "${annotation}"
#end for
#end if
@@ -366,35 +366,11 @@
<param name="indelHaplotypeSize" type="integer" value="80" label="Indel haplotype size" /><param name="doContextDependentGapPenalties" type="boolean" truevalue="--doContextDependentGapPenalties" falsevalue="" label="Vary gap penalties by context" /><param name="annotation" type="select" multiple="True" display="checkboxes" label="Annotation Types">
- <option value="AlleleBalance" />
- <option value="AlleleBalanceBySample" />
- <option value="BaseCounts" />
- <option value="BaseQualityRankSumTest" />
- <option value="ChromosomeCounts" />
- <option value="DepthOfCoverage" />
- <option value="DepthPerAlleleBySample" />
- <option value="FisherStrand" />
- <option value="GCContent" />
- <option value="HaplotypeScore" />
- <option value="HardyWeinberg" />
- <option value="HomopolymerRun" />
- <option value="InbreedingCoeff" />
- <option value="IndelType" />
- <option value="LowMQ" />
- <option value="MVLikelihoodRatio" />
- <option value="MappingQualityRankSumTest" />
- <option value="MappingQualityZero" />
- <option value="MappingQualityZeroBySample" />
- <option value="MappingQualityZeroFraction" />
- <option value="NBaseCount" />
- <option value="QualByDepth" />
- <option value="RMSMappingQuality" />
- <option value="ReadDepthAndAllelicFractionBySample" />
- <option value="ReadPosRankSumTest" />
- <option value="SampleList" />
- <!-- <option value="SnpEff" /> -->
- <option value="SpanningDeletions" />
- <option value="TechnologyComposition" />
+ <!-- load the available annotations from an external configuration file, since additional ones can be added to local installs -->
+ <options from_data_table="gatk_annotations">
+ <filter type="multiple_splitter" column="tools_valid_for" separator=","/>
+ <filter type="static_value" value="UnifiedGenotyper" column="tools_valid_for"/>
+ </options></param><!--
<conditional name="snpEff_rod_bind_type">
@@ -420,36 +396,11 @@
</param><!-- <param name="family_string" type="text" value="" label="Family String"/> --><param name="exclude_annotations" type="select" multiple="True" display="checkboxes" label="Annotations to exclude" >
- <!-- might we want to load the available annotations from an external configuration file, since additional ones can be added to local installs? -->
- <option value="AlleleBalance" />
- <option value="AlleleBalanceBySample" />
- <option value="BaseCounts" />
- <option value="BaseQualityRankSumTest" />
- <option value="ChromosomeCounts" />
- <option value="DepthOfCoverage" />
- <option value="DepthPerAlleleBySample" />
- <option value="FisherStrand" />
- <option value="GCContent" />
- <option value="HaplotypeScore" />
- <option value="HardyWeinberg" />
- <option value="HomopolymerRun" />
- <option value="InbreedingCoeff" />
- <option value="IndelType" />
- <option value="LowMQ" />
- <option value="MVLikelihoodRatio" />
- <option value="MappingQualityRankSumTest" />
- <option value="MappingQualityZero" />
- <option value="MappingQualityZeroBySample" />
- <option value="MappingQualityZeroFraction" />
- <option value="NBaseCount" />
- <option value="QualByDepth" />
- <option value="RMSMappingQuality" />
- <option value="ReadDepthAndAllelicFractionBySample" />
- <option value="ReadPosRankSumTest" />
- <option value="SampleList" />
- <!-- <option value="SnpEff" /> -->
- <option value="SpanningDeletions" />
- <option value="TechnologyComposition" />
+ <!-- load the available annotations from an external configuration file, since additional ones can be added to local installs -->
+ <options from_data_table="gatk_annotations">
+ <filter type="multiple_splitter" column="tools_valid_for" separator=","/>
+ <filter type="static_value" value="UnifiedGenotyper" column="tools_valid_for"/>
+ </options></param></when></conditional>
diff -r 943855e11c853ec7c8ab7a64ee07a65daefe4374 -r 46e455a590e5923db81cda0034d6192503ce29a6 tools/gatk/variant_annotator.xml
--- a/tools/gatk/variant_annotator.xml
+++ b/tools/gatk/variant_annotator.xml
@@ -26,13 +26,13 @@
--useAllAnnotations
#else:
#if str( $annotations_type.annotations ) != "None":
- #for $annotation in str( $annotations_type.annotations ).split( ',' ):
+ #for $annotation in str( $annotations_type.annotations.fields.gatk_value ).split( ',' ):
--annotation "${annotation}"
#end for
#end if
#end if
#if str( $exclude_annotations ) != "None":
- #for $annotation in str( $exclude_annotations ).split( ',' ):
+ #for $annotation in str( $exclude_annotations.fields.gatk_value ).split( ',' ):
--excludeAnnotation "${annotation}"
#end for
#end if
@@ -179,36 +179,11 @@
</when><when value="choose"><param name="annotations" type="select" multiple="True" display="checkboxes" label="Annotations to apply" >
- <!-- might we want to load the available annotations from an external configuration file, since additional ones can be added to local installs? -->
- <option value="AlleleBalance" />
- <option value="AlleleBalanceBySample" />
- <option value="BaseCounts" />
- <option value="BaseQualityRankSumTest" />
- <option value="ChromosomeCounts" />
- <option value="DepthOfCoverage" />
- <option value="DepthPerAlleleBySample" />
- <option value="FisherStrand" />
- <option value="GCContent" />
- <option value="HaplotypeScore" />
- <option value="HardyWeinberg" />
- <option value="HomopolymerRun" />
- <option value="InbreedingCoeff" />
- <option value="IndelType" />
- <option value="LowMQ" />
- <option value="MVLikelihoodRatio" />
- <option value="MappingQualityRankSumTest" />
- <option value="MappingQualityZero" />
- <option value="MappingQualityZeroBySample" />
- <option value="MappingQualityZeroFraction" />
- <option value="NBaseCount" />
- <option value="QualByDepth" />
- <option value="RMSMappingQuality" />
- <option value="ReadDepthAndAllelicFractionBySample" />
- <option value="ReadPosRankSumTest" />
- <option value="SampleList" />
- <option value="SnpEff" />
- <option value="SpanningDeletions" />
- <option value="TechnologyComposition" />
+ <!-- load the available annotations from an external configuration file, since additional ones can be added to local installs -->
+ <options from_data_table="gatk_annotations">
+ <filter type="multiple_splitter" column="tools_valid_for" separator=","/>
+ <filter type="static_value" value="VariantAnnotator" column="tools_valid_for"/>
+ </options></param></when></conditional>
@@ -388,36 +363,11 @@
<param name="family_string" type="text" value="" label="Family String"/><param name="mendel_violation_genotype_quality_threshold" type="float" value="0.0" label="genotype quality threshold in order to annotate Mendelian violation ratio."/><param name="exclude_annotations" type="select" multiple="True" display="checkboxes" label="Annotations to exclude" >
- <!-- might we want to load the available annotations from an external configuration file, since additional ones can be added to local installs? -->
- <option value="AlleleBalance" />
- <option value="AlleleBalanceBySample" />
- <option value="BaseCounts" />
- <option value="BaseQualityRankSumTest" />
- <option value="ChromosomeCounts" />
- <option value="DepthOfCoverage" />
- <option value="DepthPerAlleleBySample" />
- <option value="FisherStrand" />
- <option value="GCContent" />
- <option value="HaplotypeScore" />
- <option value="HardyWeinberg" />
- <option value="HomopolymerRun" />
- <option value="InbreedingCoeff" />
- <option value="IndelType" />
- <option value="LowMQ" />
- <option value="MVLikelihoodRatio" />
- <option value="MappingQualityRankSumTest" />
- <option value="MappingQualityZero" />
- <option value="MappingQualityZeroBySample" />
- <option value="MappingQualityZeroFraction" />
- <option value="NBaseCount" />
- <option value="QualByDepth" />
- <option value="RMSMappingQuality" />
- <option value="ReadDepthAndAllelicFractionBySample" />
- <option value="ReadPosRankSumTest" />
- <option value="SampleList" />
- <option value="SnpEff" />
- <option value="SpanningDeletions" />
- <option value="TechnologyComposition" />
+ <!-- load the available annotations from an external configuration file, since additional ones can be added to local installs -->
+ <options from_data_table="gatk_annotations">
+ <filter type="multiple_splitter" column="tools_valid_for" separator=","/>
+ <filter type="static_value" value="VariantAnnotator" column="tools_valid_for"/>
+ </options></param></inputs>
diff -r 943855e11c853ec7c8ab7a64ee07a65daefe4374 -r 46e455a590e5923db81cda0034d6192503ce29a6 tools/gatk/variant_recalibrator.xml
--- a/tools/gatk/variant_recalibrator.xml
+++ b/tools/gatk/variant_recalibrator.xml
@@ -100,7 +100,7 @@
##start analysis specific options
-p '
#if str( $annotations ) != "None":
- #for $annotation in str( $annotations ).split( ',' ):
+ #for $annotation in str( $annotations.fields.gatk_value ).split( ',' ):
--use_annotation "${annotation}"
#end for
#end if
@@ -337,36 +337,11 @@
</repeat><param name="annotations" type="select" multiple="True" display="checkboxes" label="annotations which should used for calculations">
- <!-- might we want to load the available annotations from an external configuration file, since additional ones can be added to local installs? -->
- <option value="ChromosomeCounts"/>
- <option value="IndelType"/>
- <option value="SpanningDeletions"/>
- <option value="HardyWeinberg"/>
- <option value="NBaseCount"/>
- <option value="MappingQualityZero"/>
- <option value="AlleleBalance"/>
- <option value="BaseCounts"/>
- <option value="LowMQ"/>
- <option value="InbreedingCoeff"/>
- <option value="RMSMappingQuality"/>
- <option value="HaplotypeScore"/>
- <option value="TechnologyComposition"/>
- <option value="SampleList"/>
- <option value="FisherStrand"/>
- <option value="HomopolymerRun"/>
- <option value="DepthOfCoverage"/>
- <option value="SnpEff"/>
- <option value="MappingQualityZeroFraction"/>
- <option value="GCContent"/>
- <option value="MappingQualityRankSumTest"/>
- <option value="ReadPosRankSumTest"/>
- <option value="BaseQualityRankSumTest"/>
- <option value="QualByDepth"/>
- <option value="SBByDepth"/>
- <option value="ReadDepthAndAllelicFractionBySample"/>
- <option value="AlleleBalanceBySample"/>
- <option value="DepthPerAlleleBySample"/>
- <option value="MappingQualityZeroBySample"/>
+ <!-- load the available annotations from an external configuration file, since additional ones can be added to local installs -->
+ <options from_data_table="gatk_annotations">
+ <filter type="multiple_splitter" column="tools_valid_for" separator=","/>
+ <filter type="static_value" value="VariantRecalibrator" column="tools_valid_for"/>
+ </options></param><param name="mode" type="select" label="Recalibration mode">
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
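The filter pair in the new <options> blocks above can be sketched in plain Python (a hypothetical stand-in, not Galaxy's implementation): read rows shaped like gatk_annotations.txt.sample, split the comma-separated tools_valid_for column, and keep only the annotations valid for one tool.

```python
# Rows mirror gatk_annotations.txt.sample: unique_id, name, gatk_value,
# and a comma-separated tools_valid_for column.
SAMPLE_ROWS = [
    ("AlleleBalance", "AlleleBalance", "AlleleBalance",
     "UnifiedGenotyper,VariantAnnotator,VariantRecalibrator"),
    ("SnpEff", "SnpEff", "SnpEff",
     "VariantAnnotator,VariantRecalibrator"),
]

def annotations_valid_for(tool, rows=SAMPLE_ROWS):
    """Mimic the multiple_splitter + static_value filter pair."""
    valid = []
    for unique_id, name, gatk_value, tools_valid_for in rows:
        # multiple_splitter: split the column on ","; static_value: keep
        # only rows whose split values include the requesting tool.
        if tool in tools_valid_for.split(","):
            valid.append((name, gatk_value))
    return valid

# SnpEff is excluded for UnifiedGenotyper, matching the previously
# commented-out <option value="SnpEff"/> in unified_genotyper.xml.
print(annotations_valid_for("UnifiedGenotyper"))
```

Because the table lives in a loc-style file, local installs can add annotations without editing the tool XML, which is the point of the commit.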
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.

commit/galaxy-central: jgoecks: Look for datasets/indices in metadata and use if found rather than doing conversion.
by Bitbucket 05 Dec '11
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/943855e11c85/
changeset: 943855e11c85
user: jgoecks
date: 2011-12-05 18:19:48
summary: Look for datasets/indices in metadata and use if found rather than doing conversion.
affected #: 2 files
diff -r 5e67b3ddbda241bf8a24969ac6caa47e9cbe5d9b -r 943855e11c853ec7c8ab7a64ee07a65daefe4374 lib/galaxy/model/__init__.py
--- a/lib/galaxy/model/__init__.py
+++ b/lib/galaxy/model/__init__.py
@@ -928,7 +928,10 @@
depends_list = trans.app.datatypes_registry.converter_deps[self.extension][target_ext]
except KeyError:
depends_list = []
- # See if converted dataset already exists
+ # See if converted dataset already exists, either in metadata or in conversions.
+ converted_dataset = self.get_metadata_dataset( trans, target_ext )
+ if converted_dataset:
+ return converted_dataset
converted_dataset = self.get_converted_files_by_type( target_ext )
if converted_dataset:
return converted_dataset
@@ -958,6 +961,21 @@
session.add( assoc )
session.flush()
return None
+ def get_metadata_dataset( self, trans, dataset_ext ):
+ """
+ Returns an HDA that points to a metadata file which contains a
+ converted data with the requested extension.
+ """
+ for name, value in self.metadata.items():
+ # HACK: MetadataFile objects do not have a type/ext, so need to use metadata name
+ # to determine type.
+ if dataset_ext == 'bai' and name == 'bam_index' and isinstance( value, trans.app.model.MetadataFile ):
+ # HACK: MetadataFile objects cannot be used by tools, so return
+ # a fake HDA that points to metadata file.
+ fake_dataset = trans.app.model.Dataset( state=trans.app.model.Dataset.states.OK,
+ external_filename=value.file_name )
+ fake_hda = trans.app.model.HistoryDatasetAssociation( dataset=fake_dataset )
+ return fake_hda
def clear_associated_files( self, metadata_safe = False, purge = False ):
raise 'Unimplemented'
def get_child_by_designation(self, designation):
diff -r 5e67b3ddbda241bf8a24969ac6caa47e9cbe5d9b -r 943855e11c853ec7c8ab7a64ee07a65daefe4374 lib/galaxy/web/controllers/tracks.py
--- a/lib/galaxy/web/controllers/tracks.py
+++ b/lib/galaxy/web/controllers/tracks.py
@@ -546,7 +546,7 @@
valid_chroms = indexer.valid_chroms()
else:
# Standalone data provider
- standalone_provider = get_data_provider(data_sources['data_standalone']['name'])( dataset )
+ standalone_provider = get_data_provider( data_sources['data_standalone']['name'] )( dataset )
kwargs = {"stats": True}
if not standalone_provider.has_data( chrom ):
return messages.NO_DATA
@@ -609,16 +609,8 @@
else:
tracks_dataset_type = data_sources['data']['name']
data_provider_class = get_data_provider( name=tracks_dataset_type, original_dataset=dataset )
- # HACK: Use bai from bam HDA's metadata if available. This saves
- # the client from waiting a long time to generate a duplicate
- # bam via a converted dataset.
- if dataset.ext == "bam" and dataset.metadata.get( "bam_index", None ) is not None:
- converted_dataset = dataset.metadata.bam_index
- deps = None
- else:
- # Default behavior.
- converted_dataset = dataset.get_converted_dataset( trans, tracks_dataset_type )
- deps = dataset.get_converted_dataset_deps( trans, tracks_dataset_type )
+ converted_dataset = dataset.get_converted_dataset( trans, tracks_dataset_type )
+ deps = dataset.get_converted_dataset_deps( trans, tracks_dataset_type )
data_provider = data_provider_class( converted_dataset=converted_dataset, original_dataset=dataset, dependencies=deps )
# Get and return data from data_provider.
@@ -1037,16 +1029,15 @@
data_sources_dict[ source_type ] = { "name" : data_source, "message": msg }
return data_sources_dict
-
+
def _convert_dataset( self, trans, dataset, target_type ):
"""
Converts a dataset to the target_type and returns a message indicating
status of the conversion. None is returned to indicate that dataset
was converted successfully.
"""
-
- # Get converted dataset; this will start the conversion if
- # necessary.
+
+ # Get converted dataset; this will start the conversion if necessary.
try:
converted_dataset = dataset.get_converted_dataset( trans, target_type )
except NoConverterException:
@@ -1063,7 +1054,7 @@
msg = { 'kind': messages.ERROR, 'message': job.stderr }
elif not converted_dataset or converted_dataset.state != model.Dataset.states.OK:
msg = messages.PENDING
-
+
return msg
def _get_highest_priority_msg( message_list ):
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
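The lookup order this commit adds can be sketched as follows (an assumed simplification with illustrative names, not Galaxy's real classes): prefer a converted file already stored in metadata, such as a bam's bai index, before falling back to a converted dataset or starting a conversion job.

```python
class FakeHDA:
    """Stand-in for the fake HistoryDatasetAssociation the commit builds
    around a metadata file so tools can consume it."""
    def __init__(self, path):
        self.file_name = path

def get_converted_dataset(metadata, conversions, target_ext):
    # 1. Metadata may already hold the converted file (bam_index -> bai),
    #    which saves regenerating a duplicate via a converter.
    if target_ext == "bai" and "bam_index" in metadata:
        return FakeHDA(metadata["bam_index"])
    # 2. Fall back to a previously converted dataset.
    if target_ext in conversions:
        return conversions[target_ext]
    # 3. Otherwise a conversion job would be started (omitted here).
    return None

hda = get_converted_dataset({"bam_index": "/data/123.bai"}, {}, "bai")
```

This also lets the tracks controller drop its bam-specific HACK branch, since the generic get_converted_dataset path now finds the index itself.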

commit/galaxy-central: dan: Allow the select_param.fields.name method of accessing additional attributes from dynamic options to work for multiple selects.
by Bitbucket 05 Dec '11
by Bitbucket 05 Dec '11
05 Dec '11
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/5e67b3ddbda2/
changeset: 5e67b3ddbda2
user: dan
date: 2011-12-05 17:37:46
summary: Allow the select_param.fields.name method of accessing additional attributes from dynamic options to work for multiple selects.
affected #: 2 files
diff -r 52de9815a7c4438c03daaa478df29c037e18f2a1 -r 5e67b3ddbda241bf8a24969ac6caa47e9cbe5d9b lib/galaxy/tools/__init__.py
--- a/lib/galaxy/tools/__init__.py
+++ b/lib/galaxy/tools/__init__.py
@@ -2190,11 +2190,14 @@
Only applicable for dynamic_options selects, which have more than simple 'options' defined (name, value, selected).
"""
def __init__( self, input, value, other_values ):
- self.input = input
- self.value = value
- self.other_values = other_values
+ self._input = input
+ self._value = value
+ self._other_values = other_values
+ self._fields = {}
def __getattr__( self, name ):
- return self.input.options.get_field_by_name_for_value( name, self.value, None, self.other_values )
+ if name not in self._fields:
+ self._fields[ name ] = self._input.options.get_field_by_name_for_value( name, self._value, None, self._other_values )
+ return self._input.separator.join( map( str, self._fields[ name ] ) )
def __init__( self, input, value, app, other_values={} ):
self.input = input
diff -r 52de9815a7c4438c03daaa478df29c037e18f2a1 -r 5e67b3ddbda241bf8a24969ac6caa47e9cbe5d9b lib/galaxy/tools/parameters/dynamic_options.py
--- a/lib/galaxy/tools/parameters/dynamic_options.py
+++ b/lib/galaxy/tools/parameters/dynamic_options.py
@@ -525,13 +525,18 @@
"""
Get contents of field by name for specified value.
"""
+ rval = []
if isinstance( field_name, int ):
field_index = field_name
else:
assert field_name in self.columns, "Requested '%s' column missing from column def" % field_name
field_index = self.columns[ field_name ]
- for fields in self.get_fields_by_value( value, trans, other_values ):
- return fields[ field_index ]
+ if not isinstance( value, list ):
+ value = [value]
+ for val in value:
+ for fields in self.get_fields_by_value( val, trans, other_values ):
+ rval.append( fields[ field_index ] )
+ return rval
def get_options( self, trans, other_values ):
rval = []
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
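The wrapper change above can be sketched like this (an assumed simplification of SelectToolParameterWrapper, with a toy table in place of the real dynamic options): with a multiple select, a field lookup must gather one value per selected option and join them with the parameter's separator, instead of returning only the first match.

```python
COLUMNS = {"gatk_value": 2}  # column-name -> field index, as in the loc table
TABLE = {
    "AlleleBalance": ("AlleleBalance", "AlleleBalance", "AlleleBalance"),
    "BaseCounts": ("BaseCounts", "BaseCounts", "BaseCounts"),
}

def get_field_by_name_for_value(field_name, value):
    """Old behavior returned the first match; the commit makes this
    return one field value per selected value."""
    if not isinstance(value, list):
        value = [value]
    index = COLUMNS[field_name]
    return [TABLE[v][index] for v in value]

class FieldsWrapper:
    def __init__(self, value, separator=","):
        self._value, self._separator, self._fields = value, separator, {}
    def __getattr__(self, name):
        # Cache the per-value lookups, then join for template use,
        # mirroring the new SelectToolParameterWrapper behavior.
        if name not in self._fields:
            self._fields[name] = get_field_by_name_for_value(name, self._value)
        return self._separator.join(map(str, self._fields[name]))

w = FieldsWrapper(["AlleleBalance", "BaseCounts"])
print(w.gatk_value)  # -> AlleleBalance,BaseCounts
```

This is what lets the GATK tool XML write str( $param.fields.gatk_value ).split( ',' ) and get every selected annotation back.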

commit/galaxy-central: dannon: Fix for workflow multiple input regression from 6185:07f3f601d645.
by Bitbucket 05 Dec '11
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/52de9815a7c4/
changeset: 52de9815a7c4
user: dannon
date: 2011-12-05 17:29:31
summary: Fix for workflow multiple input regression from 6185:07f3f601d645.
affected #: 1 file
diff -r e69c868b6891ccf6deabcba8a767cdf31593d1be -r 52de9815a7c4438c03daaa478df29c037e18f2a1 lib/galaxy/web/controllers/workflow.py
--- a/lib/galaxy/web/controllers/workflow.py
+++ b/lib/galaxy/web/controllers/workflow.py
@@ -1432,7 +1432,7 @@
step.upgrade_messages = {}
# Connections by input name
step.input_connections_by_name = \
- dict( ( conn.input_name, conn ) for conn in step.input_connections )
+ dict( ( conn.input_name, conn ) for conn in step.input_connections )
# Extract just the arguments for this step by prefix
p = "%s|" % step.id
l = len(p)
@@ -1446,7 +1446,7 @@
has_upgrade_messages = True
# Any connected input needs to have value DummyDataset (these
# are not persisted so we need to do it every time)
- module.add_dummy_datasets( connections=step.input_connections )
+ module.add_dummy_datasets( connections=step.input_connections )
# Get the tool
tool = module.tool
# Get the state
@@ -1462,7 +1462,7 @@
state = step.state = module.decode_runtime_state( trans, step_args.pop( "tool_state" ) )
step_errors = module.update_runtime_state( trans, state, step_args )
if step_errors:
- errors[step.id] = state.inputs["__errors__"] = step_errors
+ errors[step.id] = state.inputs["__errors__"] = step_errors
if 'run_workflow' in kwargs and not errors:
new_history = None
if 'new_history' in kwargs:
@@ -1486,7 +1486,7 @@
tool = trans.app.toolbox.tools_by_id[ step.tool_id ]
except KeyError, e:
# Handle the case where the workflow requires a tool not available in the local Galaxy instance.
- # The id value of tools installed from a Galaxy tool shed is a guid, but these tool's old_id
+ # The id value of tools installed from a Galaxy tool shed is a guid, but these tool's old_id
# attribute should contain what we're looking for.
for available_tool_id, available_tool in trans.app.toolbox.tools_by_id.items():
if step.tool_id == available_tool.old_id:
@@ -1528,7 +1528,8 @@
invocations.append({'outputs': outputs,
'new_history': new_history})
trans.sa_session.flush()
- return trans.fill_template( "workflow/run_complete.mako",
+ if invocations:
+ return trans.fill_template( "workflow/run_complete.mako",
workflow=stored,
invocations=invocations )
else:
@@ -1549,7 +1550,7 @@
has_upgrade_messages = True
# Any connected input needs to have value DummyDataset (these
# are not persisted so we need to do it every time)
- step.module.add_dummy_datasets( connections=step.input_connections )
+ step.module.add_dummy_datasets( connections=step.input_connections )
# Store state with the step
step.state = step.module.state
# Error dict
@@ -1568,7 +1569,7 @@
# Render the form
stored.annotation = self.get_item_annotation_str( trans.sa_session, trans.user, stored )
return trans.fill_template(
- "workflow/run.mako",
+ "workflow/run.mako",
steps=workflow.steps,
workflow=stored,
has_upgrade_messages=has_upgrade_messages,
@@ -1580,8 +1581,8 @@
finally:
# restore the active history
if saved_history is not None:
- trans.set_history(saved_history)
-
+ trans.set_history(saved_history)
+
def get_item( self, trans, id ):
return self.get_stored_workflow( trans, id )
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
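The substantive part of this otherwise whitespace-heavy diff is the new `if invocations:` guard. A minimal sketch of that control flow (the render call is a stand-in; only the template names come from the diff): render the "run complete" page only when at least one invocation was produced, otherwise fall through to re-render the run form.

```python
def choose_template(invocations):
    """Pick the template the controller would fill, per the guard the
    commit adds around trans.fill_template."""
    if invocations:
        # At least one invocation (e.g. from a multiple-input run)
        # succeeded, so show the completion page.
        return "workflow/run_complete.mako"
    # No invocations: fall back to rendering the run form again.
    return "workflow/run.mako"

print(choose_template([{"outputs": ["out1"], "new_history": None}]))
```

Before the fix, the completion page was returned unconditionally, which is what broke multiple-input workflow runs.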

commit/galaxy-central: greg: Fix some broken functional tests, along with fixes for deleting multiple data libraries and resetting the password for multiple users. Fixes issue #678.
by Bitbucket 05 Dec '11
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/e69c868b6891/
changeset: e69c868b6891
user: greg
date: 2011-12-05 15:58:36
summary: Fix some broken functional tests, along with fixes for deleting multiple data libraries and resetting the password for multiple users. Fixes issue #678.
affected #: 7 files
diff -r 627260f26eba2116bc6bfecf285d7d04d12a912b -r e69c868b6891ccf6deabcba8a767cdf31593d1be lib/galaxy/web/base/controller.py
--- a/lib/galaxy/web/base/controller.py
+++ b/lib/galaxy/web/base/controller.py
@@ -5,6 +5,7 @@
from datetime import date, datetime, timedelta
from time import strftime
from galaxy import config, tools, web, util
+from galaxy.util import inflector
from galaxy.util.hash_util import *
from galaxy.util.json import json_fix
from galaxy.web import error, form, url_for
@@ -2030,28 +2031,28 @@
@web.require_admin
def reset_user_password( self, trans, **kwd ):
webapp = kwd.get( 'webapp', 'galaxy' )
- id = kwd.get( 'id', None )
- if not id:
- message = "No user ids received for resetting passwords"
+ user_id = kwd.get( 'id', None )
+ if not user_id:
+ message = "No users received for resetting passwords."
trans.response.send_redirect( web.url_for( controller='admin',
action='users',
webapp=webapp,
message=message,
status='error' ) )
- ids = util.listify( id )
+ user_ids = util.listify( user_id )
if 'reset_user_password_button' in kwd:
message = ''
status = ''
- for user_id in ids:
+ for user_id in user_ids:
user = get_user( trans, user_id )
password = kwd.get( 'password', None )
confirm = kwd.get( 'confirm' , None )
if len( password ) < 6:
- message = "Please use a password of at least 6 characters"
+ message = "Use a password of at least 6 characters."
status = 'error'
break
elif password != confirm:
- message = "Passwords do not match"
+ message = "Passwords do not match."
status = 'error'
break
else:
@@ -2059,18 +2060,18 @@
trans.sa_session.add( user )
trans.sa_session.flush()
if not message and not status:
- message = "Passwords reset for %d users" % len( ids )
+ message = "Passwords reset for %d %s." % ( len( user_ids ), inflector.cond_plural( len( user_ids ), 'user' ) )
status = 'done'
trans.response.send_redirect( web.url_for( controller='admin',
action='users',
webapp=webapp,
message=util.sanitize_text( message ),
status=status ) )
- users = [ get_user( trans, user_id ) for user_id in ids ]
- if len( ids ) > 1:
- id=','.join( id )
+ users = [ get_user( trans, user_id ) for user_id in user_ids ]
+ if len( user_ids ) > 1:
+ user_id = ','.join( user_ids )
return trans.fill_template( '/admin/user/reset_password.mako',
- id=id,
+ id=user_id,
users=users,
password='',
confirm='',
@@ -2435,13 +2436,11 @@
# overwrite it in case it contains stuff proprietary to the local instance.
if not os.path.exists( os.path.join( tool_data_path, loc_file ) ):
shutil.copy( os.path.abspath( filename ), os.path.join( tool_data_path, loc_file ) )
-def get_user( trans, id ):
+def get_user( trans, user_id ):
"""Get a User from the database by id."""
- # Load user from database
- id = trans.security.decode_id( id )
- user = trans.sa_session.query( trans.model.User ).get( id )
+ user = trans.sa_session.query( trans.model.User ).get( trans.security.decode_id( user_id ) )
if not user:
- return trans.show_error_message( "User not found for id (%s)" % str( id ) )
+ return trans.show_error_message( "User not found for id (%s)" % str( user_id ) )
return user
def get_user_by_username( trans, username ):
"""Get a user from the database by username"""
diff -r 627260f26eba2116bc6bfecf285d7d04d12a912b -r e69c868b6891ccf6deabcba8a767cdf31593d1be lib/galaxy/web/controllers/library_common.py
--- a/lib/galaxy/web/controllers/library_common.py
+++ b/lib/galaxy/web/controllers/library_common.py
@@ -2250,6 +2250,7 @@
# contents will be purged. The association between this method and the cleanup_datasets.py script
# enables clean maintenance of libraries and library dataset disk files. This is also why the item_types
# are not any of the associations ( the cleanup_datasets.py script handles everything ).
+ status = kwd.get( 'status', 'done' )
show_deleted = util.string_as_bool( kwd.get( 'show_deleted', False ) )
item_types = { 'library': trans.app.model.Library,
'folder': trans.app.model.LibraryFolder,
@@ -2264,22 +2265,36 @@
item_desc = 'Dataset'
else:
item_desc = item_type.capitalize()
- try:
- library_item = trans.sa_session.query( item_types[ item_type ] ).get( trans.security.decode_id( item_id ) )
- except:
- library_item = None
- if not library_item or not ( is_admin or trans.app.security_agent.can_access_library_item( current_user_roles, library_item, trans.user ) ):
- message = 'Invalid %s id ( %s ) specifield.' % ( item_desc, item_id )
+ library_item_ids = util.listify( item_id )
+ valid_items = 0
+ invalid_items = 0
+ not_authorized_items = 0
+ flush_needed = False
+ message = ''
+ for library_item_id in library_item_ids:
+ try:
+ library_item = trans.sa_session.query( item_types[ item_type ] ).get( trans.security.decode_id( library_item_id ) )
+ except:
+ library_item = None
+ if not library_item or not ( is_admin or trans.app.security_agent.can_access_library_item( current_user_roles, library_item, trans.user ) ):
+ invalid_items += 1
+ elif not ( is_admin or trans.app.security_agent.can_modify_library_item( current_user_roles, library_item ) ):
+ not_authorized_items += 1
+ else:
+ valid_items += 1
+ library_item.deleted = True
+ trans.sa_session.add( library_item )
+ flush_needed = True
+ if flush_needed:
+ trans.sa_session.flush()
+ if valid_items:
+ message += "%d %s marked deleted. " % ( valid_items, inflector.cond_plural( valid_items, item_desc ) )
+ if invalid_items:
+ message += '%d invalid %s specifield. ' % ( invalid_items, inflector.cond_plural( invalid_items, item_desc ) )
status = 'error'
- elif not ( is_admin or trans.app.security_agent.can_modify_library_item( current_user_roles, library_item ) ):
- message = "You are not authorized to delete %s '%s'." % ( item_desc, library_item.name )
+ if not_authorized_items:
+ message += 'You are not authorized to delete %d %s. ' % ( not_authorized_items, inflector.cond_plural( not_authorized_items, item_desc ) )
status = 'error'
- else:
- library_item.deleted = True
- trans.sa_session.add( library_item )
- trans.sa_session.flush()
- message = util.sanitize_text( "%s '%s' has been marked deleted" % ( item_desc, library_item.name ) )
- status = 'done'
if item_type == 'library':
return trans.response.send_redirect( web.url_for( controller=cntrller,
action='browse_libraries',
@@ -2296,6 +2311,7 @@
@web.expose
def undelete_library_item( self, trans, cntrller, library_id, item_id, item_type, **kwd ):
# This action will handle undeleting all types of library items
+ status = kwd.get( 'status', 'done' )
show_deleted = util.string_as_bool( kwd.get( 'show_deleted', False ) )
item_types = { 'library': trans.app.model.Library,
'folder': trans.app.model.LibraryFolder,
@@ -2304,31 +2320,49 @@
current_user_roles = trans.get_current_user_roles()
if item_type not in item_types:
message = 'Bad item_type specified: %s' % str( item_type )
- status = ERROR
+ status = 'error'
else:
if item_type == 'library_dataset':
item_desc = 'Dataset'
else:
item_desc = item_type.capitalize()
- try:
- library_item = trans.sa_session.query( item_types[ item_type ] ).get( trans.security.decode_id( item_id ) )
- except:
- library_item = None
- if not library_item or not ( is_admin or trans.app.security_agent.can_access_library_item( current_user_roles, library_item, trans.user ) ):
- message = 'Invalid %s id ( %s ) specifield.' % ( item_desc, item_id )
+
+ library_item_ids = util.listify( item_id )
+ valid_items = 0
+ invalid_items = 0
+ purged_items = 0
+ not_authorized_items = 0
+ flush_needed = False
+ message = ''
+ for library_item_id in library_item_ids:
+ try:
+ library_item = trans.sa_session.query( item_types[ item_type ] ).get( trans.security.decode_id( library_item_id ) )
+ except:
+ library_item = None
+ if not library_item or not ( is_admin or trans.app.security_agent.can_access_library_item( current_user_roles, library_item, trans.user ) ):
+ invalid_items += 1
+ elif library_item.purged:
+ purged_items += 1
+ elif not ( is_admin or trans.app.security_agent.can_modify_library_item( current_user_roles, library_item ) ):
+ not_authorized_items += 1
+ else:
+ valid_items += 1
+ library_item.deleted = False
+ trans.sa_session.add( library_item )
+ flush_needed = True
+ if flush_needed:
+ trans.sa_session.flush()
+ if valid_items:
+ message += "%d %s marked undeleted. " % ( valid_items, inflector.cond_plural( valid_items, item_desc ) )
+ if invalid_items:
+ message += '%d invalid %s specifield. ' % ( invalid_items, inflector.cond_plural( invalid_items, item_desc ) )
status = 'error'
- elif library_item.purged:
- message = '%s %s has been purged, so it cannot be undeleted' % ( item_desc, library_item.name )
- status = ERROR
- elif not ( is_admin or trans.app.security_agent.can_modify_library_item( current_user_roles, library_item ) ):
- message = "You are not authorized to delete %s '%s'." % ( item_desc, library_item.name )
- status = 'error'
- else:
- library_item.deleted = False
- trans.sa_session.add( library_item )
- trans.sa_session.flush()
- message = util.sanitize_text( "%s '%s' has been marked undeleted" % ( item_desc, library_item.name ) )
- status = SUCCESS
+ if not_authorized_items:
+ message += 'You are not authorized to undelete %d %s. ' % ( not_authorized_items, inflector.cond_plural( not_authorized_items, item_desc ) )
+ status = 'error'
+ if purged_items:
+ message += '%d %s marked purged, so cannot be undeleted. ' % ( purged_items, inflector.cond_plural( purged_items, item_desc ) )
+ status = 'error'
if item_type == 'library':
return trans.response.send_redirect( web.url_for( controller=cntrller,
action='browse_libraries',
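The delete/undelete rework above follows one pattern: listify the incoming ids, classify each item into valid / invalid / not-authorized buckets, and defer the database flush until after the loop. A simplified sketch of that control flow, with `lookup` and `can_modify` standing in for the SQLAlchemy query and the security-agent check (hypothetical parameters for this sketch, not Galaxy's API):

```python
def classify_and_delete(item_ids, lookup, can_modify):
    """Classify each requested item and mark the valid ones deleted.

    Tallies valid / invalid / not-authorized counts per id, mirroring
    the loop in delete_library_item; the flush happens once, after
    the loop, rather than per item.
    """
    valid = invalid = not_authorized = 0
    flush_needed = False
    for item_id in item_ids:
        item = lookup(item_id)
        if item is None:
            invalid += 1            # bad id or not accessible
        elif not can_modify(item):
            not_authorized += 1     # accessible but not modifiable
        else:
            item["deleted"] = True  # real code also adds to the session
            valid += 1
            flush_needed = True
    # In the controller a single trans.sa_session.flush() runs here
    # when flush_needed is True.
    return valid, invalid, not_authorized, flush_needed
```

The single deferred flush is the point of the `flush_needed` flag: one round trip to the database regardless of how many items were marked.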
diff -r 627260f26eba2116bc6bfecf285d7d04d12a912b -r e69c868b6891ccf6deabcba8a767cdf31593d1be test/base/twilltestcase.py
--- a/test/base/twilltestcase.py
+++ b/test/base/twilltestcase.py
@@ -1224,7 +1224,7 @@
tc.fv( "1", "password", password )
tc.fv( "1", "confirm", password )
tc.submit( "reset_user_password_button" )
- self.check_page_for_string( "Passwords reset for 1 users" )
+ self.check_page_for_string( "Passwords reset for 1 user." )
self.home()
def mark_user_deleted( self, user_id, email='' ):
"""Mark a user as deleted"""
@@ -2278,7 +2278,7 @@
item_desc = 'Dataset'
else:
item_desc = item_type.capitalize()
- check_str = "%s '%s' has been marked deleted" % ( item_desc, item_name )
+ check_str = "marked deleted"
self.check_page_for_string( check_str )
self.home()
def undelete_library_item( self, cntrller, library_id, item_id, item_name, item_type='library_dataset' ):
@@ -2290,7 +2290,7 @@
item_desc = 'Dataset'
else:
item_desc = item_type.capitalize()
- check_str = "%s '%s' has been marked undeleted" % ( item_desc, item_name )
+ check_str = "marked undeleted"
self.check_page_for_string( check_str )
self.home()
def purge_library( self, library_id, library_name ):
diff -r 627260f26eba2116bc6bfecf285d7d04d12a912b -r e69c868b6891ccf6deabcba8a767cdf31593d1be test/functional/test_library_features.py
--- a/test/functional/test_library_features.py
+++ b/test/functional/test_library_features.py
@@ -137,7 +137,7 @@
assert ldda2 is not None, 'Problem retrieving LibraryDatasetDatasetAssociation ldda2 from the database'
self.browse_library( cntrller='library_admin',
library_id=self.security.encode_id( library1.id ),
- strings_displayed=[ ldda2.name, ldda2.message, admin_user.email ] )
+ strings_displayed=[ ldda2.name, ldda2.message, 'bed' ] )
def test_050_add_2nd_public_dataset_to_folder2( self ):
"""Testing adding a 2nd public dataset folder2"""
# Logged in as admin_user
@@ -156,7 +156,7 @@
assert ldda3 is not None, 'Problem retrieving LibraryDatasetDatasetAssociation ldda3 from the database'
self.browse_library( cntrller='library_admin',
library_id=self.security.encode_id( library1.id ),
- strings_displayed=[ ldda3.name, ldda3.message, admin_user.email ] )
+ strings_displayed=[ ldda3.name, ldda3.message, 'bed' ] )
def test_055_copy_dataset_from_history_to_subfolder( self ):
"""Testing copying a dataset from the current history to a subfolder"""
# logged in as admin_user
@@ -176,7 +176,7 @@
assert ldda4 is not None, 'Problem retrieving LibraryDatasetDatasetAssociation ldda4 from the database'
self.browse_library( cntrller='library_admin',
library_id=self.security.encode_id( library1.id ),
- strings_displayed=[ ldda4.name, ldda4.message, admin_user.email ] )
+ strings_displayed=[ ldda4.name, ldda4.message, 'bed' ] )
def test_060_editing_dataset_attribute_info( self ):
"""Testing editing a library dataset's attribute information"""
# logged in as admin_user
@@ -299,11 +299,11 @@
ldda_message = 'Uploaded all files in test-data/users/test1...'
self.browse_library( 'library',
self.security.encode_id( library1.id ),
- strings_displayed=[ regular_user1.email, ldda_message, '1.fasta' ] )
+ strings_displayed=[ 'fasta', ldda_message, '1.fasta' ] )
ldda_message = 'Uploaded all files in test-data/users/test3.../run1'
self.browse_library( 'library',
self.security.encode_id( library1.id ),
- strings_displayed=[ regular_user3.email, ldda_message, '2.fasta' ] )
+ strings_displayed=[ 'fasta', ldda_message, '2.fasta' ] )
def test_085_mark_ldda2_deleted( self ):
"""Testing marking ldda2 as deleted"""
# Logged in as admin_user
diff -r 627260f26eba2116bc6bfecf285d7d04d12a912b -r e69c868b6891ccf6deabcba8a767cdf31593d1be test/functional/test_library_security.py
--- a/test/functional/test_library_security.py
+++ b/test/functional/test_library_security.py
@@ -157,7 +157,7 @@
assert ldda1 is not None, 'Problem retrieving LibraryDatasetDatasetAssociation ldda1 from the database'
self.browse_library( cntrller='library_admin',
library_id=self.security.encode_id( library1.id ),
- strings_displayed=[ ldda1.name, ldda1.message, admin_user.email ] )
+ strings_displayed=[ ldda1.name, ldda1.message, 'bed' ] )
def test_030_access_ldda1_with_private_role_restriction( self ):
"""Testing accessing ldda1 with a private role restriction"""
# Logged in as admin_user
@@ -262,20 +262,20 @@
assert ldda2 is not None, 'Problem retrieving LibraryDatasetDatasetAssociation ldda2 from the database'
self.browse_library( cntrller='library',
library_id=self.security.encode_id( library1.id ),
- strings_displayed=[ ldda2.name, ldda2.message, admin_user.email ] )
+ strings_displayed=[ ldda2.name, ldda2.message, 'bed' ] )
def test_045_accessing_ldda2_with_role_associated_with_group_and_users( self ):
"""Testing accessing ldda2 with a role that is associated with a group and users"""
# Logged in as admin_user
# admin_user should be able to see 2.bed since she is associated with role2
self.browse_library( cntrller='library',
library_id=self.security.encode_id( library1.id ),
- strings_displayed=[ ldda2.name, ldda2.message, admin_user.email ] )
+ strings_displayed=[ ldda2.name, ldda2.message, 'bed' ] )
self.logout()
# regular_user1 should be able to see 2.bed since she is associated with group_two
self.login( email = regular_user1.email )
self.browse_library( cntrller='library',
library_id=self.security.encode_id( library1.id ),
- strings_displayed=[ folder1.name, ldda2.name, ldda2.message, admin_user.email ] )
+ strings_displayed=[ folder1.name, ldda2.name, ldda2.message, 'bed' ] )
# Check the permissions on the dataset 2.bed - they are as folows:
# DATASET_MANAGE_PERMISSIONS = test(a)bx.psu.edu
# DATASET_ACCESS = Role2
@@ -356,7 +356,7 @@
strings_displayed=[ "Upload a directory of files" ] )
self.browse_library( cntrller='library_admin',
library_id=self.security.encode_id( library1.id ),
- strings_displayed=[ admin_user.email, ldda_message ] )
+ strings_displayed=[ 'bed', ldda_message ] )
def test_055_change_permissions_on_datasets_uploaded_from_library_dir( self ):
"""Testing changing the permissions on datasets uploaded from a directory from the Admin view"""
# logged in as admin_user
diff -r 627260f26eba2116bc6bfecf285d7d04d12a912b -r e69c868b6891ccf6deabcba8a767cdf31593d1be test/functional/test_library_templates.py
--- a/test/functional/test_library_templates.py
+++ b/test/functional/test_library_templates.py
@@ -181,7 +181,7 @@
assert ldda1 is not None, 'Problem retrieving LibraryDatasetDatasetAssociation ldda1 from the database'
self.browse_library( cntrller='library_admin',
library_id=self.security.encode_id( library1.id ),
- strings_displayed=[ ldda1.name, ldda1.message, admin_user.email ] )
+ strings_displayed=[ ldda1.name, ldda1.message, 'bed' ] )
# Make sure the library template contents were correctly saved
self.ldda_edit_info( 'library_admin',
self.security.encode_id( library1.id ),
@@ -308,7 +308,7 @@
assert ldda is not None, 'Problem retrieving LibraryDatasetDatasetAssociation ldda from the database'
self.browse_library( cntrller='library_admin',
library_id=self.security.encode_id( library2.id ),
- strings_displayed=[ ldda.name, ldda.message, admin_user.email ] )
+ strings_displayed=[ ldda.name, ldda.message, 'bed' ] )
# Make sure the library template contents were correctly saved
self.ldda_edit_info( 'library_admin',
self.security.encode_id( library2.id ),
@@ -363,8 +363,7 @@
'Option1' ] )
def test_100_add_ldda_to_folder3( self ):
"""
- Testing adding a new library dataset to library3's folder,
- making sure the SelectField setting is correct on the upload form.
+ Testing adding a new library dataset to library3's folder, making sure the SelectField setting is correct on the upload form.
"""
filename = '3.bed'
ldda_message = '3.bed message'
@@ -381,7 +380,7 @@
assert ldda is not None, 'Problem retrieving LibraryDatasetDatasetAssociation ldda from the database'
self.browse_library( cntrller='library_admin',
library_id=self.security.encode_id( library3.id ),
- strings_displayed=[ ldda.name, ldda.message, admin_user.email ] )
+ strings_displayed=[ ldda.name, ldda.message, 'bed' ] )
# Make sure the library template contents were correctly saved
self.ldda_edit_info( 'library_admin',
self.security.encode_id( library3.id ),
@@ -453,8 +452,7 @@
'This text should be inherited' ] )
def test_120_add_ldda_to_folder4( self ):
"""
- Testing adding a new library dataset to library4's folder,
- making sure the TextArea setting is correct on the upload form.
+ Testing adding a new library dataset to library4's folder, making sure the TextArea setting is correct on the upload form.
"""
filename = '4.bed'
ldda_message = '4.bed message'
@@ -471,7 +469,7 @@
assert ldda is not None, 'Problem retrieving LibraryDatasetDatasetAssociation ldda from the database'
self.browse_library( cntrller='library_admin',
library_id=self.security.encode_id( library4.id ),
- strings_displayed=[ ldda.name, ldda.message, admin_user.email ] )
+ strings_displayed=[ ldda.name, ldda.message, 'bed' ] )
# Make sure the library template contents were correctly saved
self.ldda_edit_info( 'library_admin',
self.security.encode_id( library4.id ),
@@ -519,8 +517,7 @@
'This text should be inherited' ] )
def test_140_add_ldda_to_folder5( self ):
"""
- Testing adding a new library dataset to library5's folder,
- making sure the TextField setting is correct on the upload form.
+ Testing adding a new library dataset to library5's folder, making sure the TextField setting is correct on the upload form.
"""
# Logged in as admin_user
filename = '5.bed'
@@ -537,7 +534,7 @@
assert ldda is not None, 'Problem retrieving LibraryDatasetDatasetAssociation ldda from the database'
self.browse_library( cntrller='library_admin',
library_id=self.security.encode_id( library5.id ),
- strings_displayed=[ ldda.name, ldda.message, admin_user.email ] )
+ strings_displayed=[ ldda.name, ldda.message, 'bed' ] )
# Make sure the library template contents were correctly saved
self.ldda_edit_info( 'library_admin',
self.security.encode_id( library5.id ),
@@ -558,9 +555,7 @@
field_default_1='%s default' % TextArea_form.name )
def test_150_add_ldda_to_library5( self ):
"""
- Testing adding a new library dataset to library5's folder,
- making sure the TextField and new TextArea settings are
- correct on the upload form.
+ Testing adding a new library dataset to library5's folder, making sure the TextField and new TextArea settings are correct on the upload form.
"""
filename = '6.bed'
ldda_message = '6.bed message'
@@ -579,7 +574,7 @@
assert ldda is not None, 'Problem retrieving LibraryDatasetDatasetAssociation ldda from the database'
self.browse_library( cntrller='library_admin',
library_id=self.security.encode_id( library5.id ),
- strings_displayed=[ ldda.name, ldda.message, admin_user.email ] )
+ strings_displayed=[ ldda.name, ldda.message, 'bed' ] )
# Make sure the library template contents were correctly saved
self.ldda_edit_info( 'library_admin',
self.security.encode_id( library5.id ),
@@ -626,8 +621,7 @@
'none' ] )
def test_170_add_ldda_to_folder6( self ):
"""
- Testing adding a new library dataset to library6's folder,
- making sure the WorkflowField setting is correct on the upload form.
+ Testing adding a new library dataset to library6's folder, making sure the WorkflowField setting is correct on the upload form.
"""
# Logged in as admin_user
filename = '7.bed'
@@ -644,7 +638,7 @@
assert ldda is not None, 'Problem retrieving LibraryDatasetDatasetAssociation ldda from the database'
self.browse_library( cntrller='library_admin',
library_id=self.security.encode_id( library6.id ),
- strings_displayed=[ ldda.name, ldda.message, admin_user.email ] )
+ strings_displayed=[ ldda.name, ldda.message, 'bed' ] )
# Make sure the library template contents were correctly saved
self.ldda_edit_info( 'library_admin',
self.security.encode_id( library6.id ),
diff -r 627260f26eba2116bc6bfecf285d7d04d12a912b -r e69c868b6891ccf6deabcba8a767cdf31593d1be test/functional/test_user_info.py
--- a/test/functional/test_user_info.py
+++ b/test/functional/test_user_info.py
@@ -206,7 +206,7 @@
self.edit_user_info( cntrller='user',
password='testuser',
new_password='testuser#',\
- strings_displayed_after_submit=[ 'The password has been changed.' ] )
+ strings_displayed_after_submit=[ 'The password has been changed' ] )
self.logout()
refresh( regular_user12 )
# Test logging in with new email and password
Repository URL: https://bitbucket.org/galaxy/galaxy-central/

commit/galaxy-central: jgoecks: Trackster: decode dbkey when adding tracks.
by Bitbucket 03 Dec '11
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/627260f26eba/
changeset: 627260f26eba
user: jgoecks
date: 2011-12-04 00:05:58
summary: Trackster: decode dbkey when adding tracks.
affected #: 1 file
diff -r 7e5b7ae22a29ec69cbcfb28c8820e70b6d2ca89c -r 627260f26eba2116bc6bfecf285d7d04d12a912b lib/galaxy/web/controllers/tracks.py
--- a/lib/galaxy/web/controllers/tracks.py
+++ b/lib/galaxy/web/controllers/tracks.py
@@ -30,6 +30,13 @@
OK = "ok"
)
+def _decode_dbkey( dbkey ):
+ """ Decodes dbkey and returns tuple ( username, dbkey )"""
+ if ':' in dbkey:
+ return dbkey.split( ':' )
+ else:
+ return None, dbkey
+
class NameColumn( grids.TextColumn ):
def get_value( self, trans, grid, history ):
return history.get_display_name()
@@ -92,6 +99,7 @@
def filter( self, trans, user, query, dbkey ):
""" Filter by dbkey; datasets without a dbkey are returned as well. """
# use raw SQL b/c metadata is a BLOB
+ dbkey_user, dbkey = _decode_dbkey( dbkey )
dbkey = dbkey.replace("'", "\\'")
return query.filter( or_( \
or_( "metadata like '%%\"dbkey\": [\"%s\"]%%'" % dbkey, "metadata like '%%\"dbkey\": \"%s\"%%'" % dbkey ), \
@@ -213,13 +221,6 @@
return True
return False
-
- def _decode_dbkey( self, dbkey ):
- """ Decodes dbkey and returns tuple ( username, dbkey )"""
- if ':' in dbkey:
- return dbkey.split( ':' )
- else:
- return None, dbkey
@web.expose
@web.require_login()
@@ -323,7 +324,7 @@
low = 0
# If there is no dbkey owner, default to current user.
- dbkey_owner, dbkey = self._decode_dbkey( dbkey )
+ dbkey_owner, dbkey = _decode_dbkey( dbkey )
if dbkey_owner:
dbkey_user = trans.sa_session.query( trans.app.model.User ).filter_by( username=dbkey_owner ).first()
else:
@@ -429,7 +430,7 @@
"""
# If there is no dbkey owner, default to current user.
- dbkey_owner, dbkey = self._decode_dbkey( dbkey )
+ dbkey_owner, dbkey = _decode_dbkey( dbkey )
if dbkey_owner:
dbkey_user = trans.sa_session.query( trans.app.model.User ).filter_by( username=dbkey_owner ).first()
else:
Repository URL: https://bitbucket.org/galaxy/galaxy-central/

commit/galaxy-central: jgoecks: Remove unneeded vis_id parameter from tracks/chroms method.
by Bitbucket 03 Dec '11
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/7e5b7ae22a29/
changeset: 7e5b7ae22a29
user: jgoecks
date: 2011-12-03 23:06:33
summary: Remove unneeded vis_id parameter from tracks/chroms method.
affected #: 2 files
diff -r 866567627a0f78ee78fe443c65a43a86229802b4 -r 7e5b7ae22a29ec69cbcfb28c8820e70b6d2ca89c lib/galaxy/web/controllers/tracks.py
--- a/lib/galaxy/web/controllers/tracks.py
+++ b/lib/galaxy/web/controllers/tracks.py
@@ -293,7 +293,7 @@
return trans.fill_template( 'tracks/browser.mako', config=viz_config, add_dataset=new_dataset )
@web.json
- def chroms( self, trans, vis_id=None, dbkey=None, num=None, chrom=None, low=None ):
+ def chroms( self, trans, dbkey=None, num=None, chrom=None, low=None ):
"""
Returns a naturally sorted list of chroms/contigs for either a given visualization or a given dbkey.
Use either chrom or low to specify the starting chrom in the return list.
@@ -322,25 +322,12 @@
else:
low = 0
- #
- # Get viz, dbkey.
- #
-
- # Must specify either vis_id or dbkey.
- if not vis_id and not dbkey:
- return trans.show_error_message("No visualization id or dbkey specified.")
-
- # Need to get user and dbkey in order to get chroms data.
- if vis_id:
- # Use user, dbkey from viz.
- visualization = self.get_visualization( trans, vis_id, check_ownership=False, check_accessible=True )
- visualization.config = self.get_visualization_config( trans, visualization )
- vis_user = visualization.user
- vis_dbkey = visualization.dbkey
+ # If there is no dbkey owner, default to current user.
+ dbkey_owner, dbkey = self._decode_dbkey( dbkey )
+ if dbkey_owner:
+ dbkey_user = trans.sa_session.query( trans.app.model.User ).filter_by( username=dbkey_owner ).first()
else:
- # No vis_id, so visualization is new. User is current user, dbkey must be given.
- vis_user = trans.user
- vis_dbkey = dbkey
+ dbkey_user = trans.user
#
# Get len file.
@@ -348,21 +335,21 @@
len_file = None
len_ds = None
user_keys = {}
- if 'dbkeys' in vis_user.preferences:
- user_keys = from_json_string( vis_user.preferences['dbkeys'] )
- if vis_dbkey in user_keys:
- dbkey_attributes = user_keys[ vis_dbkey ]
+ if 'dbkeys' in dbkey_user.preferences:
+ user_keys = from_json_string( dbkey_user.preferences['dbkeys'] )
+ if dbkey in user_keys:
+ dbkey_attributes = user_keys[ dbkey ]
if 'fasta' in dbkey_attributes:
build_fasta = trans.sa_session.query( trans.app.model.HistoryDatasetAssociation ).get( dbkey_attributes[ 'fasta' ] )
len_file = build_fasta.get_converted_dataset( trans, 'len' ).file_name
# Backwards compatibility: look for len file directly.
elif 'len' in dbkey_attributes:
- len_file = trans.sa_session.query( trans.app.model.HistoryDatasetAssociation ).get( user_keys[ vis_dbkey ][ 'len' ] ).file_name
+ len_file = trans.sa_session.query( trans.app.model.HistoryDatasetAssociation ).get( user_keys[ dbkey ][ 'len' ] ).file_name
if not len_file:
len_ds = trans.db_dataset_for( dbkey )
if not len_ds:
- len_file = os.path.join( trans.app.config.len_file_path, "%s.len" % vis_dbkey )
+ len_file = os.path.join( trans.app.config.len_file_path, "%s.len" % dbkey )
else:
len_file = len_ds.file_name
@@ -432,7 +419,7 @@
to_sort = [{ 'chrom': chrom, 'len': length } for chrom, length in chroms.iteritems()]
to_sort.sort(lambda a,b: cmp( split_by_number(a['chrom']), split_by_number(b['chrom']) ))
- return { 'reference': self._has_reference_data( trans, vis_dbkey, vis_user ), 'chrom_info': to_sort,
+ return { 'reference': self._has_reference_data( trans, dbkey, dbkey_user ), 'chrom_info': to_sort,
'prev_chroms' : prev_chroms, 'next_chroms' : next_chroms, 'start_index' : start_index }
@web.json
diff -r 866567627a0f78ee78fe443c65a43a86229802b4 -r 7e5b7ae22a29ec69cbcfb28c8820e70b6d2ca89c static/scripts/trackster.js
--- a/static/scripts/trackster.js
+++ b/static/scripts/trackster.js
@@ -1195,7 +1195,8 @@
*/
load_chroms: function(url_parms) {
url_parms['num'] = MAX_CHROMS_SELECTABLE;
- $.extend( url_parms, (this.vis_id !== undefined ? { vis_id: this.vis_id } : { dbkey: this.dbkey } ) );
+ url_parms['dbkey'] = this.dbkey;
+
var
view = this,
chrom_data = $.Deferred();
Repository URL: https://bitbucket.org/galaxy/galaxy-central/

commit/galaxy-central: jgoecks: Prepend user's public name to custom builds so that this information can be shown and used to display reference data.
by Bitbucket 02 Dec '11
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/866567627a0f/
changeset: 866567627a0f
user: jgoecks
date: 2011-12-02 22:58:15
summary: Prepend user's public name to custom builds so that this information can be shown and used to display reference data.
affected #: 3 files
diff -r 28ffb6286b1226d8c7757e33bb7790eaa08ae29d -r 866567627a0f78ee78fe443c65a43a86229802b4 lib/galaxy/web/base/controller.py
--- a/lib/galaxy/web/base/controller.py
+++ b/lib/galaxy/web/base/controller.py
@@ -359,7 +359,18 @@
'drawables': drawables,
'prefs': collection_dict.get( 'prefs', [] )
}
-
+
+ def encode_dbkey( dbkey ):
+ """
+ Encodes dbkey as needed. For now, prepends user's public name
+ to custom dbkey keys.
+ """
+ encoded_dbkey = dbkey
+ user = visualization.user
+ if 'dbkeys' in user.preferences and dbkey in user.preferences[ 'dbkeys' ]:
+ encoded_dbkey = "%s:%s" % ( user.username, dbkey )
+ return encoded_dbkey
+
# Set tracks.
tracks = []
if 'tracks' in latest_revision.config:
@@ -373,8 +384,12 @@
else:
tracks.append( pack_collection( drawable_dict ) )
- config = { "title": visualization.title, "vis_id": trans.security.encode_id( visualization.id ),
- "tracks": tracks, "bookmarks": bookmarks, "chrom": "", "dbkey": visualization.dbkey }
+ config = { "title": visualization.title,
+ "vis_id": trans.security.encode_id( visualization.id ),
+ "tracks": tracks,
+ "bookmarks": bookmarks,
+ "chrom": "",
+ "dbkey": encode_dbkey( visualization.dbkey ) }
if 'viewport' in latest_revision.config:
config['viewport'] = latest_revision.config['viewport']
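The `encode_dbkey` closure above is the writing half of the scheme: only keys found in the owner's custom-builds preference get the `username:` prefix, so built-in builds pass through untouched and the client can later recover the owner by splitting on `':'`. A sketch of the same logic with the preference lookup flattened into a parameter (`custom_dbkeys` is a hypothetical stand-in for the user's `'dbkeys'` preference entry, not Galaxy's API):

```python
def encode_dbkey(dbkey, username, custom_dbkeys):
    """Prefix a custom build's key with its owner's public name.

    Built-in keys pass through unchanged, so only custom builds
    carry an owner prefix in the visualization config.
    """
    if dbkey in custom_dbkeys:
        return "%s:%s" % (username, dbkey)
    return dbkey
```
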
diff -r 28ffb6286b1226d8c7757e33bb7790eaa08ae29d -r 866567627a0f78ee78fe443c65a43a86229802b4 lib/galaxy/web/controllers/tracks.py
--- a/lib/galaxy/web/controllers/tracks.py
+++ b/lib/galaxy/web/controllers/tracks.py
@@ -189,7 +189,11 @@
avail_genomes[key] = path
self.available_genomes = avail_genomes
- def _has_reference_data( self, trans, dbkey ):
+ def _has_reference_data( self, trans, dbkey, dbkey_owner=None ):
+ """
+ Returns true if there is reference data for the specified dbkey. If dbkey is custom,
+ dbkey_owner is needed to determine if there is reference data.
+ """
# Initialize built-in builds if necessary.
if not self.available_genomes:
self._init_references( trans )
@@ -198,12 +202,10 @@
if dbkey in self.available_genomes:
# There is built-in reference data.
return True
-
- # Look for key in user's custom builds.
- # TODO: how to make this work for shared visualizations?
- user = trans.user
- if user and 'dbkeys' in trans.user.preferences:
- user_keys = from_json_string( user.preferences['dbkeys'] )
+
+ # Look for key in owner's custom builds.
+ if dbkey_owner and 'dbkeys' in dbkey_owner.preferences:
+ user_keys = from_json_string( dbkey_owner.preferences[ 'dbkeys' ] )
if dbkey in user_keys:
dbkey_attributes = user_keys[ dbkey ]
if 'fasta' in dbkey_attributes:
@@ -211,6 +213,13 @@
return True
return False
+
+ def _decode_dbkey( self, dbkey ):
+ """ Decodes dbkey and returns tuple ( username, dbkey )"""
+ if ':' in dbkey:
+ return dbkey.split( ':' )
+ else:
+ return None, dbkey
@web.expose
@web.require_login()
@@ -272,7 +281,7 @@
@web.require_login()
def browser(self, trans, id, chrom="", **kwargs):
"""
- Display browser for the datasets listed in `dataset_ids`.
+ Display browser for the visualization denoted by id and add the datasets listed in `dataset_ids`.
"""
vis = self.get_visualization( trans, id, check_ownership=False, check_accessible=True )
viz_config = self.get_visualization_config( trans, vis )
@@ -356,7 +365,7 @@
len_file = os.path.join( trans.app.config.len_file_path, "%s.len" % vis_dbkey )
else:
len_file = len_ds.file_name
-
+
#
# Get chroms data:
# (a) chrom name, len;
@@ -423,7 +432,7 @@
to_sort = [{ 'chrom': chrom, 'len': length } for chrom, length in chroms.iteritems()]
to_sort.sort(lambda a,b: cmp( split_by_number(a['chrom']), split_by_number(b['chrom']) ))
- return { 'reference': self._has_reference_data( trans, vis_dbkey ), 'chrom_info': to_sort,
+ return { 'reference': self._has_reference_data( trans, vis_dbkey, vis_user ), 'chrom_info': to_sort,
'prev_chroms' : prev_chroms, 'next_chroms' : next_chroms, 'start_index' : start_index }
@web.json
@@ -432,7 +441,14 @@
Return reference data for a build.
"""
- if not self._has_reference_data( trans, dbkey ):
+ # If there is no dbkey owner, default to current user.
+ dbkey_owner, dbkey = self._decode_dbkey( dbkey )
+ if dbkey_owner:
+ dbkey_user = trans.sa_session.query( trans.app.model.User ).filter_by( username=dbkey_owner ).first()
+ else:
+ dbkey_user = trans.user
+
+ if not self._has_reference_data( trans, dbkey, dbkey_user ):
return None
#
@@ -444,9 +460,7 @@
twobit_file_name = self.available_genomes[dbkey]
else:
# From custom build.
- # TODO: how to make this work for shared visualizations?
- user = trans.user
- user_keys = from_json_string( user.preferences['dbkeys'] )
+ user_keys = from_json_string( dbkey_user.preferences['dbkeys'] )
dbkey_attributes = user_keys[ dbkey ]
fasta_dataset = trans.app.model.HistoryDatasetAssociation.get( dbkey_attributes[ 'fasta' ] )
error = self._convert_dataset( trans, fasta_dataset, 'twobit' )
diff -r 28ffb6286b1226d8c7757e33bb7790eaa08ae29d -r 866567627a0f78ee78fe443c65a43a86229802b4 templates/visualization/display.mako
--- a/templates/visualization/display.mako
+++ b/templates/visualization/display.mako
@@ -8,7 +8,7 @@
<script type='text/javascript' src="${h.url_for('/static/scripts/excanvas.js')}"></script><![endif]-->
- ${h.js( "jquery.event.drag", "jquery.autocomplete", "jquery.mousewheel", "jquery.autocomplete", "trackster", "trackster_ui", "jquery.ui.sortable.slider", "jquery.scrollTo", "farbtastic" )}
+ ${h.js( "jquery.event.drag", "jquery.autocomplete", "jquery.mousewheel", "jquery.autocomplete", "trackster", "trackster_ui", "jquery.ui.sortable.slider", "farbtastic" )}
</%def><%def name="stylesheets()">
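The `_decode_dbkey` helper added in this commit establishes a `username:dbkey` convention so shared visualizations can resolve another user's custom build. A minimal standalone sketch of that convention (the function name and single-colon assumption are mine, not Galaxy's API):

```python
def decode_dbkey(dbkey):
    """Split a possibly owner-qualified dbkey into (username, dbkey).

    Plain built-in keys like "mm9" carry no owner, so username is None.
    """
    if ':' in dbkey:
        owner, _, key = dbkey.partition(':')
        return owner, key
    return None, dbkey
```

Using `partition` rather than `split` keeps the result a two-tuple even if the key itself were ever to contain a colon, which the commit's `dbkey.split( ':' )` does not guarantee.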
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: greg: Change column Uploaded By to be the Data type of the library dataset when browsing a data library. Fixes issue #597.
by Bitbucket 02 Dec '11
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/28ffb6286b12/
changeset: 28ffb6286b12
user: greg
date: 2011-12-02 21:19:37
summary: Change column Uploaded By to be the Data type of the library dataset when browsing a data library. Fixes issue #597.
affected #: 1 file
diff -r 954f201a772e82039ec80802f9bca78d8903cd8a -r 28ffb6286b1226d8c7757e33bb7790eaa08ae29d templates/library/common/browse_library.mako
--- a/templates/library/common/browse_library.mako
+++ b/templates/library/common/browse_library.mako
@@ -218,10 +218,6 @@
is_admin = trans.user_is_admin() and cntrller == 'library_admin'
- if ldda.user:
- uploaded_by = ldda.user.email
- else:
- uploaded_by = 'anonymous'
if ldda == library_dataset.library_dataset_dataset_association:
current_version = True
if is_admin:
@@ -303,7 +299,7 @@
</td>
% if not simple:
<td id="libraryItemInfo">${render_library_item_info( ldda )}</td>
- <td>${uploaded_by}</td>
+ <td>${ldda.extension}</td>
% endif
<td>${ldda.create_time.strftime( "%Y-%m-%d" )}</td><td>${ldda.get_size( nice_size=True )}</td>
@@ -579,10 +575,10 @@
</th>
% if not simple:
<th>Message</th>
- <th>Uploaded By</th>
+ <th>Data type</th>
% endif
- <th>Date</th>
- <th>File Size</th>
+ <th>Date uploaded</th>
+ <th>File size</th></tr></thead><% row_counter = RowCounter() %>
commit/galaxy-central: jgoecks: Fix bug when displaying shared visualizations with single quotes.
by Bitbucket 02 Dec '11
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/954f201a772e/
changeset: 954f201a772e
user: jgoecks
date: 2011-12-02 20:50:36
summary: Fix bug when displaying shared visualizations with single quotes.
affected #: 1 file
diff -r 5340450cdab2a7887d8d9e9186ae438d7b2aee93 -r 954f201a772e82039ec80802f9bca78d8903cd8a templates/visualization/display.mako
--- a/templates/visualization/display.mako
+++ b/templates/visualization/display.mako
@@ -94,7 +94,7 @@
view = create_visualization( container_element, "${config.get('title') | h}",
"${config.get('vis_id')}", "${config.get('dbkey')}",
JSON.parse('${ h.to_json_string( config.get( 'viewport', dict() ) ) }'),
- JSON.parse('${ h.to_json_string( config.get('tracks') ) }'),
+ JSON.parse('${ h.to_json_string( config['tracks'] ).replace("'", "\\'") }'),
JSON.parse('${ h.to_json_string( config.get('bookmarks') ) }')
);
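The one-line fix above escapes single quotes in the serialized track data before it is embedded inside a single-quoted JavaScript string literal and handed to `JSON.parse`. A hedged sketch of the same escaping step in plain Python (helper name is illustrative):

```python
import json

def embed_in_js_single_quotes(obj):
    # Serialize to JSON, then escape bare single quotes so the result can
    # sit inside a '...' JavaScript literal without terminating it early.
    return json.dumps(obj).replace("'", "\\'")
```

Without the escape, a track title such as "5' UTR reads" would close the surrounding JavaScript string literal and break the shared-visualization page.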
commit/galaxy-central: greg: 1. Add 3 columns to the tool_shed_repository table in the Galaxy model / database; metadata (json), includes_datatypes (boolean) and update_available (boolean)
by Bitbucket 02 Dec '11
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/5340450cdab2/
changeset: 5340450cdab2
user: greg
date: 2011-12-02 20:49:43
summary: 1. Add 3 columns to the tool_shed_repository table in the Galaxy model / database; metadata (json), includes_datatypes (boolean) and update_available (boolean)
2. Add the ability to set metadata for tool shed repositories installed into local Galaxy instances (at time of install and on the Manage repository page)
3. Add the ability to view tool shed repository metadata (including tool guids) when managing installed tool shed repositories within Galaxy
4. Create a new admin_toolshed controller, and move all tool shed related code from the admin controller to this new controller
affected #: 16 files
diff -r 87dc2336c1892878b26a110bb1857722fd3f4903 -r 5340450cdab2a7887d8d9e9186ae438d7b2aee93 lib/galaxy/model/__init__.py
--- a/lib/galaxy/model/__init__.py
+++ b/lib/galaxy/model/__init__.py
@@ -2646,7 +2646,8 @@
pass
class ToolShedRepository( object ):
- def __init__( self, id=None, create_time=None, tool_shed=None, name=None, description=None, owner=None, changeset_revision=None, deleted=False ):
+ def __init__( self, id=None, create_time=None, tool_shed=None, name=None, description=None, owner=None,
+ changeset_revision=None, metadata=None, includes_datatypes=False, update_available=False, deleted=False ):
self.id = id
self.create_time = create_time
self.tool_shed = tool_shed
@@ -2654,6 +2655,9 @@
self.description = description
self.owner = owner
self.changeset_revision = changeset_revision
+ self.metadata = metadata
+ self.includes_datatypes = includes_datatypes
+ self.update_available = update_available
self.deleted = deleted
## ---- Utility methods -------------------------------------------------------
diff -r 87dc2336c1892878b26a110bb1857722fd3f4903 -r 5340450cdab2a7887d8d9e9186ae438d7b2aee93 lib/galaxy/model/mapping.py
--- a/lib/galaxy/model/mapping.py
+++ b/lib/galaxy/model/mapping.py
@@ -373,6 +373,9 @@
Column( "description" , TEXT ),
Column( "owner", TrimmedString( 255 ), index=True ),
Column( "changeset_revision", TrimmedString( 255 ), index=True ),
+ Column( "metadata", JSONType, nullable=True ),
+ Column( "includes_datatypes", Boolean, index=True, default=False ),
+ Column( "update_available", Boolean, default=False ),
Column( "deleted", Boolean, index=True, default=False ) )
Job.table = Table( "job", metadata,
diff -r 87dc2336c1892878b26a110bb1857722fd3f4903 -r 5340450cdab2a7887d8d9e9186ae438d7b2aee93 lib/galaxy/model/migrate/versions/0086_add_tool_shed_repository_table_columns.py
--- /dev/null
+++ b/lib/galaxy/model/migrate/versions/0086_add_tool_shed_repository_table_columns.py
@@ -0,0 +1,80 @@
+"""
+Migration script to add the metadata, update_available and includes_datatypes columns to the tool_shed_repository table.
+"""
+
+from sqlalchemy import *
+from sqlalchemy.orm import *
+from migrate import *
+from migrate.changeset import *
+
+import datetime
+now = datetime.datetime.utcnow
+# Need our custom types, but don't import anything else from model
+from galaxy.model.custom_types import *
+
+import sys, logging
+log = logging.getLogger( __name__ )
+log.setLevel(logging.DEBUG)
+handler = logging.StreamHandler( sys.stdout )
+format = "%(name)s %(levelname)s %(asctime)s %(message)s"
+formatter = logging.Formatter( format )
+handler.setFormatter( formatter )
+log.addHandler( handler )
+
+metadata = MetaData( migrate_engine )
+db_session = scoped_session( sessionmaker( bind=migrate_engine, autoflush=False, autocommit=True ) )
+
+def upgrade():
+ print __doc__
+ metadata.reflect()
+ ToolShedRepository_table = Table( "tool_shed_repository", metadata, autoload=True )
+ c = Column( "metadata", JSONType(), nullable=True )
+ try:
+ c.create( ToolShedRepository_table )
+ assert c is ToolShedRepository_table.c.metadata
+ except Exception, e:
+ print "Adding metadata column to the tool_shed_repository table failed: %s" % str( e )
+ log.debug( "Adding metadata column to the tool_shed_repository table failed: %s" % str( e ) )
+ c = Column( "includes_datatypes", Boolean, index=True, default=False )
+ try:
+ c.create( ToolShedRepository_table )
+ assert c is ToolShedRepository_table.c.includes_datatypes
+ if migrate_engine.name == 'mysql' or migrate_engine.name == 'sqlite':
+ default_false = "0"
+ elif migrate_engine.name == 'postgres':
+ default_false = "false"
+ db_session.execute( "UPDATE tool_shed_repository SET includes_datatypes=%s" % default_false )
+ except Exception, e:
+ print "Adding includes_datatypes column to the tool_shed_repository table failed: %s" % str( e )
+ log.debug( "Adding includes_datatypes column to the tool_shed_repository table failed: %s" % str( e ) )
+ c = Column( "update_available", Boolean, default=False )
+ try:
+ c.create( ToolShedRepository_table )
+ assert c is ToolShedRepository_table.c.update_available
+ if migrate_engine.name == 'mysql' or migrate_engine.name == 'sqlite':
+ default_false = "0"
+ elif migrate_engine.name == 'postgres':
+ default_false = "false"
+ db_session.execute( "UPDATE tool_shed_repository SET update_available=%s" % default_false )
+ except Exception, e:
+ print "Adding update_available column to the tool_shed_repository table failed: %s" % str( e )
+ log.debug( "Adding update_available column to the tool_shed_repository table failed: %s" % str( e ) )
+
+def downgrade():
+ metadata.reflect()
+ ToolShedRepository_table = Table( "tool_shed_repository", metadata, autoload=True )
+ try:
+ ToolShedRepository_table.c.metadata.drop()
+ except Exception, e:
+ print "Dropping column metadata from the tool_shed_repository table failed: %s" % str( e )
+ log.debug( "Dropping column metadata from the tool_shed_repository table failed: %s" % str( e ) )
+ try:
+ ToolShedRepository_table.c.includes_datatypes.drop()
+ except Exception, e:
+ print "Dropping column includes_datatypes from the tool_shed_repository table failed: %s" % str( e )
+ log.debug( "Dropping column includes_datatypes from the tool_shed_repository table failed: %s" % str( e ) )
+ try:
+ ToolShedRepository_table.c.update_available.drop()
+ except Exception, e:
+ print "Dropping column update_available from the tool_shed_repository table failed: %s" % str( e )
+ log.debug( "Dropping column update_available from the tool_shed_repository table failed: %s" % str( e ) )
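The migration script above has to pick the SQL literal for boolean false per database dialect when backfilling the new columns. A minimal sketch of that dispatch (the function name is mine; the dialect names follow the script, which predates SQLAlchemy's rename of 'postgres' to 'postgresql'):

```python
def boolean_false_literal(dialect_name):
    # mysql and sqlite store booleans as integers, so false is "0";
    # postgres has a native boolean type and accepts "false".
    if dialect_name in ('mysql', 'sqlite'):
        return "0"
    if dialect_name == 'postgres':
        return "false"
    raise ValueError("unsupported dialect: %s" % dialect_name)
```

One design note: the original script leaves `default_false` undefined on an unrecognized dialect and fails later at `execute`; raising immediately, as here, surfaces the problem at the decision point.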
diff -r 87dc2336c1892878b26a110bb1857722fd3f4903 -r 5340450cdab2a7887d8d9e9186ae438d7b2aee93 lib/galaxy/web/controllers/admin.py
--- a/lib/galaxy/web/controllers/admin.py
+++ b/lib/galaxy/web/controllers/admin.py
@@ -5,13 +5,13 @@
from galaxy.tools.search import ToolBoxSearch
from galaxy.tools import ToolSection, json_fix
from galaxy.util import parse_xml, inflector
-import logging
-log = logging.getLogger( __name__ )
-
from galaxy.actions.admin import AdminActions
from galaxy.web.params import QuotaParamParser
from galaxy.exceptions import *
import galaxy.datatypes.registry
+import logging
+
+log = logging.getLogger( __name__ )
class UserListGrid( grids.Grid ):
class EmailColumn( grids.TextColumn ):
@@ -383,63 +383,12 @@
preserve_state = False
use_paging = True
-class RepositoryListGrid( grids.Grid ):
- class NameColumn( grids.TextColumn ):
- def get_value( self, trans, grid, tool_shed_repository ):
- return tool_shed_repository.name
- class DescriptionColumn( grids.TextColumn ):
- def get_value( self, trans, grid, tool_shed_repository ):
- return tool_shed_repository.description
- class OwnerColumn( grids.TextColumn ):
- def get_value( self, trans, grid, tool_shed_repository ):
- return tool_shed_repository.owner
- class RevisionColumn( grids.TextColumn ):
- def get_value( self, trans, grid, tool_shed_repository ):
- return tool_shed_repository.changeset_revision
- class ToolShedColumn( grids.TextColumn ):
- def get_value( self, trans, grid, tool_shed_repository ):
- return tool_shed_repository.tool_shed
- # Grid definition
- title = "Installed tool shed repositories"
- model_class = model.ToolShedRepository
- template='/admin/tool_shed_repository/grid.mako'
- default_sort_key = "name"
- columns = [
- NameColumn( "Name",
- key="name",
- link=( lambda item: dict( operation="manage_repository", id=item.id, webapp="galaxy" ) ),
- attach_popup=False ),
- DescriptionColumn( "Description" ),
- OwnerColumn( "Owner" ),
- RevisionColumn( "Revision" ),
- ToolShedColumn( "Tool shed" ),
- # Columns that are valid for filtering but are not visible.
- grids.DeletedColumn( "Deleted",
- key="deleted",
- visible=False,
- filterable="advanced" )
- ]
- columns.append( grids.MulticolFilterColumn( "Search repository name",
- cols_to_filter=[ columns[0] ],
- key="free-text-search",
- visible=False,
- filterable="standard" ) )
- standard_filters = []
- default_filter = dict( deleted="False" )
- num_rows_per_page = 50
- preserve_state = False
- use_paging = True
- def build_initial_query( self, trans, **kwd ):
- return trans.sa_session.query( self.model_class ) \
- .filter( self.model_class.table.c.deleted == False )
-
class AdminGalaxy( BaseUIController, Admin, AdminActions, UsesQuota, QuotaParamParser ):
user_list_grid = UserListGrid()
role_list_grid = RoleListGrid()
group_list_grid = GroupListGrid()
quota_list_grid = QuotaListGrid()
- repository_list_grid = RepositoryListGrid()
delete_operation = grids.GridOperation( "Delete", condition=( lambda item: not item.deleted ), allow_multiple=True )
undelete_operation = grids.GridOperation( "Undelete", condition=( lambda item: item.deleted and not item.purged ), allow_multiple=True )
purge_operation = grids.GridOperation( "Purge", condition=( lambda item: item.deleted and not item.purged ), allow_multiple=True )
@@ -676,391 +625,6 @@
return quota, params
@web.expose
@web.require_admin
- def browse_tool_shed_repository( self, trans, **kwd ):
- params = util.Params( kwd )
- message = util.restore_text( params.get( 'message', '' ) )
- status = params.get( 'status', 'done' )
- repository = get_repository( trans, kwd[ 'id' ] )
- relative_install_dir = self.__get_relative_install_dir( trans, repository )
- repo_files_dir = os.path.abspath( os.path.join( relative_install_dir, repository.name ) )
- tool_dicts = []
- workflow_dicts = []
- for root, dirs, files in os.walk( repo_files_dir ):
- if not root.find( '.hg' ) >= 0 and not root.find( 'hgrc' ) >= 0:
- if '.hg' in dirs:
- # Don't visit .hg directories.
- dirs.remove( '.hg' )
- if 'hgrc' in files:
- # Don't include hgrc files.
- files.remove( 'hgrc' )
- for name in files:
- # Find all tool configs.
- if name.endswith( '.xml' ):
- try:
- full_path = os.path.abspath( os.path.join( root, name ) )
- tool = trans.app.toolbox.load_tool( full_path )
- if tool is not None:
- tool_config = os.path.join( root, name )
- # Handle tool.requirements.
- tool_requirements = []
- for tr in tool.requirements:
- name=tr.name
- type=tr.type
- if type == 'fabfile':
- version = None
- fabfile = tr.fabfile
- method = tr.method
- else:
- version = tr.version
- fabfile = None
- method = None
- requirement_dict = dict( name=name,
- type=type,
- version=version,
- fabfile=fabfile,
- method=method )
- tool_requirements.append( requirement_dict )
- tool_dict = dict( id=tool.id,
- old_id=tool.old_id,
- name=tool.name,
- version=tool.version,
- description=tool.description,
- requirements=tool_requirements,
- tool_config=tool_config )
- tool_dicts.append( tool_dict )
- except Exception, e:
- # The file is not a Galaxy tool config.
- pass
- # Find all exported workflows
- elif name.endswith( '.ga' ):
- try:
- full_path = os.path.abspath( os.path.join( root, name ) )
- # Convert workflow data from json
- fp = open( full_path, 'rb' )
- workflow_text = fp.read()
- fp.close()
- workflow_dict = from_json_string( workflow_text )
- if workflow_dict[ 'a_galaxy_workflow' ] == 'true':
- workflow_dicts.append( dict( full_path=full_path, workflow_dict=workflow_dict ) )
- except Exception, e:
- # The file is not a Galaxy workflow.
- pass
- return trans.fill_template( '/admin/tool_shed_repository/browse_repository.mako',
- repository=repository,
- tool_dicts=tool_dicts,
- workflow_dicts=workflow_dicts,
- message=message,
- status=status )
- @web.expose
- @web.require_admin
- def browse_tool_shed_repositories( self, trans, **kwd ):
- if 'operation' in kwd:
- operation = kwd.pop( 'operation' ).lower()
- if operation == "manage_repository":
- return self.manage_tool_shed_repository( trans, **kwd )
- # Render the list view
- return self.repository_list_grid( trans, **kwd )
- @web.expose
- @web.require_admin
- def browse_tool_sheds( self, trans, **kwd ):
- params = util.Params( kwd )
- message = util.restore_text( params.get( 'message', '' ) )
- status = params.get( 'status', 'done' )
- return trans.fill_template( '/webapps/galaxy/admin/tool_sheds.mako',
- webapp='galaxy',
- message=message,
- status='error' )
- @web.expose
- @web.require_admin
- def find_tools_in_tool_shed( self, trans, **kwd ):
- tool_shed_url = kwd[ 'tool_shed_url' ]
- galaxy_url = url_for( '', qualified=True )
- url = '%s/repository/find_tools?galaxy_url=%s&webapp=galaxy' % ( tool_shed_url, galaxy_url )
- return trans.response.send_redirect( url )
- @web.expose
- @web.require_admin
- def find_workflows_in_tool_shed( self, trans, **kwd ):
- tool_shed_url = kwd[ 'tool_shed_url' ]
- galaxy_url = url_for( '', qualified=True )
- url = '%s/repository/find_workflows?galaxy_url=%s&webapp=galaxy' % ( tool_shed_url, galaxy_url )
- return trans.response.send_redirect( url )
- @web.expose
- @web.require_admin
- def browse_tool_shed( self, trans, **kwd ):
- tool_shed_url = kwd[ 'tool_shed_url' ]
- galaxy_url = url_for( '', qualified=True )
- url = '%s/repository/browse_valid_repositories?galaxy_url=%s&webapp=galaxy' % ( tool_shed_url, galaxy_url )
- return trans.response.send_redirect( url )
- @web.expose
- @web.require_admin
- def install_tool_shed_repository( self, trans, **kwd ):
- if not trans.app.toolbox.shed_tool_confs:
- message = 'The <b>tool_config_file</b> setting in <b>universe_wsgi.ini</b> must include at least one shed tool configuration file name with a '
- message += '<b><toolbox></b> tag that includes a <b>tool_path</b> attribute value which is a directory relative to the Galaxy installation '
- message += 'directory in order to automatically install tools from a Galaxy tool shed (e.g., the file name <b>shed_tool_conf.xml</b> whose '
- message += '<b><toolbox></b> tag is <b><toolbox tool_path="../shed_tools"></b>).<p/>See the '
- message += '<a href="http://wiki.g2.bx.psu.edu/Tool%20Shed#Automatic_installation_of_Galaxy_tool…" '
- message += 'target="_blank">Automatic installation of Galaxy tool shed repository tools into a local Galaxy instance</a> section of the '
- message += '<a href="http://wiki.g2.bx.psu.edu/Tool%20Shed" target="_blank">Galaxy tool shed wiki</a> for all of the details.'
- return trans.show_error_message( message )
- message = kwd.get( 'message', '' )
- status = kwd.get( 'status', 'done' )
- tool_shed_url = kwd[ 'tool_shed_url' ]
- repo_info_dict = kwd[ 'repo_info_dict' ]
- new_tool_panel_section = kwd.get( 'new_tool_panel_section', '' )
- tool_panel_section = kwd.get( 'tool_panel_section', '' )
- if kwd.get( 'select_tool_panel_section_button', False ):
- shed_tool_conf = kwd[ 'shed_tool_conf' ]
- # Get the tool path.
- for k, tool_path in trans.app.toolbox.shed_tool_confs.items():
- if k == shed_tool_conf:
- break
- if new_tool_panel_section or tool_panel_section:
- if new_tool_panel_section:
- section_id = new_tool_panel_section.lower().replace( ' ', '_' )
- new_section_key = 'section_%s' % str( section_id )
- if new_section_key in trans.app.toolbox.tool_panel:
- # Appending a tool to an existing section in trans.app.toolbox.tool_panel
- log.debug( "Appending to tool panel section: %s" % new_tool_panel_section )
- tool_section = trans.app.toolbox.tool_panel[ new_section_key ]
- else:
- # Appending a new section to trans.app.toolbox.tool_panel
- log.debug( "Loading new tool panel section: %s" % new_tool_panel_section )
- elem = Element( 'section' )
- elem.attrib[ 'name' ] = new_tool_panel_section
- elem.attrib[ 'id' ] = section_id
- tool_section = ToolSection( elem )
- trans.app.toolbox.tool_panel[ new_section_key ] = tool_section
- else:
- section_key = 'section_%s' % tool_panel_section
- tool_section = trans.app.toolbox.tool_panel[ section_key ]
- # Decode the encoded repo_info_dict param value.
- repo_info_dict = tool_shed_decode( repo_info_dict )
- # Clone the repository to the configured location.
- current_working_dir = os.getcwd()
- installed_repository_names = []
- for name, repo_info_tuple in repo_info_dict.items():
- description, repository_clone_url, changeset_revision = repo_info_tuple
- clone_dir = os.path.join( tool_path, self.__generate_tool_path( repository_clone_url, changeset_revision ) )
- if os.path.exists( clone_dir ):
- # Repository and revision has already been cloned.
- # TODO: implement the ability to re-install or revert an existing repository.
- message += 'Revision <b>%s</b> of repository <b>%s</b> was previously installed.<br/>' % ( changeset_revision, name )
- else:
- os.makedirs( clone_dir )
- log.debug( 'Cloning %s...' % repository_clone_url )
- cmd = 'hg clone %s' % repository_clone_url
- tmp_name = tempfile.NamedTemporaryFile().name
- tmp_stderr = open( tmp_name, 'wb' )
- os.chdir( clone_dir )
- proc = subprocess.Popen( args=cmd, shell=True, stderr=tmp_stderr.fileno() )
- returncode = proc.wait()
- os.chdir( current_working_dir )
- tmp_stderr.close()
- if returncode == 0:
- # Add a new record to the tool_shed_repository table if one doesn't
- # already exist. If one exists but is marked deleted, undelete it.
- self.__create_or_undelete_tool_shed_repository( trans, name, description, changeset_revision, repository_clone_url )
- # Update the cloned repository to changeset_revision.
- repo_files_dir = os.path.join( clone_dir, name )
- log.debug( 'Updating cloned repository to revision "%s"...' % changeset_revision )
- cmd = 'hg update -r %s' % changeset_revision
- tmp_name = tempfile.NamedTemporaryFile().name
- tmp_stderr = open( tmp_name, 'wb' )
- os.chdir( repo_files_dir )
- proc = subprocess.Popen( cmd, shell=True, stderr=tmp_stderr.fileno() )
- returncode = proc.wait()
- os.chdir( current_working_dir )
- tmp_stderr.close()
- if returncode == 0:
- # Load data types required by tools.
- self.__load_datatypes( trans, repo_files_dir )
- # Load tools and tool data files required by them.
- sample_files, repository_tools_tups = self.__get_repository_tools_and_sample_files( trans, tool_path, repo_files_dir )
- if repository_tools_tups:
- # Handle missing data table entries for tool parameters that are dynamically generated select lists.
- repository_tools_tups = self.__handle_missing_data_table_entry( trans, tool_path, sample_files, repository_tools_tups )
- # Handle missing index files for tool parameters that are dynamically generated select lists.
- repository_tools_tups = self.__handle_missing_index_file( trans, tool_path, sample_files, repository_tools_tups )
- # Handle tools that use fabric scripts to install dependencies.
- self.__handle_tool_dependencies( current_working_dir, repo_files_dir, repository_tools_tups )
- # Generate an in-memory tool conf section that includes the new tools.
- new_tool_section = self.__generate_tool_panel_section( name,
- repository_clone_url,
- changeset_revision,
- tool_section,
- repository_tools_tups )
- # Create a temporary file to persist the in-memory tool section
- # TODO: Figure out how to do this in-memory using xml.etree.
- tmp_name = tempfile.NamedTemporaryFile().name
- persisted_new_tool_section = open( tmp_name, 'wb' )
- persisted_new_tool_section.write( new_tool_section )
- persisted_new_tool_section.close()
- # Parse the persisted tool panel section
- tree = parse_xml( tmp_name )
- root = tree.getroot()
- # Load the tools in the section into the tool panel.
- trans.app.toolbox.load_section_tag_set( root, trans.app.toolbox.tool_panel, tool_path )
- # Remove the temporary file
- try:
- os.unlink( tmp_name )
- except:
- pass
- # Append the new section to the shed_tool_config file.
- self.__add_shed_tool_conf_entry( trans, shed_tool_conf, new_tool_section )
- if trans.app.toolbox_search.enabled:
- # If search support for tools is enabled, index the new installed tools.
- trans.app.toolbox_search = ToolBoxSearch( trans.app.toolbox )
- installed_repository_names.append( name )
- else:
- tmp_stderr = open( tmp_name, 'rb' )
- message += '%s<br/>' % tmp_stderr.read()
- tmp_stderr.close()
- status = 'error'
- else:
- tmp_stderr = open( tmp_name, 'rb' )
- message += '%s<br/>' % tmp_stderr.read()
- tmp_stderr.close()
- status = 'error'
- if installed_repository_names:
- installed_repository_names.sort()
- num_repositories_installed = len( installed_repository_names )
- message += 'Installed %d %s and all tools were loaded into tool panel section <b>%s</b>:<br/>Installed repositories: ' % \
- ( num_repositories_installed, inflector.cond_plural( num_repositories_installed, 'repository' ), tool_section.name )
- for i, repo_name in enumerate( installed_repository_names ):
- if i == len( installed_repository_names ) -1:
- message += '%s.<br/>' % repo_name
- else:
- message += '%s, ' % repo_name
- return trans.response.send_redirect( web.url_for( controller='admin',
- action='browse_tool_shed_repositories',
- message=message,
- status=status ) )
- else:
- message = 'Choose the section in your tool panel to contain the installed tools.'
- status = 'error'
- if len( trans.app.toolbox.shed_tool_confs.keys() ) > 1:
- shed_tool_conf_select_field = build_shed_tool_conf_select_field( trans )
- shed_tool_conf = None
- else:
- shed_tool_conf = trans.app.toolbox.shed_tool_confs.keys()[0].lstrip( './' )
- shed_tool_conf_select_field = None
- tool_panel_section_select_field = build_tool_panel_section_select_field( trans )
- return trans.fill_template( '/admin/select_tool_panel_section.mako',
- tool_shed_url=tool_shed_url,
- repo_info_dict=repo_info_dict,
- shed_tool_conf=shed_tool_conf,
- shed_tool_conf_select_field=shed_tool_conf_select_field,
- tool_panel_section_select_field=tool_panel_section_select_field,
- new_tool_panel_section=new_tool_panel_section,
- message=message,
- status=status )
- @web.expose
- @web.require_admin
- def manage_tool_shed_repository( self, trans, **kwd ):
- params = util.Params( kwd )
- message = util.restore_text( params.get( 'message', '' ) )
- status = params.get( 'status', 'done' )
- repository_id = params.get( 'id', None )
- repository = get_repository( trans, repository_id )
- description = util.restore_text( params.get( 'description', repository.description ) )
- if params.get( 'edit_repository_button', False ):
- if description != repository.description:
- repository.description = description
- trans.sa_session.add( repository )
- trans.sa_session.flush()
- message = "The repository information has been updated."
- relative_install_dir = self.__get_relative_install_dir( trans, repository )
- if relative_install_dir:
- repo_files_dir = os.path.abspath( os.path.join( relative_install_dir, repository.name ) )
- else:
- repo_files_dir = 'unknown'
- return trans.fill_template( '/admin/tool_shed_repository/manage_repository.mako',
- repository=repository,
- description=description,
- repo_files_dir=repo_files_dir,
- message=message,
- status=status )
- @web.expose
- @web.require_admin
- def check_for_updates( self, trans, **kwd ):
- # Send a request to the relevant tool shed to see if there are any updates.
- repository = get_repository( trans, kwd[ 'id' ] )
- tool_shed_url = get_url_from_repository_tool_shed( trans, repository )
- url = '%s/repository/check_for_updates?galaxy_url=%s&name=%s&owner=%s&changeset_revision=%s&webapp=galaxy' % \
- ( tool_shed_url, url_for( '', qualified=True ), repository.name, repository.owner, repository.changeset_revision )
- return trans.response.send_redirect( url )
- @web.expose
- @web.require_admin
- def update_to_changeset_revision( self, trans, **kwd ):
- """Update a cloned repository to the latest revision possible."""
- params = util.Params( kwd )
- message = util.restore_text( params.get( 'message', '' ) )
- status = params.get( 'status', 'done' )
- tool_shed_url = kwd[ 'tool_shed_url' ]
- name = params.get( 'name', None )
- owner = params.get( 'owner', None )
- changeset_revision = params.get( 'changeset_revision', None )
- latest_changeset_revision = params.get( 'latest_changeset_revision', None )
- repository = get_repository_by_shed_name_owner_changeset_revision( trans, tool_shed_url, name, owner, changeset_revision )
- if changeset_revision and latest_changeset_revision:
- if changeset_revision == latest_changeset_revision:
- message = "The cloned tool shed repository named '%s' is current (there are no updates available)." % name
- else:
- current_working_dir = os.getcwd()
- relative_install_dir = self.__get_relative_install_dir( trans, repository )
- if relative_install_dir:
- # Update the cloned repository to changeset_revision.
- repo_files_dir = os.path.join( relative_install_dir, name )
- log.debug( "Updating cloned repository named '%s' from revision '%s' to revision '%s'..." % \
- ( name, changeset_revision, latest_changeset_revision ) )
- cmd = 'hg pull'
- tmp_name = tempfile.NamedTemporaryFile().name
- tmp_stderr = open( tmp_name, 'wb' )
- os.chdir( repo_files_dir )
- proc = subprocess.Popen( cmd, shell=True, stderr=tmp_stderr.fileno() )
- returncode = proc.wait()
- os.chdir( current_working_dir )
- tmp_stderr.close()
- if returncode == 0:
- cmd = 'hg update -r %s' % latest_changeset_revision
- tmp_name = tempfile.NamedTemporaryFile().name
- tmp_stderr = open( tmp_name, 'wb' )
- os.chdir( repo_files_dir )
- proc = subprocess.Popen( cmd, shell=True, stderr=tmp_stderr.fileno() )
- returncode = proc.wait()
- os.chdir( current_working_dir )
- tmp_stderr.close()
- if returncode == 0:
- # Update the repository changeset_revision in the database.
- repository.changeset_revision = latest_changeset_revision
- trans.sa_session.add( repository )
- trans.sa_session.flush()
- message = "The cloned repository named '%s' has been updated to change set revision '%s'." % \
- ( name, latest_changeset_revision )
- else:
- tmp_stderr = open( tmp_name, 'rb' )
- message = tmp_stderr.read()
- tmp_stderr.close()
- status = 'error'
- else:
- tmp_stderr = open( tmp_name, 'rb' )
- message = tmp_stderr.read()
- tmp_stderr.close()
- status = 'error'
- else:
- message = "The directory containing the cloned repository named '%s' cannot be found." % name
- status = 'error'
- else:
- message = "The latest changeset revision could not be retrieved for the repository named '%s'." % name
- status = 'error'
- return trans.response.send_redirect( web.url_for( controller='admin',
- action='manage_tool_shed_repository',
- id=trans.security.encode_id( repository.id ),
- message=message,
- status=status ) )
- @web.expose
- @web.require_admin
def impersonate( self, trans, email=None, **kwd ):
if not trans.app.config.allow_user_impersonation:
return trans.show_error_message( "User impersonation is not enabled in this instance of Galaxy." )
@@ -1079,354 +643,4 @@
if emails is None:
emails = [ u.email for u in trans.sa_session.query( trans.app.model.User ).enable_eagerloads( False ).all() ]
return trans.fill_template( 'admin/impersonate.mako', emails=emails, message=message, status=status )
-
- def __get_relative_install_dir( self, trans, repository ):
- # Get the directory where the repository is installed.
- tool_shed = self.__clean_tool_shed_url( repository.tool_shed )
- partial_install_dir = '%s/repos/%s/%s/%s' % ( tool_shed, repository.owner, repository.name, repository.changeset_revision )
- # Get the relative tool installation paths from each of the shed tool configs.
- shed_tool_confs = trans.app.toolbox.shed_tool_confs
- relative_install_dir = None
- # The shed_tool_confs dictionary contains { shed_conf_filename : tool_path } pairs.
- for shed_conf_filename, tool_path in shed_tool_confs.items():
- relative_install_dir = os.path.join( tool_path, partial_install_dir )
- if os.path.isdir( relative_install_dir ):
- break
- return relative_install_dir
- def __handle_missing_data_table_entry( self, trans, tool_path, sample_files, repository_tools_tups ):
- # Inspect each tool to see if any have input parameters that are dynamically
- # generated select lists that require entries in the tool_data_table_conf.xml file.
- missing_data_table_entry = False
- for index, repository_tools_tup in enumerate( repository_tools_tups ):
- tup_path, repository_tool = repository_tools_tup
- if repository_tool.params_with_missing_data_table_entry:
- missing_data_table_entry = True
- break
- if missing_data_table_entry:
- # The repository must contain a tool_data_table_conf.xml.sample file that includes
- # all required entries for all tools in the repository.
- for sample_file in sample_files:
- head, tail = os.path.split( sample_file )
- if tail == 'tool_data_table_conf.xml.sample':
- break
- error, correction_msg = handle_sample_tool_data_table_conf_file( trans, sample_file )
- if error:
- # TODO: Do more here than logging an exception.
- log.debug( correction_msg )
- # Reload the tool into the local list of repository_tools_tups.
- repository_tool = trans.app.toolbox.load_tool( os.path.join( tool_path, tup_path ) )
- repository_tools_tups[ index ] = ( tup_path, repository_tool )
- return repository_tools_tups
- def __handle_missing_index_file( self, trans, tool_path, sample_files, repository_tools_tups ):
- # Inspect each tool to see if it has any input parameters that
- # are dynamically generated select lists that depend on a .loc file.
- missing_files_handled = []
- for index, repository_tools_tup in enumerate( repository_tools_tups ):
- tup_path, repository_tool = repository_tools_tup
- params_with_missing_index_file = repository_tool.params_with_missing_index_file
- for param in params_with_missing_index_file:
- options = param.options
- missing_head, missing_tail = os.path.split( options.missing_index_file )
- if missing_tail not in missing_files_handled:
- # The repository must contain the required xxx.loc.sample file.
- for sample_file in sample_files:
- sample_head, sample_tail = os.path.split( sample_file )
- if sample_tail == '%s.sample' % missing_tail:
- copy_sample_loc_file( trans, sample_file )
- if options.tool_data_table and options.tool_data_table.missing_index_file:
- options.tool_data_table.handle_found_index_file( options.missing_index_file )
- missing_files_handled.append( missing_tail )
- break
- # Reload the tool into the local list of repository_tools_tups.
- repository_tool = trans.app.toolbox.load_tool( os.path.join( tool_path, tup_path ) )
- repository_tools_tups[ index ] = ( tup_path, repository_tool )
- return repository_tools_tups
- def __handle_tool_dependencies( self, current_working_dir, repo_files_dir, repository_tools_tups ):
- # Inspect each tool to see if it includes a "requirement" that refers to a fabric
- # script. For those that do, execute the fabric script to install tool dependencies.
- for index, repository_tools_tup in enumerate( repository_tools_tups ):
- tup_path, repository_tool = repository_tools_tup
- for requirement in repository_tool.requirements:
- if requirement.type == 'fabfile':
- log.debug( 'Executing fabric script to install dependencies for tool "%s"...' % repository_tool.name )
- fabfile = requirement.fabfile
- method = requirement.method
- # Find the relative path to the fabfile.
- relative_fabfile_path = None
- for root, dirs, files in os.walk( repo_files_dir ):
- for name in files:
- if name == fabfile:
- relative_fabfile_path = os.path.join( root, name )
- break
- if relative_fabfile_path:
- # cmd will look something like: fab -f fabfile.py install_bowtie
- cmd = 'fab -f %s %s' % ( relative_fabfile_path, method )
- tmp_name = tempfile.NamedTemporaryFile().name
- tmp_stderr = open( tmp_name, 'wb' )
- os.chdir( repo_files_dir )
- proc = subprocess.Popen( cmd, shell=True, stderr=tmp_stderr.fileno() )
- returncode = proc.wait()
- os.chdir( current_working_dir )
- tmp_stderr.close()
- if returncode != 0:
- # TODO: do something more here than logging the problem.
- tmp_stderr = open( tmp_name, 'rb' )
- error = tmp_stderr.read()
- tmp_stderr.close()
- log.debug( 'Problem installing dependencies for tool "%s"\n%s' % ( repository_tool.name, error ) )
- def __load_datatypes( self, trans, repo_files_dir ):
- # Find datatypes_conf.xml if it exists.
- datatypes_config = None
- for root, dirs, files in os.walk( repo_files_dir ):
- if root.find( '.hg' ) < 0:
- for name in files:
- if name == 'datatypes_conf.xml':
- datatypes_config = os.path.abspath( os.path.join( root, name ) )
- break
- if datatypes_config:
- imported_module = None
- # Parse datatypes_config.
- tree = parse_xml( datatypes_config )
- datatypes_config_root = tree.getroot()
- relative_path_to_datatype_file_name = None
- datatype_files = datatypes_config_root.find( 'datatype_files' )
- # Currently only a single datatype_file is supported. For example:
- # <datatype_files>
- # <datatype_file name="gmap.py"/>
- # </datatype_files>
- for elem in datatype_files.findall( 'datatype_file' ):
- datatype_file_name = elem.get( 'name', None )
- if datatype_file_name:
- # Find the file in the installed repository.
- for root, dirs, files in os.walk( repo_files_dir ):
- if root.find( '.hg' ) < 0:
- for name in files:
- if name == datatype_file_name:
- relative_path_to_datatype_file_name = os.path.join( root, name )
- break
- break
- if relative_path_to_datatype_file_name:
- relative_head, relative_tail = os.path.split( relative_path_to_datatype_file_name )
- registration = datatypes_config_root.find( 'registration' )
- # Get the module by parsing the <datatype> tag.
- for elem in registration.findall( 'datatype' ):
- # A 'type' attribute is currently required. The attribute
- # should be something like: type="gmap:GmapDB".
- dtype = elem.get( 'type', None )
- if dtype:
- fields = dtype.split( ':' )
- datatype_module = fields[0]
- datatype_class_name = fields[1]
- # Since we currently support only a single datatype_file,
- # we have what we need.
- break
- try:
- sys.path.insert( 0, relative_head )
- imported_module = __import__( datatype_module )
- sys.path.pop( 0 )
- except Exception, e:
- log.debug( "Exception importing datatypes code file included in installed repository: %s" % str( e ) )
- trans.app.datatypes_registry.load_datatypes( root_dir=trans.app.config.root, config=datatypes_config, imported_module=imported_module )
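`__load_datatypes` imports the repository's datatype code file by temporarily prepending its directory to `sys.path` and calling `__import__`, then popping the path entry again. That import step can be isolated into a small helper; a sketch of the same `sys.path.insert` / `__import__` / `sys.path.pop` dance (the helper name is ours):

```python
import os
import sys
import tempfile

def import_module_from_dir(module_name, directory):
    """Temporarily prepend `directory` to sys.path and import `module_name`,
    mirroring the dynamic import performed in __load_datatypes. The finally
    block guarantees the path entry is removed even if the import raises."""
    sys.path.insert(0, directory)
    try:
        return __import__(module_name)
    finally:
        sys.path.pop(0)
```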
- def __get_repository_tools_and_sample_files( self, trans, tool_path, repo_files_dir ):
- # The sample_files list contains all files whose name ends in .sample
- sample_files = []
- # Find all special .sample files first.
- for root, dirs, files in os.walk( repo_files_dir ):
- if root.find( '.hg' ) < 0:
- for name in files:
- if name.endswith( '.sample' ):
- sample_files.append( os.path.abspath( os.path.join( root, name ) ) )
- # The repository_tools_tups list contains tuples of ( relative_path_to_tool_config, tool ) pairs
- repository_tools_tups = []
- for root, dirs, files in os.walk( repo_files_dir ):
- if root.find( '.hg' ) < 0 and root.find( 'hgrc' ) < 0:
- if '.hg' in dirs:
- dirs.remove( '.hg' )
- if 'hgrc' in files:
- files.remove( 'hgrc' )
- for name in files:
- # Find all tool configs.
- if name.endswith( '.xml' ):
- relative_path = os.path.join( root, name )
- full_path = os.path.abspath( os.path.join( root, name ) )
- try:
- repository_tool = trans.app.toolbox.load_tool( full_path )
- if repository_tool:
- # At this point, we need to lstrip tool_path from relative_path.
- tup_path = relative_path.replace( tool_path, '' ).lstrip( '/' )
- repository_tools_tups.append( ( tup_path, repository_tool ) )
- except Exception, e:
- # We have an invalid .xml file, so not a tool config.
- log.debug( "Ignoring invalid tool config (%s). Error: %s" % ( str( relative_path ), str( e ) ) )
- return sample_files, repository_tools_tups
- def __create_or_undelete_tool_shed_repository( self, trans, name, description, changeset_revision, repository_clone_url ):
- tmp_url = self.__clean_repository_clone_url( repository_clone_url )
- tool_shed = tmp_url.split( 'repos' )[ 0 ].rstrip( '/' )
- owner = self.__get_repository_owner( tmp_url )
- flush_needed = False
- tool_shed_repository = get_repository_by_shed_name_owner_changeset_revision( trans, tool_shed, name, owner, changeset_revision )
- if tool_shed_repository:
- if tool_shed_repository.deleted:
- tool_shed_repository.deleted = False
- flush_needed = True
- else:
- tool_shed_repository = trans.model.ToolShedRepository( tool_shed=tool_shed,
- name=name,
- description=description,
- owner=owner,
- changeset_revision=changeset_revision )
- flush_needed = True
- if flush_needed:
- trans.sa_session.add( tool_shed_repository )
- trans.sa_session.flush()
- def __add_shed_tool_conf_entry( self, trans, shed_tool_conf, new_tool_section ):
- # Add an entry in the shed_tool_conf file. An entry looks something like:
- # <section name="Filter and Sort" id="filter">
- # <tool file="filter/filtering.xml" guid="toolshed.g2.bx.psu.edu/repos/test/filter/1.0.2"/>
- # </section>
- # Make a backup of the shed_tool_conf file since we're going to be changing it.
- if not os.path.exists( shed_tool_conf ):
- output = open( shed_tool_conf, 'w' )
- output.write( '<?xml version="1.0"?>\n' )
- output.write( '<toolbox tool_path="%s">\n' % tool_path )
- output.write( '</toolbox>\n' )
- output.close()
- self.__make_shed_tool_conf_copy( trans, shed_tool_conf )
- tmp_fd, tmp_fname = tempfile.mkstemp()
- new_shed_tool_conf = open( tmp_fname, 'wb' )
- for i, line in enumerate( open( shed_tool_conf ) ):
- if line.startswith( '</toolbox>' ):
- # We're at the end of the original config file, so add our entry.
- new_shed_tool_conf.write( new_tool_section )
- new_shed_tool_conf.write( line )
- else:
- new_shed_tool_conf.write( line )
- new_shed_tool_conf.close()
- shutil.move( tmp_fname, os.path.abspath( shed_tool_conf ) )
- def __make_shed_tool_conf_copy( self, trans, shed_tool_conf ):
- # Make a backup of the shed_tool_conf file.
- today = date.today()
- backup_date = today.strftime( "%Y_%m_%d" )
- shed_tool_conf_copy = '%s/%s_%s_backup' % ( trans.app.config.root, shed_tool_conf, backup_date )
- shutil.copy( os.path.abspath( shed_tool_conf ), os.path.abspath( shed_tool_conf_copy ) )
- def __clean_tool_shed_url( self, tool_shed_url ):
- if tool_shed_url.find( ':' ) > 0:
- # Eliminate the port, if any, since it will result in an invalid directory name.
- return tool_shed_url.split( ':' )[ 0 ]
- return tool_shed_url.rstrip( '/' )
- def __clean_repository_clone_url( self, repository_clone_url ):
- if repository_clone_url.find( '@' ) > 0:
- # We have a URL that includes an authenticated user, something like:
- # http://test@bx.psu.edu:9009/repos/some_username/column
- items = repository_clone_url.split( '@' )
- tmp_url = items[ 1 ]
- elif repository_clone_url.find( '//' ) > 0:
- # We have a URL that includes only a protocol, something like:
- # http://bx.psu.edu:9009/repos/some_username/column
- items = repository_clone_url.split( '//' )
- tmp_url = items[ 1 ]
- else:
- tmp_url = repository_clone_url
- return tmp_url
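`__clean_repository_clone_url` reduces a clone URL to host/path form by branching on whether the URL carries an authenticated user (`@`) or just a protocol (`//`). As a standalone function with the same branching (extracted here for illustration, not Galaxy's actual module layout):

```python
def clean_repository_clone_url(repository_clone_url):
    """Strip the protocol, and any authenticated user, from a clone URL,
    following the same three branches as __clean_repository_clone_url."""
    if repository_clone_url.find('@') > 0:
        # e.g. http://test@bx.psu.edu:9009/repos/some_username/column
        return repository_clone_url.split('@')[1]
    elif repository_clone_url.find('//') > 0:
        # e.g. http://bx.psu.edu:9009/repos/some_username/column
        return repository_clone_url.split('//')[1]
    # Already in host/path form.
    return repository_clone_url
```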
- def __get_repository_owner( self, cleaned_repository_url ):
- items = cleaned_repository_url.split( 'repos' )
- repo_path = items[ 1 ]
- return repo_path.lstrip( '/' ).split( '/' )[ 0 ]
- def __generate_tool_path( self, repository_clone_url, changeset_revision ):
- """
- Generate a tool path that guarantees repositories with the same name will always be installed
- in different directories. The tool path will be of the form:
- <tool shed url>/repos/<repository owner>/<repository name>/<changeset revision>
- http://test@bx.psu.edu:9009/repos/test/filter
- """
- tmp_url = self.__clean_repository_clone_url( repository_clone_url )
- # Now tmp_url is something like: bx.psu.edu:9009/repos/some_username/column
- items = tmp_url.split( 'repos' )
- tool_shed_url = items[ 0 ]
- repo_path = items[ 1 ]
- tool_shed_url = self.__clean_tool_shed_url( tool_shed_url )
- return '%s/repos%s/%s' % ( tool_shed_url, repo_path, changeset_revision )
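The tool-path scheme above guarantees uniqueness by folding the shed host, owner, name, and changeset revision into the directory path. A self-contained sketch that inlines the clone-URL and shed-URL cleaning steps (the inlined cleaning is a condensed paraphrase of the helpers above, not their exact code):

```python
def generate_tool_path(repository_clone_url, changeset_revision):
    """Build <tool shed host>/repos/<owner>/<name>/<changeset revision>, as
    __generate_tool_path does, so same-named repositories from different
    sheds or owners never collide on disk."""
    # Strip any authenticated user and protocol from the clone URL.
    tmp_url = repository_clone_url.split('@')[-1].split('//')[-1]
    # Now tmp_url is something like: bx.psu.edu:9009/repos/some_username/column
    tool_shed_url, repo_path = tmp_url.split('repos')
    # Drop the port, if any, since it would produce an invalid directory name.
    if ':' in tool_shed_url:
        tool_shed_url = tool_shed_url.split(':')[0]
    return '%s/repos%s/%s' % (tool_shed_url.rstrip('/'), repo_path, changeset_revision)
```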
- def __generate_tool_guid( self, repository_clone_url, tool ):
- """
- Generate a guid for the installed tool. It is critical that this guid matches the guid for
- the tool in the Galaxy tool shed from which it is being installed. The form of the guid is
- <tool shed host>/repos/<repository owner>/<repository name>/<tool id>/<tool version>
- """
- tmp_url = self.__clean_repository_clone_url( repository_clone_url )
- return '%s/%s/%s' % ( tmp_url, tool.id, tool.version )
- def __generate_tool_panel_section( self, repository_name, repository_clone_url, changeset_revision, tool_section, repository_tools_tups ):
- """
- Write an in-memory tool panel section so we can load it into the tool panel and then
- append it to the appropriate shed tool config.
- TODO: re-write using ElementTree.
- """
- tmp_url = self.__clean_repository_clone_url( repository_clone_url )
- section_str = ''
- section_str += ' <section name="%s" id="%s">\n' % ( tool_section.name, tool_section.id )
- for repository_tool_tup in repository_tools_tups:
- tool_file_path, tool = repository_tool_tup
- guid = self.__generate_tool_guid( repository_clone_url, tool )
- section_str += ' <tool file="%s" guid="%s">\n' % ( tool_file_path, guid )
- section_str += ' <tool_shed>%s</tool_shed>\n' % tmp_url.split( 'repos' )[ 0 ].rstrip( '/' )
- section_str += ' <repository_name>%s</repository_name>\n' % repository_name
- section_str += ' <repository_owner>%s</repository_owner>\n' % self.__get_repository_owner( tmp_url )
- section_str += ' <changeset_revision>%s</changeset_revision>\n' % changeset_revision
- section_str += ' <id>%s</id>\n' % tool.id
- section_str += ' <version>%s</version>\n' % tool.version
- section_str += ' </tool>\n'
- section_str += ' </section>\n'
- return section_str
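The docstring above carries a TODO to re-write the string concatenation using ElementTree. A sketch of what that could look like; the function name, the `tool_entries` dict shape, and the sample values in the test are our illustration, not the committed code:

```python
from xml.etree import ElementTree as ET

def generate_tool_panel_section_xml(section_name, section_id, tool_entries):
    """Build the <section> element with ElementTree instead of string
    concatenation. `tool_entries` is a list of dicts holding the per-tool
    values __generate_tool_panel_section derives (file, guid, tool_shed,
    repository_name, repository_owner, changeset_revision, id, version)."""
    section = ET.Element('section', name=section_name, id=section_id)
    for entry in tool_entries:
        tool = ET.SubElement(section, 'tool',
                             file=entry['file'], guid=entry['guid'])
        for tag in ('tool_shed', 'repository_name', 'repository_owner',
                    'changeset_revision', 'id', 'version'):
            ET.SubElement(tool, tag).text = entry[tag]
    return ET.tostring(section)
```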
-
-## ---- Utility methods -------------------------------------------------------
-
-def build_shed_tool_conf_select_field( trans ):
- """Build a SelectField whose options are the keys in trans.app.toolbox.shed_tool_confs."""
- options = []
- for shed_tool_conf_filename, tool_path in trans.app.toolbox.shed_tool_confs.items():
- options.append( ( shed_tool_conf_filename.lstrip( './' ), shed_tool_conf_filename ) )
- select_field = SelectField( name='shed_tool_conf' )
- for option_tup in options:
- select_field.add_option( option_tup[0], option_tup[1] )
- return select_field
-def build_tool_panel_section_select_field( trans ):
- """Build a SelectField whose options are the sections of the current in-memory toolbox."""
- options = []
- for k, tool_section in trans.app.toolbox.tool_panel.items():
- options.append( ( tool_section.name, tool_section.id ) )
- select_field = SelectField( name='tool_panel_section', display='radio' )
- for option_tup in options:
- select_field.add_option( option_tup[0], option_tup[1] )
- return select_field
-def get_repository( trans, id ):
- """Get a tool_shed_repository from the database via id"""
- return trans.sa_session.query( trans.model.ToolShedRepository ).get( trans.security.decode_id( id ) )
-def get_repository_by_name_owner_changeset_revision( trans, name, owner, changeset_revision ):
- """Get a repository from the database via name owner and changeset_revision"""
- return trans.sa_session.query( trans.model.ToolShedRepository ) \
- .filter( and_( trans.model.ToolShedRepository.table.c.name == name,
- trans.model.ToolShedRepository.table.c.owner == owner,
- trans.model.ToolShedRepository.table.c.changeset_revision == changeset_revision ) ) \
- .first()
-def get_repository_by_shed_name_owner_changeset_revision( trans, tool_shed, name, owner, changeset_revision ):
- if tool_shed.find( '//' ) > 0:
- tool_shed = tool_shed.split( '//' )[1]
- return trans.sa_session.query( trans.model.ToolShedRepository ) \
- .filter( and_( trans.model.ToolShedRepository.table.c.tool_shed == tool_shed,
- trans.model.ToolShedRepository.table.c.name == name,
- trans.model.ToolShedRepository.table.c.owner == owner,
- trans.model.ToolShedRepository.table.c.changeset_revision == changeset_revision ) ) \
- .first()
-def get_url_from_repository_tool_shed( trans, repository ):
- # The stored value of repository.tool_shed is something like:
- # toolshed.g2.bx.psu.edu
- # We need the URL to this tool shed, which is something like:
- # http://toolshed.g2.bx.psu.edu/
- for shed_name, shed_url in trans.app.tool_shed_registry.tool_sheds.items():
- if shed_url.find( repository.tool_shed ) >= 0:
- if shed_url.endswith( '/' ):
- shed_url = shed_url.rstrip( '/' )
- return shed_url
- # The tool shed from which the repository was originally
- # installed must no longer be configured in tool_sheds_conf.xml.
- return None
-
\ No newline at end of file
+
\ No newline at end of file
diff -r 87dc2336c1892878b26a110bb1857722fd3f4903 -r 5340450cdab2a7887d8d9e9186ae438d7b2aee93 lib/galaxy/web/controllers/admin_toolshed.py
--- /dev/null
+++ b/lib/galaxy/web/controllers/admin_toolshed.py
@@ -0,0 +1,889 @@
+from galaxy.web.controllers.admin import *
+import logging
+
+log = logging.getLogger( __name__ )
+
+class RepositoryListGrid( grids.Grid ):
+ class NameColumn( grids.TextColumn ):
+ def get_value( self, trans, grid, tool_shed_repository ):
+ if tool_shed_repository.update_available:
+ return '<div class="count-box state-color-error">%s</div>' % tool_shed_repository.name
+ return tool_shed_repository.name
+ class DescriptionColumn( grids.TextColumn ):
+ def get_value( self, trans, grid, tool_shed_repository ):
+ return tool_shed_repository.description
+ class OwnerColumn( grids.TextColumn ):
+ def get_value( self, trans, grid, tool_shed_repository ):
+ return tool_shed_repository.owner
+ class RevisionColumn( grids.TextColumn ):
+ def get_value( self, trans, grid, tool_shed_repository ):
+ return tool_shed_repository.changeset_revision
+ class ToolShedColumn( grids.TextColumn ):
+ def get_value( self, trans, grid, tool_shed_repository ):
+ return tool_shed_repository.tool_shed
+ # Grid definition
+ title = "Installed tool shed repositories"
+ model_class = model.ToolShedRepository
+ template='/admin/tool_shed_repository/grid.mako'
+ default_sort_key = "name"
+ columns = [
+ NameColumn( "Name",
+ key="name",
+ link=( lambda item: dict( operation="manage_repository", id=item.id, webapp="galaxy" ) ),
+ attach_popup=True ),
+ DescriptionColumn( "Description" ),
+ OwnerColumn( "Owner" ),
+ RevisionColumn( "Revision" ),
+ ToolShedColumn( "Tool shed" ),
+ # Columns that are valid for filtering but are not visible.
+ grids.DeletedColumn( "Deleted",
+ key="deleted",
+ visible=False,
+ filterable="advanced" )
+ ]
+ columns.append( grids.MulticolFilterColumn( "Search repository name",
+ cols_to_filter=[ columns[0] ],
+ key="free-text-search",
+ visible=False,
+ filterable="standard" ) )
+ operations = [ grids.GridOperation( "Get updates",
+ allow_multiple=False,
+ condition=( lambda item: not item.deleted ),
+ async_compatible=False ) ]
+ standard_filters = []
+ default_filter = dict( deleted="False" )
+ num_rows_per_page = 50
+ preserve_state = False
+ use_paging = True
+ def build_initial_query( self, trans, **kwd ):
+ return trans.sa_session.query( self.model_class ) \
+ .filter( self.model_class.table.c.deleted == False )
+
+class AdminToolshed( AdminGalaxy ):
+
+ repository_list_grid = RepositoryListGrid()
+
+ @web.expose
+ @web.require_admin
+ def browse_repository( self, trans, **kwd ):
+ params = util.Params( kwd )
+ message = util.restore_text( params.get( 'message', '' ) )
+ status = params.get( 'status', 'done' )
+ repository = get_repository( trans, kwd[ 'id' ] )
+ return trans.fill_template( '/admin/tool_shed_repository/browse_repository.mako',
+ repository=repository,
+ message=message,
+ status=status )
+ @web.expose
+ @web.require_admin
+ def browse_repositories( self, trans, **kwd ):
+ if 'operation' in kwd:
+ operation = kwd.pop( 'operation' ).lower()
+ if operation == "manage_repository":
+ return self.manage_repository( trans, **kwd )
+ if operation == "get updates":
+ return self.check_for_updates( trans, **kwd )
+ kwd[ 'message' ] = 'Names of repositories for which updates are available are highlighted in red.'
+ return self.repository_list_grid( trans, **kwd )
+ @web.expose
+ @web.require_admin
+ def browse_tool_sheds( self, trans, **kwd ):
+ params = util.Params( kwd )
+ message = util.restore_text( params.get( 'message', '' ) )
+ status = params.get( 'status', 'done' )
+ return trans.fill_template( '/webapps/galaxy/admin/tool_sheds.mako',
+ webapp='galaxy',
+ message=message,
+ status=status )
+ @web.expose
+ @web.require_admin
+ def find_tools_in_tool_shed( self, trans, **kwd ):
+ tool_shed_url = kwd[ 'tool_shed_url' ]
+ galaxy_url = url_for( '', qualified=True )
+ url = '%s/repository/find_tools?galaxy_url=%s&webapp=galaxy' % ( tool_shed_url, galaxy_url )
+ return trans.response.send_redirect( url )
+ @web.expose
+ @web.require_admin
+ def find_workflows_in_tool_shed( self, trans, **kwd ):
+ tool_shed_url = kwd[ 'tool_shed_url' ]
+ galaxy_url = url_for( '', qualified=True )
+ url = '%s/repository/find_workflows?galaxy_url=%s&webapp=galaxy' % ( tool_shed_url, galaxy_url )
+ return trans.response.send_redirect( url )
+ @web.expose
+ @web.require_admin
+ def browse_tool_shed( self, trans, **kwd ):
+ tool_shed_url = kwd[ 'tool_shed_url' ]
+ galaxy_url = url_for( '', qualified=True )
+ url = '%s/repository/browse_valid_repositories?galaxy_url=%s&webapp=galaxy' % ( tool_shed_url, galaxy_url )
+ return trans.response.send_redirect( url )
+ @web.expose
+ @web.require_admin
+ def install_repository( self, trans, **kwd ):
+ if not trans.app.toolbox.shed_tool_confs:
+ message = 'The <b>tool_config_file</b> setting in <b>universe_wsgi.ini</b> must include at least one shed tool configuration file name with a '
+ message += '<b><toolbox></b> tag that includes a <b>tool_path</b> attribute value which is a directory relative to the Galaxy installation '
+ message += 'directory in order to automatically install tools from a Galaxy tool shed (e.g., the file name <b>shed_tool_conf.xml</b> whose '
+ message += '<b><toolbox></b> tag is <b><toolbox tool_path="../shed_tools"></b>).<p/>See the '
+ message += '<a href="http://wiki.g2.bx.psu.edu/Tool%20Shed#Automatic_installation_of_Galaxy_tool…" '
+ message += 'target="_blank">Automatic installation of Galaxy tool shed repository tools into a local Galaxy instance</a> section of the '
+ message += '<a href="http://wiki.g2.bx.psu.edu/Tool%20Shed" target="_blank">Galaxy tool shed wiki</a> for all of the details.'
+ return trans.show_error_message( message )
+ message = kwd.get( 'message', '' )
+ status = kwd.get( 'status', 'done' )
+ tool_shed_url = kwd[ 'tool_shed_url' ]
+ repo_info_dict = kwd[ 'repo_info_dict' ]
+ new_tool_panel_section = kwd.get( 'new_tool_panel_section', '' )
+ tool_panel_section = kwd.get( 'tool_panel_section', '' )
+ if kwd.get( 'select_tool_panel_section_button', False ):
+ shed_tool_conf = kwd[ 'shed_tool_conf' ]
+ # Get the tool path.
+ for k, tool_path in trans.app.toolbox.shed_tool_confs.items():
+ if k == shed_tool_conf:
+ break
+ if new_tool_panel_section or tool_panel_section:
+ if new_tool_panel_section:
+ section_id = new_tool_panel_section.lower().replace( ' ', '_' )
+ new_section_key = 'section_%s' % str( section_id )
+ if new_section_key in trans.app.toolbox.tool_panel:
+ # Appending a tool to an existing section in trans.app.toolbox.tool_panel
+ log.debug( "Appending to tool panel section: %s" % new_tool_panel_section )
+ tool_section = trans.app.toolbox.tool_panel[ new_section_key ]
+ else:
+ # Appending a new section to trans.app.toolbox.tool_panel
+ log.debug( "Loading new tool panel section: %s" % new_tool_panel_section )
+ elem = Element( 'section' )
+ elem.attrib[ 'name' ] = new_tool_panel_section
+ elem.attrib[ 'id' ] = section_id
+ tool_section = ToolSection( elem )
+ trans.app.toolbox.tool_panel[ new_section_key ] = tool_section
+ else:
+ section_key = 'section_%s' % tool_panel_section
+ tool_section = trans.app.toolbox.tool_panel[ section_key ]
+ # Decode the encoded repo_info_dict param value.
+ repo_info_dict = tool_shed_decode( repo_info_dict )
+ # Clone the repository to the configured location.
+ current_working_dir = os.getcwd()
+ installed_repository_names = []
+ for name, repo_info_tuple in repo_info_dict.items():
+ metadata_dict = None
+ description, repository_clone_url, changeset_revision = repo_info_tuple
+ clone_dir = os.path.join( tool_path, self.__generate_tool_path( repository_clone_url, changeset_revision ) )
+ if os.path.exists( clone_dir ):
+ # The repository at this revision has already been cloned.
+ # TODO: implement the ability to re-install or revert an existing repository.
+ message += 'Revision <b>%s</b> of repository <b>%s</b> was previously installed.<br/>' % ( changeset_revision, name )
+ else:
+ os.makedirs( clone_dir )
+ log.debug( 'Cloning %s...' % repository_clone_url )
+ cmd = 'hg clone %s' % repository_clone_url
+ tmp_name = tempfile.NamedTemporaryFile().name
+ tmp_stderr = open( tmp_name, 'wb' )
+ os.chdir( clone_dir )
+ proc = subprocess.Popen( args=cmd, shell=True, stderr=tmp_stderr.fileno() )
+ returncode = proc.wait()
+ os.chdir( current_working_dir )
+ tmp_stderr.close()
+ if returncode == 0:
+ # Update the cloned repository to changeset_revision. It is imperative that the
+ # installed repository is updated to the desired changeset_revision before metadata
+ # is set because the process for setting metadata uses the repository files on disk.
+ relative_install_dir = os.path.join( clone_dir, name )
+ log.debug( 'Updating cloned repository to revision "%s"...' % changeset_revision )
+ cmd = 'hg update -r %s' % changeset_revision
+ tmp_name = tempfile.NamedTemporaryFile().name
+ tmp_stderr = open( tmp_name, 'wb' )
+ os.chdir( relative_install_dir )
+ proc = subprocess.Popen( cmd, shell=True, stderr=tmp_stderr.fileno() )
+ returncode = proc.wait()
+ os.chdir( current_working_dir )
+ tmp_stderr.close()
+ if returncode == 0:
+ # Generate the metadata for the installed tool shed repository. It is imperative that
+ # the installed repository is updated to the desired changeset_revision before metadata
+ # is set because the process for setting metadata uses the repository files on disk.
+ metadata_dict = self.__generate_metadata( trans, relative_install_dir, repository_clone_url )
+ if 'datatypes_config' in metadata_dict:
+ datatypes_config = os.path.abspath( metadata_dict[ 'datatypes_config' ] )
+ # Load data types required by tools.
+ self.__load_datatypes( trans, datatypes_config, relative_install_dir )
+ if 'tools' in metadata_dict:
+ repository_tools_tups = []
+ for tool_dict in metadata_dict[ 'tools' ]:
+ relative_path = tool_dict[ 'tool_config' ]
+ tool = trans.app.toolbox.load_tool( os.path.abspath( relative_path ) )
+ repository_tools_tups.append( ( relative_path, tool ) )
+ if repository_tools_tups:
+ sample_files = metadata_dict.get( 'sample_files', [] )
+ # Handle missing data table entries for tool parameters that are dynamically generated select lists.
+ repository_tools_tups = self.__handle_missing_data_table_entry( trans, tool_path, sample_files, repository_tools_tups )
+ # Handle missing index files for tool parameters that are dynamically generated select lists.
+ repository_tools_tups = self.__handle_missing_index_file( trans, tool_path, sample_files, repository_tools_tups )
+ # Handle tools that use fabric scripts to install dependencies.
+ self.__handle_tool_dependencies( current_working_dir, relative_install_dir, repository_tools_tups )
+ # Generate an in-memory tool conf section that includes the new tools.
+ new_tool_section = self.__generate_tool_panel_section( name,
+ repository_clone_url,
+ changeset_revision,
+ tool_section,
+ repository_tools_tups )
+ # Create a temporary file to persist the in-memory tool section
+ # TODO: Figure out how to do this in-memory using xml.etree.
+ tmp_name = tempfile.NamedTemporaryFile().name
+ persisted_new_tool_section = open( tmp_name, 'wb' )
+ persisted_new_tool_section.write( new_tool_section )
+ persisted_new_tool_section.close()
+ # Parse the persisted tool panel section
+ tree = parse_xml( tmp_name )
+ root = tree.getroot()
+ # Load the tools in the section into the tool panel.
+ trans.app.toolbox.load_section_tag_set( root, trans.app.toolbox.tool_panel, tool_path )
+ # Remove the temporary file
+ try:
+ os.unlink( tmp_name )
+ except:
+ pass
+ # Append the new section to the shed_tool_config file.
+ self.__add_shed_tool_conf_entry( trans, shed_tool_conf, new_tool_section )
+ if trans.app.toolbox_search.enabled:
+ # If search support for tools is enabled, index the new installed tools.
+ trans.app.toolbox_search = ToolBoxSearch( trans.app.toolbox )
+ # Add a new record to the tool_shed_repository table if one doesn't
+ # already exist. If one exists but is marked deleted, undelete it.
+ self.__create_or_undelete_tool_shed_repository( trans,
+ name,
+ description,
+ changeset_revision,
+ repository_clone_url,
+ metadata_dict )
+ installed_repository_names.append( name )
+ else:
+ tmp_stderr = open( tmp_name, 'rb' )
+ message += '%s<br/>' % tmp_stderr.read()
+ tmp_stderr.close()
+ status = 'error'
+ else:
+ tmp_stderr = open( tmp_name, 'rb' )
+ message += '%s<br/>' % tmp_stderr.read()
+ tmp_stderr.close()
+ status = 'error'
+ if installed_repository_names:
+ installed_repository_names.sort()
+ num_repositories_installed = len( installed_repository_names )
+ message += 'Installed %d %s and all tools were loaded into tool panel section <b>%s</b>:<br/>Installed repositories: ' % \
+ ( num_repositories_installed, inflector.cond_plural( num_repositories_installed, 'repository' ), tool_section.name )
+ for i, repo_name in enumerate( installed_repository_names ):
+ if i == len( installed_repository_names ) -1:
+ message += '%s.<br/>' % repo_name
+ else:
+ message += '%s, ' % repo_name
+ return trans.response.send_redirect( web.url_for( controller='admin_toolshed',
+ action='browse_repositories',
+ message=message,
+ status=status ) )
+ else:
+ message = 'Choose the section in your tool panel to contain the installed tools.'
+ status = 'error'
+ if len( trans.app.toolbox.shed_tool_confs.keys() ) > 1:
+ shed_tool_conf_select_field = build_shed_tool_conf_select_field( trans )
+ shed_tool_conf = None
+ else:
+ shed_tool_conf = trans.app.toolbox.shed_tool_confs.keys()[0].lstrip( './' )
+ shed_tool_conf_select_field = None
+ tool_panel_section_select_field = build_tool_panel_section_select_field( trans )
+ return trans.fill_template( '/admin/tool_shed_repository/select_tool_panel_section.mako',
+ tool_shed_url=tool_shed_url,
+ repo_info_dict=repo_info_dict,
+ shed_tool_conf=shed_tool_conf,
+ shed_tool_conf_select_field=shed_tool_conf_select_field,
+ tool_panel_section_select_field=tool_panel_section_select_field,
+ new_tool_panel_section=new_tool_panel_section,
+ message=message,
+ status=status )
+ @web.expose
+ @web.require_admin
+ def manage_repository( self, trans, **kwd ):
+ params = util.Params( kwd )
+ message = util.restore_text( params.get( 'message', '' ) )
+ status = params.get( 'status', 'done' )
+ repository = get_repository( trans, kwd[ 'id' ] )
+ description = util.restore_text( params.get( 'description', repository.description ) )
+ relative_install_dir = self.__get_relative_install_dir( trans, repository )
+ repo_files_dir = os.path.abspath( os.path.join( relative_install_dir, repository.name ) )
+ if params.get( 'edit_repository_button', False ):
+ if description != repository.description:
+ repository.description = description
+ trans.sa_session.add( repository )
+ trans.sa_session.flush()
+ message = "The repository information has been updated."
+ elif params.get( 'set_metadata_button', False ):
+ repository_clone_url = self.__generate_clone_url( trans, repository )
+ metadata_dict = self.__generate_metadata( trans, relative_install_dir, repository_clone_url )
+ if metadata_dict:
+ repository.metadata = metadata_dict
+ trans.sa_session.add( repository )
+ trans.sa_session.flush()
+ message = "Repository metadata has been reset."
+ return trans.fill_template( '/admin/tool_shed_repository/manage_repository.mako',
+ repository=repository,
+ description=description,
+ repo_files_dir=repo_files_dir,
+ message=message,
+ status=status )
+ @web.expose
+ @web.require_admin
+ def check_for_updates( self, trans, **kwd ):
+ # Send a request to the relevant tool shed to see if there are any updates.
+ repository = get_repository( trans, kwd[ 'id' ] )
+ tool_shed_url = get_url_from_repository_tool_shed( trans, repository )
+ url = '%s/repository/check_for_updates?galaxy_url=%s&name=%s&owner=%s&changeset_revision=%s&webapp=galaxy' % \
+ ( tool_shed_url, url_for( '', qualified=True ), repository.name, repository.owner, repository.changeset_revision )
+ return trans.response.send_redirect( url )
+ @web.expose
+ @web.require_admin
+ def update_to_changeset_revision( self, trans, **kwd ):
+ """Update a cloned repository to the latest revision possible."""
+ params = util.Params( kwd )
+ message = util.restore_text( params.get( 'message', '' ) )
+ status = params.get( 'status', 'done' )
+ tool_shed_url = kwd[ 'tool_shed_url' ]
+ name = params.get( 'name', None )
+ owner = params.get( 'owner', None )
+ changeset_revision = params.get( 'changeset_revision', None )
+ latest_changeset_revision = params.get( 'latest_changeset_revision', None )
+ repository = get_repository_by_shed_name_owner_changeset_revision( trans, tool_shed_url, name, owner, changeset_revision )
+ if changeset_revision and latest_changeset_revision:
+ if changeset_revision == latest_changeset_revision:
+ message = "The cloned tool shed repository named '%s' is current (there are no updates available)." % name
+ else:
+ current_working_dir = os.getcwd()
+ relative_install_dir = self.__get_relative_install_dir( trans, repository )
+ if relative_install_dir:
+ # Update the cloned repository to changeset_revision.
+ repo_files_dir = os.path.join( relative_install_dir, name )
+ log.debug( "Updating cloned repository named '%s' from revision '%s' to revision '%s'..." % \
+ ( name, changeset_revision, latest_changeset_revision ) )
+ cmd = 'hg pull'
+ tmp_name = tempfile.NamedTemporaryFile().name
+ tmp_stderr = open( tmp_name, 'wb' )
+ os.chdir( repo_files_dir )
+ proc = subprocess.Popen( cmd, shell=True, stderr=tmp_stderr.fileno() )
+ returncode = proc.wait()
+ os.chdir( current_working_dir )
+ tmp_stderr.close()
+ if returncode == 0:
+ cmd = 'hg update -r %s' % latest_changeset_revision
+ tmp_name = tempfile.NamedTemporaryFile().name
+ tmp_stderr = open( tmp_name, 'wb' )
+ os.chdir( repo_files_dir )
+ proc = subprocess.Popen( cmd, shell=True, stderr=tmp_stderr.fileno() )
+ returncode = proc.wait()
+ os.chdir( current_working_dir )
+ tmp_stderr.close()
+ if returncode == 0:
+ # Update the repository changeset_revision in the database.
+ repository.changeset_revision = latest_changeset_revision
+ trans.sa_session.add( repository )
+ trans.sa_session.flush()
+                            message = "The cloned repository named '%s' has been updated to changeset revision '%s'." % \
+ ( name, latest_changeset_revision )
+ else:
+ tmp_stderr = open( tmp_name, 'rb' )
+ message = tmp_stderr.read()
+ tmp_stderr.close()
+ status = 'error'
+ else:
+ tmp_stderr = open( tmp_name, 'rb' )
+ message = tmp_stderr.read()
+ tmp_stderr.close()
+ status = 'error'
+ else:
+ message = "The directory containing the cloned repository named '%s' cannot be found." % name
+ status = 'error'
+ else:
+ message = "The latest changeset revision could not be retrieved for the repository named '%s'." % name
+ status = 'error'
+ return trans.response.send_redirect( web.url_for( controller='admin_toolshed',
+ action='manage_repository',
+ id=trans.security.encode_id( repository.id ),
+ message=message,
+ status=status ) )
+ @web.expose
+ @web.require_admin
+ def view_tool_metadata( self, trans, repository_id, tool_id, **kwd ):
+ params = util.Params( kwd )
+ message = util.restore_text( params.get( 'message', '' ) )
+ status = params.get( 'status', 'done' )
+ webapp = params.get( 'webapp', 'community' )
+ repository = get_repository( trans, repository_id )
+ metadata = {}
+ tool = None
+ if 'tools' in repository.metadata:
+ for tool_metadata_dict in repository.metadata[ 'tools' ]:
+ if tool_metadata_dict[ 'id' ] == tool_id:
+ metadata = tool_metadata_dict
+ tool = trans.app.toolbox.load_tool( os.path.abspath( metadata[ 'tool_config' ] ) )
+ break
+ return trans.fill_template( "/admin/tool_shed_repository/view_tool_metadata.mako",
+ repository=repository,
+ tool=tool,
+ metadata=metadata,
+ message=message,
+ status=status )
+ def __generate_metadata( self, trans, relative_install_dir, repository_clone_url ):
+ """
+ Browse the repository files on disk to generate metadata. Since we are using disk files, it
+ is imperative that the repository is updated to the desired change set revision before metadata
+ is generated.
+ """
+ metadata_dict = {}
+ sample_files = []
+ datatypes_config = None
+ # Find datatypes_conf.xml if it exists.
+ for root, dirs, files in os.walk( relative_install_dir ):
+ if root.find( '.hg' ) < 0:
+ for name in files:
+ if name == 'datatypes_conf.xml':
+ relative_path = os.path.join( root, name )
+ datatypes_config = os.path.abspath( relative_path )
+ break
+ if datatypes_config:
+ metadata_dict[ 'datatypes_config' ] = relative_path
+ metadata_dict = self.__generate_datatypes_metadata( trans, datatypes_config, metadata_dict )
+ # Find all special .sample files.
+ for root, dirs, files in os.walk( relative_install_dir ):
+ if root.find( '.hg' ) < 0:
+ for name in files:
+ if name.endswith( '.sample' ):
+ sample_files.append( os.path.join( root, name ) )
+ if sample_files:
+ metadata_dict[ 'sample_files' ] = sample_files
+ # Find all tool configs and exported workflows.
+ for root, dirs, files in os.walk( relative_install_dir ):
+ if root.find( '.hg' ) < 0 and root.find( 'hgrc' ) < 0:
+ if '.hg' in dirs:
+ dirs.remove( '.hg' )
+ for name in files:
+ # Find all tool configs.
+ if name != 'datatypes_conf.xml' and name.endswith( '.xml' ):
+ full_path = os.path.abspath( os.path.join( root, name ) )
+ try:
+ tool = trans.app.toolbox.load_tool( full_path )
+ except Exception, e:
+ tool = None
+ if tool is not None:
+ tool_config = os.path.join( root, name )
+ metadata_dict = self.__generate_tool_metadata( trans, tool_config, tool, repository_clone_url, metadata_dict )
+ # Find all exported workflows
+ elif name.endswith( '.ga' ):
+ relative_path = os.path.join( root, name )
+ fp = open( relative_path, 'rb' )
+ workflow_text = fp.read()
+ fp.close()
+ exported_workflow_dict = from_json_string( workflow_text )
+ if 'a_galaxy_workflow' in exported_workflow_dict and exported_workflow_dict[ 'a_galaxy_workflow' ] == 'true':
+ metadata_dict = self.__generate_workflow_metadata( trans, relative_path, exported_workflow_dict, metadata_dict )
+ return metadata_dict
+ def __generate_datatypes_metadata( self, trans, datatypes_config, metadata_dict ):
+ """
+ Update the received metadata_dict with changes that have been applied
+ to the received datatypes_config.
+ """
+ # Parse datatypes_config.
+ tree = ElementTree.parse( datatypes_config )
+ root = tree.getroot()
+ ElementInclude.include( root )
+ repository_datatype_code_files = []
+ datatype_files = root.find( 'datatype_files' )
+ if datatype_files:
+ for elem in datatype_files.findall( 'datatype_file' ):
+ name = elem.get( 'name', None )
+ repository_datatype_code_files.append( name )
+ metadata_dict[ 'datatype_files' ] = repository_datatype_code_files
+ datatypes = []
+ registration = root.find( 'registration' )
+ if registration:
+ for elem in registration.findall( 'datatype' ):
+ extension = elem.get( 'extension', None )
+ dtype = elem.get( 'type', None )
+ mimetype = elem.get( 'mimetype', None )
+ datatypes.append( dict( extension=extension,
+ dtype=dtype,
+ mimetype=mimetype ) )
+ metadata_dict[ 'datatypes' ] = datatypes
+ return metadata_dict
+ def __generate_tool_metadata( self, trans, tool_config, tool, repository_clone_url, metadata_dict ):
+ """
+ Update the received metadata_dict with changes that have been
+ applied to the received tool.
+ """
+ # Generate the guid
+ guid = self.__generate_tool_guid( repository_clone_url, tool )
+ # Handle tool.requirements.
+ tool_requirements = []
+ for tr in tool.requirements:
+            name = tr.name
+            type = tr.type
+ if type == 'fabfile':
+ version = None
+ fabfile = tr.fabfile
+ method = tr.method
+ else:
+ version = tr.version
+ fabfile = None
+ method = None
+ requirement_dict = dict( name=name,
+ type=type,
+ version=version,
+ fabfile=fabfile,
+ method=method )
+ tool_requirements.append( requirement_dict )
+ # Handle tool.tests.
+ tool_tests = []
+ if tool.tests:
+ for ttb in tool.tests:
+ test_dict = dict( name=ttb.name,
+ required_files=ttb.required_files,
+ inputs=ttb.inputs,
+ outputs=ttb.outputs )
+ tool_tests.append( test_dict )
+ tool_dict = dict( id=tool.id,
+ guid=guid,
+ name=tool.name,
+ version=tool.version,
+ description=tool.description,
+                          version_string_cmd=tool.version_string_cmd,
+ tool_config=tool_config,
+ requirements=tool_requirements,
+ tests=tool_tests )
+ if 'tools' in metadata_dict:
+ metadata_dict[ 'tools' ].append( tool_dict )
+ else:
+ metadata_dict[ 'tools' ] = [ tool_dict ]
+ return metadata_dict
+ def __generate_workflow_metadata( self, trans, relative_path, exported_workflow_dict, metadata_dict ):
+ """
+ Update the received metadata_dict with changes that have been applied
+ to the received exported_workflow_dict. Store everything in the database.
+ """
+ if 'workflows' in metadata_dict:
+ metadata_dict[ 'workflows' ].append( ( relative_path, exported_workflow_dict ) )
+ else:
+ metadata_dict[ 'workflows' ] = [ ( relative_path, exported_workflow_dict ) ]
+ return metadata_dict
+ def __get_relative_install_dir( self, trans, repository ):
+        # Get the directory where the repository is installed.
+ tool_shed = self.__clean_tool_shed_url( repository.tool_shed )
+ partial_install_dir = '%s/repos/%s/%s/%s' % ( tool_shed, repository.owner, repository.name, repository.changeset_revision )
+ # Get the relative tool installation paths from each of the shed tool configs.
+ shed_tool_confs = trans.app.toolbox.shed_tool_confs
+ relative_install_dir = None
+ # The shed_tool_confs dictionary contains { shed_conf_filename : tool_path } pairs.
+ for shed_conf_filename, tool_path in shed_tool_confs.items():
+ relative_install_dir = os.path.join( tool_path, partial_install_dir )
+ if os.path.isdir( relative_install_dir ):
+ break
+ return relative_install_dir
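
The directory probed by `__get_relative_install_dir` is built from the same components used throughout this changeset. A small sketch of that layout (the function name is invented for illustration; the controller inlines this as a `%` format string):

```python
import os

def partial_install_dir( tool_shed, owner, name, changeset_revision ):
    """<tool shed>/repos/<owner>/<name>/<changeset revision> -- joined to
    each shed tool config's tool_path to locate the installed clone."""
    return os.path.join( tool_shed, 'repos', owner, name, changeset_revision )
```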
+ def __handle_missing_data_table_entry( self, trans, tool_path, sample_files, repository_tools_tups ):
+ # Inspect each tool to see if any have input parameters that are dynamically
+ # generated select lists that require entries in the tool_data_table_conf.xml file.
+ missing_data_table_entry = False
+ for index, repository_tools_tup in enumerate( repository_tools_tups ):
+ tup_path, repository_tool = repository_tools_tup
+ if repository_tool.params_with_missing_data_table_entry:
+ missing_data_table_entry = True
+ break
+ if missing_data_table_entry:
+ # The repository must contain a tool_data_table_conf.xml.sample file that includes
+ # all required entries for all tools in the repository.
+ for sample_file in sample_files:
+ head, tail = os.path.split( sample_file )
+ if tail == 'tool_data_table_conf.xml.sample':
+ break
+ error, correction_msg = handle_sample_tool_data_table_conf_file( trans, sample_file )
+ if error:
+                # TODO: Do more here than logging the problem.
+                log.debug( correction_msg )
+ # Reload the tool into the local list of repository_tools_tups.
+ repository_tool = trans.app.toolbox.load_tool( os.path.join( tool_path, tup_path ) )
+ repository_tools_tups[ index ] = ( tup_path, repository_tool )
+ return repository_tools_tups
+ def __handle_missing_index_file( self, trans, tool_path, sample_files, repository_tools_tups ):
+ # Inspect each tool to see if it has any input parameters that
+ # are dynamically generated select lists that depend on a .loc file.
+ missing_files_handled = []
+ for index, repository_tools_tup in enumerate( repository_tools_tups ):
+ tup_path, repository_tool = repository_tools_tup
+ params_with_missing_index_file = repository_tool.params_with_missing_index_file
+ for param in params_with_missing_index_file:
+ options = param.options
+ missing_head, missing_tail = os.path.split( options.missing_index_file )
+ if missing_tail not in missing_files_handled:
+ # The repository must contain the required xxx.loc.sample file.
+ for sample_file in sample_files:
+ sample_head, sample_tail = os.path.split( sample_file )
+ if sample_tail == '%s.sample' % missing_tail:
+ copy_sample_loc_file( trans, sample_file )
+ if options.tool_data_table and options.tool_data_table.missing_index_file:
+ options.tool_data_table.handle_found_index_file( options.missing_index_file )
+ missing_files_handled.append( missing_tail )
+ break
+ # Reload the tool into the local list of repository_tools_tups.
+ repository_tool = trans.app.toolbox.load_tool( os.path.join( tool_path, tup_path ) )
+ repository_tools_tups[ index ] = ( tup_path, repository_tool )
+ return repository_tools_tups
+ def __handle_tool_dependencies( self, current_working_dir, repo_files_dir, repository_tools_tups ):
+ # Inspect each tool to see if it includes a "requirement" that refers to a fabric
+ # script. For those that do, execute the fabric script to install tool dependencies.
+ for index, repository_tools_tup in enumerate( repository_tools_tups ):
+ tup_path, repository_tool = repository_tools_tup
+ for requirement in repository_tool.requirements:
+ if requirement.type == 'fabfile':
+ log.debug( 'Executing fabric script to install dependencies for tool "%s"...' % repository_tool.name )
+ fabfile = requirement.fabfile
+ method = requirement.method
+ # Find the relative path to the fabfile.
+ relative_fabfile_path = None
+ for root, dirs, files in os.walk( repo_files_dir ):
+ for name in files:
+ if name == fabfile:
+ relative_fabfile_path = os.path.join( root, name )
+ break
+ if relative_fabfile_path:
+ # cmd will look something like: fab -f fabfile.py install_bowtie
+ cmd = 'fab -f %s %s' % ( relative_fabfile_path, method )
+ tmp_name = tempfile.NamedTemporaryFile().name
+ tmp_stderr = open( tmp_name, 'wb' )
+ os.chdir( repo_files_dir )
+ proc = subprocess.Popen( cmd, shell=True, stderr=tmp_stderr.fileno() )
+ returncode = proc.wait()
+ os.chdir( current_working_dir )
+ tmp_stderr.close()
+ if returncode != 0:
+ # TODO: do something more here than logging the problem.
+ tmp_stderr = open( tmp_name, 'rb' )
+ error = tmp_stderr.read()
+ tmp_stderr.close()
+ log.debug( 'Problem installing dependencies for tool "%s"\n%s' % ( repository_tool.name, error ) )
+    def __load_datatypes( self, trans, datatypes_config, relative_install_dir ):
+        imported_module = None
+        # Parse datatypes_config.
+        tree = parse_xml( datatypes_config )
+        datatypes_config_root = tree.getroot()
+        relative_path_to_datatype_file_name = None
+        datatype_files = datatypes_config_root.find( 'datatype_files' )
+        # Currently only a single datatype_file is supported.  For example:
+        # <datatype_files>
+        #    <datatype_file name="gmap.py"/>
+        # </datatype_files>
+        for elem in datatype_files.findall( 'datatype_file' ):
+            datatype_file_name = elem.get( 'name', None )
+            if datatype_file_name:
+                # Find the file in the installed repository.
+                for root, dirs, files in os.walk( relative_install_dir ):
+ if root.find( '.hg' ) < 0:
+ for name in files:
+ if name == datatype_file_name:
+ relative_path_to_datatype_file_name = os.path.join( root, name )
+ break
+ break
+ if relative_path_to_datatype_file_name:
+ relative_head, relative_tail = os.path.split( relative_path_to_datatype_file_name )
+ registration = datatypes_config_root.find( 'registration' )
+ # Get the module by parsing the <datatype> tag.
+ for elem in registration.findall( 'datatype' ):
+ # A 'type' attribute is currently required. The attribute
+ # should be something like: type="gmap:GmapDB".
+ dtype = elem.get( 'type', None )
+ if dtype:
+ fields = dtype.split( ':' )
+ datatype_module = fields[0]
+ datatype_class_name = fields[1]
+ # Since we currently support only a single datatype_file,
+ # we have what we need.
+ break
+ try:
+ sys.path.insert( 0, relative_head )
+ imported_module = __import__( datatype_module )
+ sys.path.pop( 0 )
+ except Exception, e:
+ log.debug( "Exception importing datatypes code file included in installed repository: %s" % str( e ) )
+ trans.app.datatypes_registry.load_datatypes( root_dir=trans.app.config.root, config=datatypes_config, imported_module=imported_module )
+ def __create_or_undelete_tool_shed_repository( self, trans, name, description, changeset_revision, repository_clone_url, metadata_dict ):
+ tmp_url = self.__clean_repository_clone_url( repository_clone_url )
+ tool_shed = tmp_url.split( 'repos' )[ 0 ].rstrip( '/' )
+ owner = self.__get_repository_owner( tmp_url )
+ includes_datatypes = 'datatypes_config' in metadata_dict
+ flush_needed = False
+ tool_shed_repository = get_repository_by_shed_name_owner_changeset_revision( trans, tool_shed, name, owner, changeset_revision )
+ if tool_shed_repository:
+ if tool_shed_repository.deleted:
+ tool_shed_repository.deleted = False
+ # Reset includes_datatypes in case metadata changed since last installed.
+ tool_shed_repository.includes_datatypes = includes_datatypes
+ flush_needed = True
+ else:
+ tool_shed_repository = trans.model.ToolShedRepository( tool_shed=tool_shed,
+ name=name,
+ description=description,
+ owner=owner,
+ changeset_revision=changeset_revision,
+ metadata=metadata_dict,
+ includes_datatypes=includes_datatypes )
+ flush_needed = True
+ if flush_needed:
+ trans.sa_session.add( tool_shed_repository )
+ trans.sa_session.flush()
+ def __add_shed_tool_conf_entry( self, trans, shed_tool_conf, new_tool_section ):
+ # Add an entry in the shed_tool_conf file. An entry looks something like:
+ # <section name="Filter and Sort" id="filter">
+ # <tool file="filter/filtering.xml" guid="toolshed.g2.bx.psu.edu/repos/test/filter/1.0.2"/>
+ # </section>
+        # Make a backup of the shed_tool_conf file since we're going to be changing it.
+ if not os.path.exists( shed_tool_conf ):
+ output = open( shed_tool_conf, 'w' )
+ output.write( '<?xml version="1.0"?>\n' )
+ output.write( '<toolbox tool_path="%s">\n' % tool_path )
+ output.write( '</toolbox>\n' )
+ output.close()
+ self.__make_shed_tool_conf_copy( trans, shed_tool_conf )
+ tmp_fd, tmp_fname = tempfile.mkstemp()
+ new_shed_tool_conf = open( tmp_fname, 'wb' )
+        for line in open( shed_tool_conf ):
+ if line.startswith( '</toolbox>' ):
+ # We're at the end of the original config file, so add our entry.
+ new_shed_tool_conf.write( new_tool_section )
+ new_shed_tool_conf.write( line )
+ else:
+ new_shed_tool_conf.write( line )
+ new_shed_tool_conf.close()
+ shutil.move( tmp_fname, os.path.abspath( shed_tool_conf ) )
+ def __make_shed_tool_conf_copy( self, trans, shed_tool_conf ):
+ # Make a backup of the shed_tool_conf file.
+ today = date.today()
+ backup_date = today.strftime( "%Y_%m_%d" )
+ shed_tool_conf_copy = '%s/%s_%s_backup' % ( trans.app.config.root, shed_tool_conf, backup_date )
+ shutil.copy( os.path.abspath( shed_tool_conf ), os.path.abspath( shed_tool_conf_copy ) )
+ def __clean_tool_shed_url( self, tool_shed_url ):
+ if tool_shed_url.find( ':' ) > 0:
+ # Eliminate the port, if any, since it will result in an invalid directory name.
+ return tool_shed_url.split( ':' )[ 0 ]
+ return tool_shed_url.rstrip( '/' )
+ def __clean_repository_clone_url( self, repository_clone_url ):
+ if repository_clone_url.find( '@' ) > 0:
+            # We have a URL that includes an authenticated user, something like:
+ # http://test@bx.psu.edu:9009/repos/some_username/column
+ items = repository_clone_url.split( '@' )
+ tmp_url = items[ 1 ]
+ elif repository_clone_url.find( '//' ) > 0:
+            # We have a URL that includes only a protocol, something like:
+ # http://bx.psu.edu:9009/repos/some_username/column
+ items = repository_clone_url.split( '//' )
+ tmp_url = items[ 1 ]
+ else:
+ tmp_url = repository_clone_url
+ return tmp_url
+ def __get_repository_owner( self, cleaned_repository_url ):
+ items = cleaned_repository_url.split( 'repos' )
+ repo_path = items[ 1 ]
+ return repo_path.lstrip( '/' ).split( '/' )[ 0 ]
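
The URL-cleaning and owner-extraction helpers above compose as follows (a standalone sketch with invented function names; the logic is copied from `__clean_repository_clone_url` and `__get_repository_owner`):

```python
def clean_clone_url( repository_clone_url ):
    """Strip a leading protocol and any authenticated user from a
    repository clone URL."""
    if '@' in repository_clone_url:
        return repository_clone_url.split( '@' )[ 1 ]
    if '//' in repository_clone_url:
        return repository_clone_url.split( '//' )[ 1 ]
    return repository_clone_url

def repository_owner( cleaned_repository_url ):
    """The owner is the first path component after 'repos'."""
    repo_path = cleaned_repository_url.split( 'repos' )[ 1 ]
    return repo_path.lstrip( '/' ).split( '/' )[ 0 ]
```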
+ def __generate_tool_path( self, repository_clone_url, changeset_revision ):
+        """
+        Generate a tool path that guarantees repositories with the same name will always be installed
+        in different directories.  The tool path will be of the form:
+        <tool shed url>/repos/<repository owner>/<repository name>/<changeset revision>
+        For example, the clone url http://test@bx.psu.edu:9009/repos/test/filter yields the
+        tool path bx.psu.edu/repos/test/filter/<changeset revision>.
+        """
+ tmp_url = self.__clean_repository_clone_url( repository_clone_url )
+ # Now tmp_url is something like: bx.psu.edu:9009/repos/some_username/column
+ items = tmp_url.split( 'repos' )
+ tool_shed_url = items[ 0 ]
+ repo_path = items[ 1 ]
+ tool_shed_url = self.__clean_tool_shed_url( tool_shed_url )
+ return '%s/repos%s/%s' % ( tool_shed_url, repo_path, changeset_revision )
+ def __generate_clone_url( self, trans, repository ):
+ """Generate the URL for cloning a repository."""
+ tool_shed_url = get_url_from_repository_tool_shed( trans, repository )
+ return '%s/repos/%s/%s' % ( tool_shed_url, repository.owner, repository.name )
+ def __generate_tool_guid( self, repository_clone_url, tool ):
+ """
+ Generate a guid for the installed tool. It is critical that this guid matches the guid for
+ the tool in the Galaxy tool shed from which it is being installed. The form of the guid is
+ <tool shed host>/repos/<repository owner>/<repository name>/<tool id>/<tool version>
+ """
+ tmp_url = self.__clean_repository_clone_url( repository_clone_url )
+ return '%s/%s/%s' % ( tmp_url, tool.id, tool.version )
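
The guid construction above, with the clone-url cleaning inlined, can be sketched standalone (function name invented; the docstring form is the one `__generate_tool_guid` documents):

```python
def generate_tool_guid( repository_clone_url, tool_id, tool_version ):
    """<tool shed host>/repos/<repository owner>/<repository name>/<tool id>/<tool version>"""
    if '@' in repository_clone_url:
        tmp_url = repository_clone_url.split( '@' )[ 1 ]
    elif '//' in repository_clone_url:
        tmp_url = repository_clone_url.split( '//' )[ 1 ]
    else:
        tmp_url = repository_clone_url
    return '%s/%s/%s' % ( tmp_url, tool_id, tool_version )
```

Because the guid embeds the shed host, owner, repository name, and version, the same tool id installed from two different sheds or owners never collides.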
+ def __generate_tool_panel_section( self, repository_name, repository_clone_url, changeset_revision, tool_section, repository_tools_tups ):
+ """
+ Write an in-memory tool panel section so we can load it into the tool panel and then
+ append it to the appropriate shed tool config.
+ TODO: re-write using ElementTree.
+ """
+ tmp_url = self.__clean_repository_clone_url( repository_clone_url )
+ section_str = ''
+ section_str += ' <section name="%s" id="%s">\n' % ( tool_section.name, tool_section.id )
+ for repository_tool_tup in repository_tools_tups:
+ tool_file_path, tool = repository_tool_tup
+ guid = self.__generate_tool_guid( repository_clone_url, tool )
+ section_str += ' <tool file="%s" guid="%s">\n' % ( tool_file_path, guid )
+ section_str += ' <tool_shed>%s</tool_shed>\n' % tmp_url.split( 'repos' )[ 0 ].rstrip( '/' )
+ section_str += ' <repository_name>%s</repository_name>\n' % repository_name
+ section_str += ' <repository_owner>%s</repository_owner>\n' % self.__get_repository_owner( tmp_url )
+ section_str += ' <changeset_revision>%s</changeset_revision>\n' % changeset_revision
+ section_str += ' <id>%s</id>\n' % tool.id
+ section_str += ' <version>%s</version>\n' % tool.version
+ section_str += ' </tool>\n'
+ section_str += ' </section>\n'
+ return section_str
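
The docstring above notes a TODO to rewrite the string concatenation with ElementTree. One possible shape, offered as a sketch rather than the committed implementation (tag and attribute names copied from the string-building version; `tools` is an iterable of `(file_path, guid, tool_id, tool_version)` tuples):

```python
import xml.etree.ElementTree as ET

def generate_tool_panel_section( section_name, section_id, tool_shed,
                                 repository_name, owner, changeset_revision, tools ):
    """Build the <section> element as a tree instead of by string pasting,
    so escaping and nesting are handled by ElementTree."""
    section = ET.Element( 'section', name=section_name, id=section_id )
    for file_path, guid, tool_id, tool_version in tools:
        tool_elem = ET.SubElement( section, 'tool', file=file_path, guid=guid )
        for tag, text in ( ( 'tool_shed', tool_shed ),
                           ( 'repository_name', repository_name ),
                           ( 'repository_owner', owner ),
                           ( 'changeset_revision', changeset_revision ),
                           ( 'id', tool_id ),
                           ( 'version', tool_version ) ):
            ET.SubElement( tool_elem, tag ).text = text
    return ET.tostring( section, encoding='unicode' )
```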
+
+## ---- Utility methods -------------------------------------------------------
+
+def build_shed_tool_conf_select_field( trans ):
+ """Build a SelectField whose options are the keys in trans.app.toolbox.shed_tool_confs."""
+ options = []
+ for shed_tool_conf_filename, tool_path in trans.app.toolbox.shed_tool_confs.items():
+ options.append( ( shed_tool_conf_filename.lstrip( './' ), shed_tool_conf_filename ) )
+ select_field = SelectField( name='shed_tool_conf' )
+ for option_tup in options:
+ select_field.add_option( option_tup[0], option_tup[1] )
+ return select_field
+def build_tool_panel_section_select_field( trans ):
+ """Build a SelectField whose options are the sections of the current in-memory toolbox."""
+ options = []
+ for k, tool_section in trans.app.toolbox.tool_panel.items():
+ options.append( ( tool_section.name, tool_section.id ) )
+ select_field = SelectField( name='tool_panel_section', display='radio' )
+ for option_tup in options:
+ select_field.add_option( option_tup[0], option_tup[1] )
+ return select_field
+def get_repository( trans, id ):
+ """Get a tool_shed_repository from the database via id"""
+ return trans.sa_session.query( trans.model.ToolShedRepository ).get( trans.security.decode_id( id ) )
+def get_repository_by_name_owner_changeset_revision( trans, name, owner, changeset_revision ):
+    """Get a repository from the database via name, owner and changeset_revision"""
+ return trans.sa_session.query( trans.model.ToolShedRepository ) \
+ .filter( and_( trans.model.ToolShedRepository.table.c.name == name,
+ trans.model.ToolShedRepository.table.c.owner == owner,
+ trans.model.ToolShedRepository.table.c.changeset_revision == changeset_revision ) ) \
+ .first()
+def get_repository_by_shed_name_owner_changeset_revision( trans, tool_shed, name, owner, changeset_revision ):
+ if tool_shed.find( '//' ) > 0:
+ tool_shed = tool_shed.split( '//' )[1]
+ return trans.sa_session.query( trans.model.ToolShedRepository ) \
+ .filter( and_( trans.model.ToolShedRepository.table.c.tool_shed == tool_shed,
+ trans.model.ToolShedRepository.table.c.name == name,
+ trans.model.ToolShedRepository.table.c.owner == owner,
+ trans.model.ToolShedRepository.table.c.changeset_revision == changeset_revision ) ) \
+ .first()
+def get_url_from_repository_tool_shed( trans, repository ):
+ # The stored value of repository.tool_shed is something like:
+ # toolshed.g2.bx.psu.edu
+ # We need the URL to this tool shed, which is something like:
+ # http://toolshed.g2.bx.psu.edu/
+ for shed_name, shed_url in trans.app.tool_shed_registry.tool_sheds.items():
+ if shed_url.find( repository.tool_shed ) >= 0:
+ if shed_url.endswith( '/' ):
+ shed_url = shed_url.rstrip( '/' )
+ return shed_url
+ # The tool shed from which the repository was originally
+ # installed must no longer be configured in tool_sheds_conf.xml.
+ return None
diff -r 87dc2336c1892878b26a110bb1857722fd3f4903 -r 5340450cdab2a7887d8d9e9186ae438d7b2aee93 lib/galaxy/web/controllers/workflow.py
--- a/lib/galaxy/web/controllers/workflow.py
+++ b/lib/galaxy/web/controllers/workflow.py
@@ -1230,8 +1230,8 @@
elif installed_repository_file:
# The workflow was read from a file included with an installed tool shed repository.
message = "Workflow <b>%s</b> imported successfully." % workflow.name
- return trans.response.send_redirect( web.url_for( controller='admin',
- action='browse_tool_shed_repository',
+ return trans.response.send_redirect( web.url_for( controller='admin_toolshed',
+ action='browse_repository',
id=repository_id,
message=message,
status=status ) )
diff -r 87dc2336c1892878b26a110bb1857722fd3f4903 -r 5340450cdab2a7887d8d9e9186ae438d7b2aee93 lib/galaxy/webapps/community/controllers/repository.py
--- a/lib/galaxy/webapps/community/controllers/repository.py
+++ b/lib/galaxy/webapps/community/controllers/repository.py
@@ -437,7 +437,7 @@
if operation == "install":
galaxy_url = trans.get_cookie( name='toolshedgalaxyurl' )
encoded_repo_info_dict = self.__encode_repo_info_dict( trans, webapp, util.listify( item_id ) )
- url = '%s/admin/install_tool_shed_repository?tool_shed_url=%s&webapp=%s&repo_info_dict=%s' % \
+ url = '%s/admin_toolshed/install_repository?tool_shed_url=%s&webapp=%s&repo_info_dict=%s' % \
( galaxy_url, url_for( '', qualified=True ), webapp, encoded_repo_info_dict )
return trans.response.send_redirect( url )
else:
@@ -513,7 +513,7 @@
if operation == "install":
galaxy_url = trans.get_cookie( name='toolshedgalaxyurl' )
encoded_repo_info_dict = self.__encode_repo_info_dict( trans, webapp, util.listify( item_id ) )
- url = '%s/admin/install_tool_shed_repository?tool_shed_url=%s&webapp=%s&repo_info_dict=%s' % \
+ url = '%s/admin_toolshed/install_repository?tool_shed_url=%s&webapp=%s&repo_info_dict=%s' % \
( galaxy_url, url_for( '', qualified=True ), webapp, encoded_repo_info_dict )
return trans.response.send_redirect( url )
else:
@@ -759,7 +759,7 @@
repo_info_dict[ repository.name ] = ( repository.description, repository_clone_url, changeset_revision )
encoded_repo_info_dict = encode( repo_info_dict )
# Redirect back to local Galaxy to perform install.
- url = '%s/admin/install_tool_shed_repository?tool_shed_url=%s&repo_info_dict=%s' % \
+ url = '%s/admin_toolshed/install_repository?tool_shed_url=%s&repo_info_dict=%s' % \
( galaxy_url, url_for( '', qualified=True ), encoded_repo_info_dict )
return trans.response.send_redirect( url )
@web.expose
@@ -773,7 +773,7 @@
changeset_revision = params.get( 'changeset_revision', None )
webapp = params.get( 'webapp', 'community' )
# Start building up the url to redirect back to the calling Galaxy instance.
- url = '%s/admin/update_to_changeset_revision?tool_shed_url=%s' % ( galaxy_url, url_for( '', qualified=True ) )
+ url = '%s/admin_toolshed/update_to_changeset_revision?tool_shed_url=%s' % ( galaxy_url, url_for( '', qualified=True ) )
repository = get_repository_by_name_and_owner( trans, name, owner )
url += '&name=%s&owner=%s&changeset_revision=%s&latest_changeset_revision=' % \
( repository.name, repository.user.username, changeset_revision )
@@ -1035,7 +1035,7 @@
hgweb_config = "%s/hgweb.config" % trans.app.config.root
if repository_path.startswith( './' ):
repository_path = repository_path.replace( './', '', 1 )
- entry = "repos/%s/%s = %s" % ( repository.user.username, repository.name, repository_path.lstrip( './' ) )
+ entry = "repos/%s/%s = %s" % ( repository.user.username, repository.name, repository_path )
tmp_fd, tmp_fname = tempfile.mkstemp()
if os.path.exists( hgweb_config ):
# Make a backup of the hgweb.config file since we're going to be changing it.
diff -r 87dc2336c1892878b26a110bb1857722fd3f4903 -r 5340450cdab2a7887d8d9e9186ae438d7b2aee93 templates/admin/select_tool_panel_section.mako
--- a/templates/admin/select_tool_panel_section.mako
+++ /dev/null
@@ -1,61 +0,0 @@
-<%inherit file="/base.mako"/>
-<%namespace file="/message.mako" import="render_msg" />
-
-%if message:
- ${render_msg( message, status )}
-%endif
-
-<div class="warningmessage">
- The core Galaxy development team does not maintain the contents of many Galaxy tool shed repositories. Some repository tools
- may include code that produces malicious behavior, so be aware of what you are installing.
- <p/>
- If you discover a repository that causes problems after installation, contact <a href="http://wiki.g2.bx.psu.edu/Support" target="_blank">Galaxy support</a>,
- sending all necessary information, and appropriate action will be taken.
- <p/>
- <a href="http://wiki.g2.bx.psu.edu/Tool%20Shed#Contacting_the_owner_of_a_repository" target="_blank">Contact the repository owner</a> for general questions
- or concerns.
-</div>
-<br/>
-<div class="warningmessage">
- Installation may take a while, depending upon the size of the repository contents. Wait until a message is displayed in your
- browser after clicking the <b>Install</b> button below.
-</div>
-<br/>
-
-<div class="toolForm">
- <div class="toolFormTitle">Choose section to load tools into tool panel</div>
- <div class="toolFormBody">
- <form name="select_tool_panel_section" id="select_tool_panel_section" action="${h.url_for( controller='admin', action='install_tool_shed_repository', tool_shed_url=tool_shed_url, repo_info_dict=repo_info_dict )}" method="post" >
- %if shed_tool_conf_select_field:
- <div class="form-row">
- <label>Shed tool configuration file:</label>
- ${shed_tool_conf_select_field.get_html()}
- <div class="toolParamHelp" style="clear: both;">
- Your Galaxy instance is configured with ${len( shed_tool_conf_select_field.options )} shed tool configuration files,
- so choose one in which to configure the installed tools.
- </div>
- </div>
- <div style="clear: both"></div>
- %else:
- <input type="hidden" name="shed_tool_conf" value="${shed_tool_conf}"/>
- %endif
- <div class="form-row">
- <label>Add new tool panel section:</label>
- <input name="new_tool_panel_section" type="textfield" value="${new_tool_panel_section}" size="40"/>
- <div class="toolParamHelp" style="clear: both;">
- Add a new tool panel section or choose an existing section in your tool panel below to contain the installed tools.
- </div>
- </div>
- <div class="form-row">
- <label>Select existing tool panel section:</label>
- ${tool_panel_section_select_field.get_html()}
- <div class="toolParamHelp" style="clear: both;">
- Choose an existing section in your tool panel to contain the installed tools.
- </div>
- </div>
- <div class="form-row">
- <input type="submit" name="select_tool_panel_section_button" value="Install"/>
- </div>
- </form>
- </div>
-</div>
diff -r 87dc2336c1892878b26a110bb1857722fd3f4903 -r 5340450cdab2a7887d8d9e9186ae438d7b2aee93 templates/admin/tool_shed_repository/browse_repository.mako
--- a/templates/admin/tool_shed_repository/browse_repository.mako
+++ b/templates/admin/tool_shed_repository/browse_repository.mako
@@ -1,5 +1,6 @@
<%inherit file="/base.mako"/><%namespace file="/message.mako" import="render_msg" />
+<%namespace file="/admin/tool_shed_repository/common.mako" import="*" /><% from galaxy.web.base.controller import tool_shed_encode, tool_shed_decode %>
@@ -7,8 +8,8 @@
<ul class="manage-table-actions"><li><a class="action-button" id="repository-${repository.id}-popup" class="menubutton">Repository Actions</a></li><div popupmenu="repository-${repository.id}-popup">
- <a class="action-button" href="${h.url_for( controller='admin', action='manage_tool_shed_repository', id=trans.security.encode_id( repository.id ) )}">Manage repository</a>
- <a class="action-button" href="${h.url_for( controller='admin', action='check_for_updates', id=trans.security.encode_id( repository.id ) )}">Get updates</a>
+ <a class="action-button" href="${h.url_for( controller='admin_toolshed', action='manage_repository', id=trans.security.encode_id( repository.id ) )}">Manage repository</a>
+ <a class="action-button" href="${h.url_for( controller='admin_toolshed', action='check_for_updates', id=trans.security.encode_id( repository.id ) )}">Get updates</a></div></ul>
@@ -16,112 +17,4 @@
${render_msg( message, status )}
%endif
-<div class="toolForm">
- <div class="toolFormTitle">Installed tool shed repository '${repository.name}'</div>
- <div class="toolFormBody">
-
- %if tool_dicts:
- <div class="form-row">
- <table width="100%">
- <tr bgcolor="#D8D8D8" width="100%">
- <td><b>Tools</b></td>
- </tr>
- </table>
- </div>
- <div class="form-row">
- <table class="grid">
- <tr>
- <td><b>name</b></td>
- <td><b>description</b></td>
- <td><b>version</b></td>
- <td><b>requirements</b></td>
- </tr>
- %for tool_dict in tool_dicts:
- <tr>
- <td>${tool_dict[ 'name' ]}</div>
- </td>
- <td>${tool_dict[ 'description' ]}</td>
- <td>${tool_dict[ 'version' ]}</td>
- <td>
- <%
- if 'requirements' in tool_dict:
- requirements = tool_dict[ 'requirements' ]
- else:
- requirements = None
- %>
- %if requirements:
- <%
- requirements_str = ''
- for requirement_dict in tool_dict[ 'requirements' ]:
- requirements_str += '%s (%s), ' % ( requirement_dict[ 'name' ], requirement_dict[ 'type' ] )
- requirements_str = requirements_str.rstrip( ', ' )
- %>
- ${requirements_str}
- %else:
- none
- %endif
- </td>
- </tr>
- %endfor
- </table>
- </div>
- <div style="clear: both"></div>
- %endif
- %if workflow_dicts:
- <div class="form-row">
- <table width="100%">
- <tr bgcolor="#D8D8D8" width="100%">
- <td><b>Workflows</b></td>
- </tr>
- </table>
- </div>
- <div style="clear: both"></div>
- <div class="form-row">
- <table class="grid">
- <tr>
- <td><b>name</b></td>
- <td><b>steps</b></td>
- <td><b>format-version</b></td>
- <td><b>annotation</b></td>
- </tr>
- <% index = 0 %>
- %for wf_dict in workflow_dicts:
- <%
- full_path = wf_dict[ 'full_path' ]
- workflow_dict = wf_dict[ 'workflow_dict' ]
- workflow_name = workflow_dict[ 'name' ]
- if 'steps' in workflow_dict:
- ## Initially steps were not stored in the metadata record.
- steps = workflow_dict[ 'steps' ]
- else:
- steps = []
- format_version = workflow_dict[ 'format-version' ]
- annotation = workflow_dict[ 'annotation' ]
- %>
- <tr>
- <td>
- <div class="menubutton" style="float: left;" id="workflow-${index}-popup">
- ${workflow_name}
- <div popupmenu="workflow-${index}-popup">
- <a class="action-button" href="${h.url_for( controller='workflow', action='import_workflow', installed_repository_file=full_path, repository_id=trans.security.encode_id( repository.id ) )}">Import to Galaxy</a>
- </div>
- </div>
- </td>
- <td>
- %if 'steps' in workflow_dict:
- ${len( steps )}
- %else:
- unknown
- %endif
- </td>
- <td>${format_version}</td>
- <td>${annotation}</td>
- </tr>
- <% index += 1 %>
- %endfor
- </table>
- </div>
- <div style="clear: both"></div>
- %endif
- </div>
-</div>
+${render_metadata( repository, can_reset_metadata=False )}
diff -r 87dc2336c1892878b26a110bb1857722fd3f4903 -r 5340450cdab2a7887d8d9e9186ae438d7b2aee93 templates/admin/tool_shed_repository/common.mako
--- /dev/null
+++ b/templates/admin/tool_shed_repository/common.mako
@@ -0,0 +1,171 @@
+<%def name="render_metadata( repository, can_reset_metadata=False )">
+ <div class="toolForm">
+ <div class="toolFormTitle">Repository contents</div>
+ <div class="toolFormBody">
+ <% metadata = repository.metadata %>
+ %if metadata:
+ %if 'tools' in metadata:
+ <div class="form-row">
+ <table width="100%">
+ <tr bgcolor="#D8D8D8" width="100%">
+ <td><b>Tools</b><i> - click the name to view tool related information</i></td>
+ </tr>
+ </table>
+ </div>
+ <div class="form-row">
+ <% tool_dicts = metadata[ 'tools' ] %>
+ <table class="grid">
+ <tr>
+ <td><b>name</b></td>
+ <td><b>description</b></td>
+ <td><b>version</b></td>
+ <td><b>requirements</b></td>
+ </tr>
+ %for tool_dict in tool_dicts:
+ <tr>
+ <td>
+ <a class="view-info" href="${h.url_for( controller='admin_toolshed', action='view_tool_metadata', repository_id=trans.security.encode_id( repository.id ), tool_id=tool_dict[ 'id' ] )}">
+ ${tool_dict[ 'name' ]}
+ </a>
+ </td>
+ <td>${tool_dict[ 'description' ]}</td>
+ <td>${tool_dict[ 'version' ]}</td>
+ <td>
+ <%
+ if 'requirements' in tool_dict:
+ requirements = tool_dict[ 'requirements' ]
+ else:
+ requirements = None
+ %>
+ %if requirements:
+ <%
+ requirements_str = ''
+ for requirement_dict in tool_dict[ 'requirements' ]:
+ requirements_str += '%s (%s), ' % ( requirement_dict[ 'name' ], requirement_dict[ 'type' ] )
+ requirements_str = requirements_str.rstrip( ', ' )
+ %>
+ ${requirements_str}
+ %else:
+ none
+ %endif
+ </td>
+ </tr>
+ %endfor
+ </table>
+ </div>
+ <div style="clear: both"></div>
+ %endif
+ %if 'workflows' in metadata:
+ <div class="form-row">
+ <table width="100%">
+ <tr bgcolor="#D8D8D8" width="100%">
+ <td><b>Workflows</b><i> - click the name to import</i></td>
+ </tr>
+ </table>
+ </div>
+ <div style="clear: both"></div>
+ <div class="form-row">
+ <% workflow_tups = metadata[ 'workflows' ] %>
+ <table class="grid">
+ <tr>
+ <td><b>name</b></td>
+ <td><b>steps</b></td>
+ <td><b>format-version</b></td>
+ <td><b>annotation</b></td>
+ </tr>
+ <% index = 0 %>
+ %for workflow_tup in workflow_tups:
+ <%
+ import os.path
+ relative_path = workflow_tup[ 0 ]
+ full_path = os.path.abspath( relative_path )
+ workflow_dict = workflow_tup[ 1 ]
+ workflow_name = workflow_dict[ 'name' ]
+ if 'steps' in workflow_dict:
+ ## Initially steps were not stored in the metadata record.
+ steps = workflow_dict[ 'steps' ]
+ else:
+ steps = []
+ format_version = workflow_dict[ 'format-version' ]
+ annotation = workflow_dict[ 'annotation' ]
+ %>
+ <tr>
+ <td>
+ <div class="menubutton" style="float: left;" id="workflow-${index}-popup">
+ ${workflow_name}
+ <div popupmenu="workflow-${index}-popup">
+ <a class="action-button" href="${h.url_for( controller='workflow', action='import_workflow', installed_repository_file=full_path, repository_id=trans.security.encode_id( repository.id ) )}">Import to Galaxy</a>
+ </div>
+ </div>
+ </td>
+ <td>
+ %if steps:
+ ${len( steps )}
+ %else:
+ unknown
+ %endif
+ </td>
+ <td>${format_version}</td>
+ <td>${annotation}</td>
+ </tr>
+ <% index += 1 %>
+ %endfor
+ </table>
+ </div>
+ <div style="clear: both"></div>
+ %endif
+ %if 'datatypes' in metadata:
+ <div class="form-row">
+ <table width="100%">
+ <tr bgcolor="#D8D8D8" width="100%">
+ <td><b>Data types</b></td>
+ </tr>
+ </table>
+ </div>
+ <div style="clear: both"></div>
+ <div class="form-row">
+ <% datatypes_dicts = metadata[ 'datatypes' ] %>
+ <table class="grid">
+ <tr>
+ <td><b>extension</b></td>
+ <td><b>dtype</b></td>
+ <td><b>mimetype</b></td>
+ </tr>
+ %for datatypes_dict in datatypes_dicts:
+ <%
+ extension = datatypes_dict[ 'extension' ]
+ dtype = datatypes_dict[ 'dtype' ]
+ mimetype = datatypes_dict[ 'mimetype' ]
+ %>
+ <tr>
+ <td>${extension}</td>
+ <td>${dtype}</td>
+ <td>
+ %if mimetype:
+ ${mimetype}
+ %else:
+
+ %endif
+ </td>
+ </tr>
+ %endfor
+ </table>
+ </div>
+ <div style="clear: both"></div>
+ %endif
+ %endif
+ %if can_reset_metadata:
+ <form name="set_metadata" action="${h.url_for( controller='admin_toolshed', action='manage_repository', id=trans.security.encode_id( repository.id ) )}" method="post">
+ <div class="form-row">
+ <div style="float: left; width: 250px; margin-right: 10px;">
+ <input type="submit" name="set_metadata_button" value="Reset metadata"/>
+ </div>
+ <div class="toolParamHelp" style="clear: both;">
+ Inspect the repository and reset the above attributes.
+ </div>
+ </div>
+ </form>
+ %endif
+ </div>
+ </div>
+</%def>
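The `render_metadata` def above builds the requirements column by concatenating `name (type)` pairs inline in Mako. The same logic, pulled out as a plain Python function (the function name is an illustration, not part of the commit):

```python
def format_requirements(tool_dict):
    """Render a tool's requirements as 'name (type), ...' -- the same
    string the Mako template builds inline; 'none' when there are none."""
    requirements = tool_dict.get('requirements') or []
    if not requirements:
        return 'none'
    return ', '.join('%s (%s)' % (r['name'], r['type']) for r in requirements)
```

Using `', '.join(...)` avoids the trailing-comma cleanup (`rstrip( ', ' )`) that the template version needs.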
diff -r 87dc2336c1892878b26a110bb1857722fd3f4903 -r 5340450cdab2a7887d8d9e9186ae438d7b2aee93 templates/admin/tool_shed_repository/manage_repository.mako
--- a/templates/admin/tool_shed_repository/manage_repository.mako
+++ b/templates/admin/tool_shed_repository/manage_repository.mako
@@ -1,12 +1,13 @@
<%inherit file="/base.mako"/><%namespace file="/message.mako" import="render_msg" />
+<%namespace file="/admin/tool_shed_repository/common.mako" import="*" /><br/><br/><ul class="manage-table-actions"><li><a class="action-button" id="repository-${repository.id}-popup" class="menubutton">Repository Actions</a></li><div popupmenu="repository-${repository.id}-popup">
- <a class="action-button" href="${h.url_for( controller='admin', action='browse_tool_shed_repository', id=trans.security.encode_id( repository.id ) )}">Browse repository</a>
- <a class="action-button" href="${h.url_for( controller='admin', action='check_for_updates', id=trans.security.encode_id( repository.id ) )}">Get updates</a>
+ <a class="action-button" href="${h.url_for( controller='admin_toolshed', action='browse_repository', id=trans.security.encode_id( repository.id ) )}">Browse repository</a>
+ <a class="action-button" href="${h.url_for( controller='admin_toolshed', action='check_for_updates', id=trans.security.encode_id( repository.id ) )}">Get updates</a></div></ul>
@@ -15,9 +16,9 @@
%endif
<div class="toolForm">
- <div class="toolFormTitle">${repository.name}</div>
+ <div class="toolFormTitle">Installed tool shed repository '${repository.name}'</div><div class="toolFormBody">
- <form name="edit_repository" id="edit_repository" action="${h.url_for( controller='admin', action='manage_tool_shed_repository', id=trans.security.encode_id( repository.id ) )}" method="post" >
+ <form name="edit_repository" id="edit_repository" action="${h.url_for( controller='admin_toolshed', action='manage_repository', id=trans.security.encode_id( repository.id ) )}" method="post" ><div class="form-row"><label>Tool shed:</label>
${repository.tool_shed}
@@ -55,3 +56,6 @@
</form></div></div>
+<p/>
+${render_metadata( repository, can_reset_metadata=True )}
+<p/>
diff -r 87dc2336c1892878b26a110bb1857722fd3f4903 -r 5340450cdab2a7887d8d9e9186ae438d7b2aee93 templates/admin/tool_shed_repository/select_tool_panel_section.mako
--- /dev/null
+++ b/templates/admin/tool_shed_repository/select_tool_panel_section.mako
@@ -0,0 +1,61 @@
+<%inherit file="/base.mako"/>
+<%namespace file="/message.mako" import="render_msg" />
+
+%if message:
+ ${render_msg( message, status )}
+%endif
+
+<div class="warningmessage">
+ The core Galaxy development team does not maintain the contents of many Galaxy tool shed repositories. Some repository tools
+ may include code that produces malicious behavior, so be aware of what you are installing.
+ <p/>
+ If you discover a repository that causes problems after installation, contact <a href="http://wiki.g2.bx.psu.edu/Support" target="_blank">Galaxy support</a>,
+ sending all necessary information, and appropriate action will be taken.
+ <p/>
+ <a href="http://wiki.g2.bx.psu.edu/Tool%20Shed#Contacting_the_owner_of_a_repository" target="_blank">Contact the repository owner</a> for general questions
+ or concerns.
+</div>
+<br/>
+<div class="warningmessage">
+ Installation may take a while, depending upon the size of the repository contents. Wait until a message is displayed in your
+ browser after clicking the <b>Install</b> button below.
+</div>
+<br/>
+
+<div class="toolForm">
+ <div class="toolFormTitle">Choose section to load tools into tool panel</div>
+ <div class="toolFormBody">
+ <form name="select_tool_panel_section" id="select_tool_panel_section" action="${h.url_for( controller='admin_toolshed', action='install_repository', tool_shed_url=tool_shed_url, repo_info_dict=repo_info_dict )}" method="post" >
+ %if shed_tool_conf_select_field:
+ <div class="form-row">
+ <label>Shed tool configuration file:</label>
+ ${shed_tool_conf_select_field.get_html()}
+ <div class="toolParamHelp" style="clear: both;">
+ Your Galaxy instance is configured with ${len( shed_tool_conf_select_field.options )} shed tool configuration files,
+ so choose one in which to configure the installed tools.
+ </div>
+ </div>
+ <div style="clear: both"></div>
+ %else:
+ <input type="hidden" name="shed_tool_conf" value="${shed_tool_conf}"/>
+ %endif
+ <div class="form-row">
+ <label>Add new tool panel section:</label>
+ <input name="new_tool_panel_section" type="textfield" value="${new_tool_panel_section}" size="40"/>
+ <div class="toolParamHelp" style="clear: both;">
+ Add a new tool panel section or choose an existing section in your tool panel below to contain the installed tools.
+ </div>
+ </div>
+ <div class="form-row">
+ <label>Select existing tool panel section:</label>
+ ${tool_panel_section_select_field.get_html()}
+ <div class="toolParamHelp" style="clear: both;">
+ Choose an existing section in your tool panel to contain the installed tools.
+ </div>
+ </div>
+ <div class="form-row">
+ <input type="submit" name="select_tool_panel_section_button" value="Install"/>
+ </div>
+ </form>
+ </div>
+</div>
diff -r 87dc2336c1892878b26a110bb1857722fd3f4903 -r 5340450cdab2a7887d8d9e9186ae438d7b2aee93 templates/admin/tool_shed_repository/view_tool_metadata.mako
--- /dev/null
+++ b/templates/admin/tool_shed_repository/view_tool_metadata.mako
@@ -0,0 +1,155 @@
+<%inherit file="/base.mako"/>
+<%namespace file="/message.mako" import="render_msg" />
+
+<br/><br/>
+<ul class="manage-table-actions">
+ <li><a class="action-button" id="repository-${repository.id}-popup" class="menubutton">Repository Actions</a></li>
+ <div popupmenu="repository-${repository.id}-popup">
+ <a class="action-button" href="${h.url_for( controller='admin_toolshed', action='browse_repository', id=trans.security.encode_id( repository.id ) )}">Browse repository</a>
+ <a class="action-button" href="${h.url_for( controller='admin_toolshed', action='manage_repository', id=trans.security.encode_id( repository.id ) )}">Manage repository</a>
+ <a class="action-button" href="${h.url_for( controller='admin_toolshed', action='check_for_updates', id=trans.security.encode_id( repository.id ) )}">Get updates</a>
+ </div>
+</ul>
+
+%if message:
+ ${render_msg( message, status )}
+%endif
+
+%if metadata:
+ <p/>
+ <div class="toolForm">
+ <div class="toolFormTitle">${metadata[ 'name' ]} tool metadata</div>
+ <div class="toolFormBody">
+ <div class="form-row">
+ <label>Name:</label>
+ ${metadata[ 'name' ]}
+ <div style="clear: both"></div>
+ </div>
+ %if 'description' in metadata:
+ <div class="form-row">
+ <label>Description:</label>
+ ${metadata[ 'description' ]}
+ <div style="clear: both"></div>
+ </div>
+ %endif
+ %if 'id' in metadata:
+ <div class="form-row">
+ <label>Id:</label>
+ ${metadata[ 'id' ]}
+ <div style="clear: both"></div>
+ </div>
+ %endif
+ %if 'guid' in metadata:
+ <div class="form-row">
+ <label>Guid:</label>
+ ${metadata[ 'guid' ]}
+ <div style="clear: both"></div>
+ </div>
+ %endif
+ %if 'version' in metadata:
+ <div class="form-row">
+ <label>Version:</label>
+ ${metadata[ 'version' ]}
+ <div style="clear: both"></div>
+ </div>
+ %endif
+ %if 'version_string_cmd' in metadata:
+ <div class="form-row">
+ <label>Version command string:</label>
+ ${metadata[ 'version_string_cmd' ]}
+ <div style="clear: both"></div>
+ </div>
+ %endif
+ %if tool:
+ <div class="form-row">
+ <label>Command:</label>
+ <pre>${tool.command}</pre>
+ <div style="clear: both"></div>
+ </div>
+ <div class="form-row">
+ <label>Interpreter:</label>
+ ${tool.interpreter}
+ <div style="clear: both"></div>
+ </div>
+ <div class="form-row">
+ <label>Is multi-byte:</label>
+ ${tool.is_multi_byte}
+ <div style="clear: both"></div>
+ </div>
+ <div class="form-row">
+ <label>Forces a history refresh:</label>
+ ${tool.force_history_refresh}
+ <div style="clear: both"></div>
+ </div>
+ <div class="form-row">
+ <label>Parallelism:</label>
+ ${tool.parallelism}
+ <div style="clear: both"></div>
+ </div>
+ %endif
+ <%
+ if 'requirements' in metadata:
+ requirements = metadata[ 'requirements' ]
+ else:
+ requirements = None
+ %>
+ %if requirements:
+ <%
+ requirements_str = ''
+ for requirement_dict in metadata[ 'requirements' ]:
+ requirements_str += '%s (%s), ' % ( requirement_dict[ 'name' ], requirement_dict[ 'type' ] )
+ requirements_str = requirements_str.rstrip( ', ' )
+ %>
+ <div class="form-row">
+ <label>Requirements:</label>
+ ${requirements_str}
+ <div style="clear: both"></div>
+ </div>
+ %endif
+ <%
+ if 'tests' in metadata:
+ tests = metadata[ 'tests' ]
+ else:
+ tests = None
+ %>
+ %if tests:
+ <div class="form-row">
+ <label>Functional tests:</label></td>
+ <table class="grid">
+ <tr>
+ <td><b>name</b></td>
+ <td><b>inputs</b></td>
+ <td><b>outputs</b></td>
+ <td><b>required files</b></td>
+ </tr>
+ %for test_dict in tests:
+ <%
+ inputs = test_dict[ 'inputs' ]
+ outputs = test_dict[ 'outputs' ]
+ required_files = test_dict[ 'required_files' ]
+ %>
+ <tr>
+ <td>${test_dict[ 'name' ]}</td>
+ <td>
+ %for input in inputs:
+ <b>${input[0]}:</b> ${input[1]}<br/>
+ %endfor
+ </td>
+ <td>
+ %for output in outputs:
+ <b>${output[0]}:</b> ${output[1]}<br/>
+ %endfor
+ </td>
+ <td>
+ %for required_file in required_files:
+ ${required_file[0]}<br/>
+ %endfor
+ </td>
+ </tr>
+ %endfor
+ </table>
+ </div>
+ %endif
+ </div>
+ </div>
+%endif
diff -r 87dc2336c1892878b26a110bb1857722fd3f4903 -r 5340450cdab2a7887d8d9e9186ae438d7b2aee93 templates/webapps/community/base_panels.mako
--- a/templates/webapps/community/base_panels.mako
+++ b/templates/webapps/community/base_panels.mako
@@ -34,7 +34,8 @@
<div class="submenu"><ul><li><a target="_blank" href="${app.config.get( "support_url", "http://wiki.g2.bx.psu.edu/Support" )}">Support</a></li>
- <li><a target="_blank" href="${app.config.get( "wiki_url", "http://wiki.g2.bx.psu.edu/" )}">Galaxy Wiki</a></li>
+ <li><a target="_blank" href="${app.config.get( "wiki_url", "http://wiki.g2.bx.psu.edu/Tool%20Shed" )}">Tool shed wiki</a></li>
+ <li><a target="_blank" href="${app.config.get( "wiki_url", "http://wiki.g2.bx.psu.edu/" )}">Galaxy wiki</a></li><li><a target="_blank" href="${app.config.get( "screencasts_url", "http://galaxycast.org" )}">Video tutorials (screencasts)</a></li><li><a target="_blank" href="${app.config.get( "citation_url", "http://wiki.g2.bx.psu.edu/Citing%20Galaxy" )}">How to Cite Galaxy</a></li></ul>
diff -r 87dc2336c1892878b26a110bb1857722fd3f4903 -r 5340450cdab2a7887d8d9e9186ae438d7b2aee93 templates/webapps/galaxy/admin/index.mako
--- a/templates/webapps/galaxy/admin/index.mako
+++ b/templates/webapps/galaxy/admin/index.mako
@@ -67,7 +67,7 @@
<div class="toolTitle"><a href="${h.url_for( controller='admin', action='memdump' )}" target="galaxy_main">Profile memory usage</a></div><div class="toolTitle"><a href="${h.url_for( controller='admin', action='jobs' )}" target="galaxy_main">Manage jobs</a></div>
%if cloned_repositories:
- <div class="toolTitle"><a href="${h.url_for( controller='admin', action='browse_tool_shed_repositories' )}" target="galaxy_main">Manage installed tool shed repositories</a></div>
+ <div class="toolTitle"><a href="${h.url_for( controller='admin_toolshed', action='browse_repositories' )}" target="galaxy_main">Manage installed tool shed repositories</a></div>
%endif
</div></div>
@@ -76,7 +76,7 @@
<div class="toolSectionTitle">Tool sheds</div><div class="toolSectionBody"><div class="toolSectionBg">
- <div class="toolTitle"><a href="${h.url_for( controller='admin', action='browse_tool_sheds' )}" target="galaxy_main">Search and browse tool sheds</a></div>
+ <div class="toolTitle"><a href="${h.url_for( controller='admin_toolshed', action='browse_tool_sheds' )}" target="galaxy_main">Search and browse tool sheds</a></div></div></div>
%endif
diff -r 87dc2336c1892878b26a110bb1857722fd3f4903 -r 5340450cdab2a7887d8d9e9186ae438d7b2aee93 templates/webapps/galaxy/admin/tool_sheds.mako
--- a/templates/webapps/galaxy/admin/tool_sheds.mako
+++ b/templates/webapps/galaxy/admin/tool_sheds.mako
@@ -22,12 +22,12 @@
<tr class="libraryTitle"><td><div style="float: left; margin-left: 1px;" class="menubutton split popup" id="dataset-${shed_id}-popup">
- <a class="view-info" href="${h.url_for( controller='admin', action='browse_tool_shed', tool_shed_url=url )}">${name}</a>
+ <a class="view-info" href="${h.url_for( controller='admin_toolshed', action='browse_tool_shed', tool_shed_url=url )}">${name}</a></div><div popupmenu="dataset-${shed_id}-popup">
- <a class="action-button" href="${h.url_for( controller='admin', action='browse_tool_shed', tool_shed_url=url )}">Browse valid repositories</a>
- <a class="action-button" href="${h.url_for( controller='admin', action='find_tools_in_tool_shed', tool_shed_url=url )}">Search for valid tools</a>
- <a class="action-button" href="${h.url_for( controller='admin', action='find_workflows_in_tool_shed', tool_shed_url=url )}">Search for workflows</a>
+ <a class="action-button" href="${h.url_for( controller='admin_toolshed', action='browse_tool_shed', tool_shed_url=url )}">Browse valid repositories</a>
+ <a class="action-button" href="${h.url_for( controller='admin_toolshed', action='find_tools_in_tool_shed', tool_shed_url=url )}">Search for valid tools</a>
+ <a class="action-button" href="${h.url_for( controller='admin_toolshed', action='find_workflows_in_tool_shed', tool_shed_url=url )}">Search for workflows</a></div></td></tr>
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: jgoecks: Trackster bug fix: add missing object reference.
by Bitbucket 02 Dec '11
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/87dc2336c189/
changeset: 87dc2336c189
user: jgoecks
date: 2011-12-02 20:20:29
summary: Trackster bug fix: add missing object reference.
affected #: 1 file
diff -r 107d3f0a59fae43ca9b62fe30453caed898993e6 -r 87dc2336c1892878b26a110bb1857722fd3f4903 static/scripts/trackster.js
--- a/static/scripts/trackster.js
+++ b/static/scripts/trackster.js
@@ -2355,6 +2355,7 @@
more_across_icon = $("<a href='javascript:void(0);'/>").addClass("icon more-across").appendTo(message_div);
// Set up actions for icons.
+ var tile = this;
more_down_icon.click(function() {
// Mark tile as stale, request more data, and redraw track.
tile.stale = true;
commit/galaxy-central: jgoecks: (1) Fail gracefully when visualization cannot be saved and (2) fix bug in setting modal's minimum width.
by Bitbucket 02 Dec '11
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/107d3f0a59fa/
changeset: 107d3f0a59fa
user: jgoecks
date: 2011-12-02 19:56:45
summary: (1) Fail gracefully when visualization cannot be saved and (2) fix bug in setting modal's minimum width.
affected #: 2 files
diff -r 4a9d57a68f91b17fa47b8e5b73e54b6bdcbc912d -r 107d3f0a59fae43ca9b62fe30453caed898993e6 static/scripts/galaxy.panels.js
--- a/static/scripts/galaxy.panels.js
+++ b/static/scripts/galaxy.panels.js
@@ -211,7 +211,7 @@
}
var body_elt = $( ".dialog-box" ).find( ".body" );
// Clear min-width to allow for modal to take size of new body.
- body_elt.css("min-width", "");
+ body_elt.css("min-width", "0px");
$( ".dialog-box" ).find( ".body" ).html( body );
if ( ! $(".dialog-box-container").is( ":visible" ) ) {
$("#overlay").show();
diff -r 4a9d57a68f91b17fa47b8e5b73e54b6bdcbc912d -r 107d3f0a59fae43ca9b62fe30453caed898993e6 templates/tracks/browser.mako
--- a/templates/tracks/browser.mako
+++ b/templates/tracks/browser.mako
@@ -230,7 +230,7 @@
$("#save-icon").click( function() {
// Show saving dialog box
- show_modal("Saving...", "<img src='${h.url_for('/static/images/yui/rel_interstitial_loading.gif')}'/>");
+ show_modal("Saving...", "progress");
// Save bookmarks.
var bookmarks = [];
@@ -267,7 +267,10 @@
// Needed to set URL when first saving a visualization.
window.history.pushState({}, "", vis_info.url);
},
- error: function() { alert("Could not save visualization"); }
+ error: function() {
+ show_modal( "Could Not Save", "Could not save visualization. Please try again later.",
+ { "Close" : hide_modal } );
+ }
});
});
commit/galaxy-central: jgoecks: Trackster: default to datasets in current history when adding tracks.
by Bitbucket 02 Dec '11
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/4a9d57a68f91/
changeset: 4a9d57a68f91
user: jgoecks
date: 2011-12-02 18:32:14
summary: Trackster: default to datasets in current history when adding tracks.
affected #: 2 files
diff -r da1ccd08865963c99a55b621af481023b61e3b32 -r 4a9d57a68f91b17fa47b8e5b73e54b6bdcbc912d lib/galaxy/web/controllers/tracks.py
--- a/lib/galaxy/web/controllers/tracks.py
+++ b/lib/galaxy/web/controllers/tracks.py
@@ -735,6 +735,15 @@
return self.histories_grid( trans, **kwargs )
@web.expose
+ @web.require_login( "see current history's datasets that can added to this visualization" )
+ def list_current_history_datasets( self, trans, **kwargs ):
+ """ List a history's datasets that can be added to a visualization. """
+
+ kwargs[ 'f-history' ] = trans.security.encode_id( trans.get_history().id )
+ kwargs[ 'show_item_checkboxes' ] = 'True'
+ return self.list_history_datasets( trans, **kwargs )
+
+ @web.expose
@web.require_login( "see a history's datasets that can added to this visualization" )
def list_history_datasets( self, trans, **kwargs ):
"""List a history's datasets that can be added to a visualization."""
diff -r da1ccd08865963c99a55b621af481023b61e3b32 -r 4a9d57a68f91b17fa47b8e5b73e54b6bdcbc912d templates/tracks/browser.mako
--- a/templates/tracks/browser.mako
+++ b/templates/tracks/browser.mako
@@ -63,7 +63,7 @@
*/
var add_tracks = function() {
$.ajax({
- url: "${h.url_for( action='list_histories' )}",
+ url: "${h.url_for( action='list_current_history_datasets' )}",
data: { "f-dbkey": view.dbkey },
error: function() { alert( "Grid failed" ); },
success: function(table_html) {
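The controller change in this commit delegates to the generic `list_history_datasets` grid handler after pre-seeding two kwargs. A framework-free Python sketch of that delegation pattern (the callables stand in for `trans.security.encode_id`, `trans.get_history()`, and the grid handler, which are not reproduced here):

```python
def list_current_history_datasets(list_history_datasets, encode_id, history_id, **kwargs):
    """Sketch of the commit's pattern: pre-seed the grid filter with the
    current history's encoded id, force item checkboxes on, then reuse
    the generic history-datasets handler unchanged."""
    kwargs['f-history'] = encode_id(history_id)
    kwargs['show_item_checkboxes'] = 'True'
    return list_history_datasets(**kwargs)
```

Seeding the `f-history` filter is what lets the existing grid serve "current history only" without a second grid definition.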
commit/galaxy-central: jgoecks: Trackster: interpret feature track data using 0-based, half-open coordinate system.
by Bitbucket 02 Dec '11
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/da1ccd088659/
changeset: da1ccd088659
user: jgoecks
date: 2011-12-02 17:10:04
summary: Trackster: interpret feature track data using 0-based, half-open coordinate system.
affected #: 1 file
diff -r 9eddcedc574e0a071fce78f51b9352d3efb7635c -r da1ccd08865963c99a55b621af481023b61e3b32 static/scripts/trackster.js
--- a/static/scripts/trackster.js
+++ b/static/scripts/trackster.js
@@ -4294,6 +4294,8 @@
/**
* Abstract object for painting feature tracks. Subclasses must implement draw_element() for painting to work.
+ * Painter uses a 0-based, half-open coordinate system; start coordinate is closed--included--and the end is open.
+ * This coordinate system matches the BED format.
*/
var FeaturePainter = function(data, view_start, view_end, prefs, mode, alpha_scaler, height_scaler) {
Painter.call(this, data, view_start, view_end, prefs, mode);
@@ -4419,7 +4421,8 @@
var
feature_uid = feature[0],
feature_start = feature[1],
- feature_end = feature[2],
+ // -1 because end is not included in feature; see FeaturePainter documentation for details.
+ feature_end = feature[2] - 1,
feature_name = feature[3],
f_start = Math.floor( Math.max(0, (feature_start - tile_low) * w_scale) ),
f_end = Math.ceil( Math.min(width, Math.max(0, (feature_end - tile_low) * w_scale)) ),
@@ -4537,7 +4540,8 @@
for (var k = 0, k_len = feature_blocks.length; k < k_len; k++) {
var block = feature_blocks[k],
block_start = Math.floor( Math.max(0, (block[0] - tile_low) * w_scale) ),
- block_end = Math.ceil( Math.min(width, Math.max((block[1] - tile_low) * w_scale)) ),
+ // -1 because end is not included in feature; see FeaturePainter documentation for details.
+ block_end = Math.ceil( Math.min(width, Math.max((block[1] - 1 - tile_low) * w_scale)) ),
last_block_start, last_block_end;
// Skip drawing if block not on tile.
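The commit's `feature_end = feature[2] - 1` adjustments follow from the 0-based, half-open (BED-style) convention the new FeaturePainter comment documents. The original is JavaScript; a small Python sketch of the two consequences of that convention (function names are illustrative):

```python
def overlaps(start1, end1, start2, end2):
    """Overlap test for 0-based, half-open intervals: the start coordinate
    is included, the end is not, so [0, 10) and [10, 20) do NOT overlap."""
    return start1 < end2 and start2 < end1

def last_drawn_base(feature_end):
    """The rightmost base actually inside a half-open feature is end - 1,
    which is why the painter subtracts 1 before scaling feature_end."""
    return feature_end - 1
```

Half-open intervals also make lengths trivial (`end - start`) and let adjacent features share a boundary coordinate without overlapping.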
commit/galaxy-central: jgoecks: Trackster: fix bug in determining when to display filters.
by Bitbucket 01 Dec '11
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/9eddcedc574e/
changeset: 9eddcedc574e
user: jgoecks
date: 2011-12-01 21:01:28
summary: Trackster: fix bug in determining when to display filters.
affected #: 1 file
diff -r eb9d842f9faf3b7c13d40c387b528b256f29b31a -r 9eddcedc574e0a071fce78f51b9352d3efb7635c static/scripts/trackster.js
--- a/static/scripts/trackster.js
+++ b/static/scripts/trackster.js
@@ -3459,13 +3459,17 @@
}
// Determine if filters are available; this is based on the tiles' data.
+ // Criteria for filter to be available: (a) it is applicable to tile data and (b) filter min != filter max.
var filters_available = false,
- example_feature;
+ example_feature,
+ filter;
for (var i = 0; i < tiles.length; i++) {
if (tiles[i].data.length) {
example_feature = tiles[i].data[0];
for (var f = 0; f < filters.length; f++) {
- if (filters[f].applies_to(example_feature)) {
+ filter = filters[f];
+ if ( filter.applies_to(example_feature) &&
+ filter.min !== filter.max ) {
filters_available = true;
break;
}
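The availability criteria translate directly into a small predicate; here is a hypothetical Python sketch using dicts and callables as stand-ins for the JavaScript tile and filter objects:

```python
def filters_available(tiles, filters):
    """A filter is available iff it applies to the tiles' data and min != max."""
    for tile in tiles:
        if tile["data"]:
            example_feature = tile["data"][0]
            for f in filters:
                if f["applies_to"](example_feature) and f["min"] != f["max"]:
                    return True
    return False

tiles = [{"data": [("uid", 100, 200)]}]
degenerate = {"applies_to": lambda feat: True, "min": 0, "max": 0}
usable = {"applies_to": lambda feat: True, "min": 0, "max": 10}
# A filter whose min equals its max cannot discriminate, so it is skipped.
assert not filters_available(tiles, [degenerate])
assert filters_available(tiles, [degenerate, usable])
```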

commit/galaxy-central: dan: Temporarily remove Yogeshwar's recently added microsatellite tools from tool_conf.xml.sample to prevent buildbot testing.
by Bitbucket 01 Dec '11
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/eb9d842f9faf/
changeset: eb9d842f9faf
user: dan
date: 2011-12-01 16:57:51
summary: Temporarily remove Yogeshwar's recently added microsatellite tools from tool_conf.xml.sample to prevent buildbot testing.
affected #: 1 file
diff -r effb888d61e1214ac57222bee419ac378fed606f -r eb9d842f9faf3b7c13d40c387b528b256f29b31a tool_conf.xml.sample
--- a/tool_conf.xml.sample
+++ b/tool_conf.xml.sample
@@ -192,8 +192,8 @@
<tool file="regVariation/compute_motif_frequencies_for_all_motifs.xml" /><tool file="regVariation/categorize_elements_satisfying_criteria.xml" />
<tool file="regVariation/draw_stacked_barplots.xml" />
- <tool file="regVariation/multispecies_MicrosatDataGenerator_interrupted_GALAXY.xml" />
- <tool file="regVariation/microsatellite_birthdeath.xml" />
+ <!-- <tool file="regVariation/multispecies_MicrosatDataGenerator_interrupted_GALAXY.xml" />
+ <tool file="regVariation/microsatellite_birthdeath.xml" /> --></section><section name="Multiple regression" id="multReg"><tool file="regVariation/linear_regression.xml" />

commit/galaxy-central: jgoecks: When exporting a history archive, use user dataset names rather than Galaxy dataset names. Fixes #680
by Bitbucket 01 Dec '11
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/effb888d61e1/
changeset: effb888d61e1
user: jgoecks
date: 2011-12-01 16:22:46
summary: When exporting a history archive, use user dataset names rather than Galaxy dataset names. Fixes #680
affected #: 1 file
diff -r a2ae1bb435378eefc659941b790ec15f3a0a8584 -r effb888d61e1214ac57222bee419ac378fed606f lib/galaxy/tools/imp_exp/export_history.py
--- a/lib/galaxy/tools/imp_exp/export_history.py
+++ b/lib/galaxy/tools/imp_exp/export_history.py
@@ -10,6 +10,14 @@
from galaxy.util.json import *
import optparse, sys, os, tempfile, tarfile
+def get_dataset_filename( name, ext ):
+ """
+ Builds a filename for a dataset using its name and extension.
+ """
+ valid_chars = '.,^_-()[]0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ'
+ base = ''.join( c in valid_chars and c or '_' for c in name )
+ return base + ".%s" % ext
+
def create_archive( history_attrs_file, datasets_attrs_file, jobs_attrs_file, out_file, gzip=False ):
""" Create archive from the given attribute/metadata files and save it to out_file. """
tarfile_mode = "w"
@@ -37,7 +45,8 @@
# TODO: security check to ensure that files added are in Galaxy dataset directory?
for dataset_attrs in datasets_attrs:
dataset_file_name = dataset_attrs[ 'file_name' ] # Full file name.
- dataset_archive_name = os.path.join( "datasets", os.path.split( dataset_file_name )[-1] )
+ dataset_archive_name = os.path.join( 'datasets',
+ get_dataset_filename( dataset_attrs[ 'name' ], dataset_attrs[ 'extension' ] ) )
history_archive.add( dataset_file_name, arcname=dataset_archive_name )
# Update dataset filename to be archive name.
dataset_attrs[ 'file_name' ] = dataset_archive_name
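The new helper can be exercised on its own; below is a Python 3 rendering of the same sanitization (the committed code is Python 2, and the example inputs are hypothetical):

```python
def get_dataset_filename(name, ext):
    """Build an archive-safe filename: any character outside the whitelist becomes '_'."""
    valid_chars = '.,^_-()[]0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ'
    base = ''.join(c if c in valid_chars else '_' for c in name)
    return base + ".%s" % ext

# Spaces, slashes, and other unsafe characters are replaced; the extension is appended.
assert get_dataset_filename("My Data (v2)", "bed") == "My_Data_(v2).bed"
assert get_dataset_filename("a/b:c", "txt") == "a_b_c.txt"
```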

commit/galaxy-central: jgoecks: Apply HTML escaping when editing dataset name.
by Bitbucket 30 Nov '11
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/a2ae1bb43537/
changeset: a2ae1bb43537
user: jgoecks
date: 2011-12-01 04:50:12
summary: Apply HTML escaping when editing dataset name.
affected #: 1 file
diff -r 1638bc41454922e24dfa5e01d840fbf5ea505782 -r a2ae1bb435378eefc659941b790ec15f3a0a8584 templates/dataset/edit_attributes.mako
--- a/templates/dataset/edit_attributes.mako
+++ b/templates/dataset/edit_attributes.mako
@@ -39,7 +39,7 @@
Name:
</label><div style="float: left; width: 250px; margin-right: 10px;">
- <input type="text" name="name" value="${data.get_display_name()}" size="40"/>
+ <input type="text" name="name" value="${data.get_display_name() | h}" size="40"/></div><div style="clear: both"></div></div>
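Mako's `| h` filter HTML-escapes the value before it lands in the attribute, so a malicious dataset name cannot break out of the `value="..."` attribute. The effect is comparable to Python's stdlib escaping (a standalone illustration, not the template engine itself):

```python
from html import escape  # Python 3 stdlib; Mako's h filter behaves similarly

name = '"><script>alert(1)</script>'
# Escaping neutralizes the quote and angle brackets.
safe = escape(name, quote=True)
assert safe == '&quot;&gt;&lt;script&gt;alert(1)&lt;/script&gt;'
```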

commit/galaxy-central: dan: Fix for 2ee4a6b8419b where chromInfo from custom dbkeys when no custom dbkeys have been defined would cause an error.
by Bitbucket 30 Nov '11
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/1638bc414549/
changeset: 1638bc414549
user: dan
date: 2011-11-30 23:22:30
summary: Fix for 2ee4a6b8419b where chromInfo from custom dbkeys when no custom dbkeys have been defined would cause an error.
affected #: 1 file
diff -r c6f70cccce287702f18c518175bec4a5e8ace23c -r 1638bc41454922e24dfa5e01d840fbf5ea505782 lib/galaxy/tools/actions/__init__.py
--- a/lib/galaxy/tools/actions/__init__.py
+++ b/lib/galaxy/tools/actions/__init__.py
@@ -192,7 +192,7 @@
incoming[ "chromInfo" ] = db_dataset.file_name
else:
# For custom builds, chrom info resides in converted dataset; for built-in builds, chrom info resides in tool-data/shared.
- if trans.user and ( input_dbkey in trans.user.preferences[ 'dbkeys' ] ):
+ if trans.user and ( 'dbkeys' in trans.user.preferences ) and ( input_dbkey in trans.user.preferences[ 'dbkeys' ] ):
# Custom build.
custom_build_dict = from_json_string( trans.user.preferences[ 'dbkeys' ] )[ input_dbkey ]
build_fasta_dataset = trans.app.model.HistoryDatasetAssociation.get( custom_build_dict[ 'fasta' ] )
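The bug being fixed: `trans.user.preferences` is a plain mapping, so indexing `preferences['dbkeys']` raises `KeyError` for users who never defined a custom build. The fix adds a short-circuiting membership guard; a minimal sketch of the pattern (the dict values are hypothetical):

```python
preferences = {}  # a user who has never defined a custom build
input_dbkey = "custom_hg18"

# Unguarded preferences['dbkeys'] would raise KeyError; the guard avoids it.
is_custom = ('dbkeys' in preferences) and (input_dbkey in preferences['dbkeys'])
assert is_custom is False

# With the entry present (in Galaxy it is a JSON-encoded string, so the inner
# `in` is a substring pre-check before from_json_string parses it):
preferences['dbkeys'] = '{"custom_hg18": {"fasta": 42}}'
assert ('dbkeys' in preferences) and (input_dbkey in preferences['dbkeys'])
```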

commit/galaxy-central: dan: Add no_options validator to built-in reference genome selector in GATK tools.
by Bitbucket 30 Nov '11
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/c6f70cccce28/
changeset: c6f70cccce28
user: dan
date: 2011-11-30 19:29:21
summary: Add no_options validator to built-in reference genome selector in GATK tools.
affected #: 13 files
diff -r 2ee4a6b8419b3da2b95e1600a92099fe547cd070 -r c6f70cccce287702f18c518175bec4a5e8ace23c tools/gatk/count_covariates.xml
--- a/tools/gatk/count_covariates.xml
+++ b/tools/gatk/count_covariates.xml
@@ -146,6 +146,7 @@
<options from_data_table="gatk_picard_indexes"><filter type="data_meta" key="dbkey" ref="input_bam" column="dbkey"/></options>
+ <validator type="no_options" message="A built-in reference genome is not available for the build associated with the selected input file"/></param></when><when value="history"><!-- FIX ME!!!! -->
diff -r 2ee4a6b8419b3da2b95e1600a92099fe547cd070 -r c6f70cccce287702f18c518175bec4a5e8ace23c tools/gatk/depth_of_coverage.xml
--- a/tools/gatk/depth_of_coverage.xml
+++ b/tools/gatk/depth_of_coverage.xml
@@ -194,6 +194,7 @@
<options from_data_table="gatk_picard_indexes"><!-- <filter type="data_meta" key="dbkey" ref="input_bam" column="dbkey"/> does not yet work in a repeat...--></options>
+ <validator type="no_options" message="A built-in reference genome is not available for the build associated with the selected input file"/></param></when><when value="history"><!-- FIX ME!!!! -->
diff -r 2ee4a6b8419b3da2b95e1600a92099fe547cd070 -r c6f70cccce287702f18c518175bec4a5e8ace23c tools/gatk/indel_realigner.xml
--- a/tools/gatk/indel_realigner.xml
+++ b/tools/gatk/indel_realigner.xml
@@ -124,6 +124,7 @@
<options from_data_table="gatk_picard_indexes"><filter type="data_meta" key="dbkey" ref="input_bam" column="dbkey"/></options>
+ <validator type="no_options" message="A built-in reference genome is not available for the build associated with the selected input file"/></param></when><when value="history"><!-- FIX ME!!!! -->
diff -r 2ee4a6b8419b3da2b95e1600a92099fe547cd070 -r c6f70cccce287702f18c518175bec4a5e8ace23c tools/gatk/realigner_target_creator.xml
--- a/tools/gatk/realigner_target_creator.xml
+++ b/tools/gatk/realigner_target_creator.xml
@@ -112,6 +112,7 @@
<options from_data_table="gatk_picard_indexes"><filter type="data_meta" key="dbkey" ref="input_bam" column="dbkey"/></options>
+ <validator type="no_options" message="A built-in reference genome is not available for the build associated with the selected input file"/></param></when><when value="history">
diff -r 2ee4a6b8419b3da2b95e1600a92099fe547cd070 -r c6f70cccce287702f18c518175bec4a5e8ace23c tools/gatk/table_recalibration.xml
--- a/tools/gatk/table_recalibration.xml
+++ b/tools/gatk/table_recalibration.xml
@@ -129,6 +129,7 @@
<options from_data_table="gatk_picard_indexes"><filter type="data_meta" key="dbkey" ref="input_bam" column="dbkey"/></options>
+ <validator type="no_options" message="A built-in reference genome is not available for the build associated with the selected input file"/></param></when><when value="history"><!-- FIX ME!!!! -->
diff -r 2ee4a6b8419b3da2b95e1600a92099fe547cd070 -r c6f70cccce287702f18c518175bec4a5e8ace23c tools/gatk/unified_genotyper.xml
--- a/tools/gatk/unified_genotyper.xml
+++ b/tools/gatk/unified_genotyper.xml
@@ -154,6 +154,7 @@
<options from_data_table="gatk_picard_indexes"><!-- <filter type="data_meta" key="dbkey" ref="input_bam" column="dbkey"/> does not yet work in a repeat...--></options>
+ <validator type="no_options" message="A built-in reference genome is not available for the build associated with the selected input file"/></param></when><when value="history"><!-- FIX ME!!!! -->
diff -r 2ee4a6b8419b3da2b95e1600a92099fe547cd070 -r c6f70cccce287702f18c518175bec4a5e8ace23c tools/gatk/variant_annotator.xml
--- a/tools/gatk/variant_annotator.xml
+++ b/tools/gatk/variant_annotator.xml
@@ -157,6 +157,7 @@
<options from_data_table="gatk_picard_indexes"><filter type="data_meta" key="dbkey" ref="input_variant" column="dbkey"/></options>
+ <validator type="no_options" message="A built-in reference genome is not available for the build associated with the selected input file"/></param></when><when value="history"><!-- FIX ME!!!! -->
diff -r 2ee4a6b8419b3da2b95e1600a92099fe547cd070 -r c6f70cccce287702f18c518175bec4a5e8ace23c tools/gatk/variant_apply_recalibration.xml
--- a/tools/gatk/variant_apply_recalibration.xml
+++ b/tools/gatk/variant_apply_recalibration.xml
@@ -109,6 +109,7 @@
<options from_data_table="gatk_picard_indexes"><!-- <filter type="data_meta" key="dbkey" ref="variants[0].input_variants" column="dbkey"/> --></options>
+ <validator type="no_options" message="A built-in reference genome is not available for the build associated with the selected input file"/></param></when><when value="history"><!-- FIX ME!!!! -->
diff -r 2ee4a6b8419b3da2b95e1600a92099fe547cd070 -r c6f70cccce287702f18c518175bec4a5e8ace23c tools/gatk/variant_combine.xml
--- a/tools/gatk/variant_combine.xml
+++ b/tools/gatk/variant_combine.xml
@@ -121,6 +121,7 @@
<options from_data_table="gatk_picard_indexes"><!-- <filter type="data_meta" key="dbkey" ref="input_variants.input_variant" column="dbkey"/> --></options>
+ <validator type="no_options" message="A built-in reference genome is not available for the build associated with the selected input file"/></param></when><when value="history"><!-- FIX ME!!!! -->
diff -r 2ee4a6b8419b3da2b95e1600a92099fe547cd070 -r c6f70cccce287702f18c518175bec4a5e8ace23c tools/gatk/variant_eval.xml
--- a/tools/gatk/variant_eval.xml
+++ b/tools/gatk/variant_eval.xml
@@ -160,6 +160,7 @@
<options from_data_table="gatk_picard_indexes"><!-- <filter type="data_meta" key="dbkey" ref="input_variant" column="dbkey"/> --></options>
+ <validator type="no_options" message="A built-in reference genome is not available for the build associated with the selected input file"/></param></when><when value="history"><!-- FIX ME!!!! -->
diff -r 2ee4a6b8419b3da2b95e1600a92099fe547cd070 -r c6f70cccce287702f18c518175bec4a5e8ace23c tools/gatk/variant_filtration.xml
--- a/tools/gatk/variant_filtration.xml
+++ b/tools/gatk/variant_filtration.xml
@@ -110,6 +110,7 @@
<options from_data_table="gatk_picard_indexes"><filter type="data_meta" key="dbkey" ref="input_variant" column="dbkey"/></options>
+ <validator type="no_options" message="A built-in reference genome is not available for the build associated with the selected input file"/></param></when><when value="history"><!-- FIX ME!!!! -->
diff -r 2ee4a6b8419b3da2b95e1600a92099fe547cd070 -r c6f70cccce287702f18c518175bec4a5e8ace23c tools/gatk/variant_recalibrator.xml
--- a/tools/gatk/variant_recalibrator.xml
+++ b/tools/gatk/variant_recalibrator.xml
@@ -156,6 +156,7 @@
<options from_data_table="gatk_picard_indexes"><!-- <filter type="data_meta" key="dbkey" ref="variants[0].input_variants" column="dbkey"/> --></options>
+ <validator type="no_options" message="A built-in reference genome is not available for the build associated with the selected input file"/></param></when><when value="history"><!-- FIX ME!!!! -->
diff -r 2ee4a6b8419b3da2b95e1600a92099fe547cd070 -r c6f70cccce287702f18c518175bec4a5e8ace23c tools/gatk/variants_validate.xml
--- a/tools/gatk/variants_validate.xml
+++ b/tools/gatk/variants_validate.xml
@@ -96,6 +96,7 @@
<options from_data_table="gatk_picard_indexes"><filter type="data_meta" key="dbkey" ref="input_variant" column="dbkey"/></options>
+ <validator type="no_options" message="A built-in reference genome is not available for the build associated with the selected input file"/></param></when><when value="history"><!-- FIX ME!!!! -->

commit/galaxy-central: jgoecks: Provide chromInfo correctly during conversions of datasets with custom build dbkeys.
by Bitbucket 30 Nov '11
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/2ee4a6b8419b/
changeset: 2ee4a6b8419b
user: jgoecks
date: 2011-11-30 19:06:12
summary: Provide chromInfo correctly during conversions of datasets with custom build dbkeys.
affected #: 1 file
diff -r 665853dfd9e12e137f26777d1a2d2167df4b4275 -r 2ee4a6b8419b3da2b95e1600a92099fe547cd070 lib/galaxy/tools/actions/__init__.py
--- a/lib/galaxy/tools/actions/__init__.py
+++ b/lib/galaxy/tools/actions/__init__.py
@@ -191,7 +191,16 @@
db_datasets[ "chromInfo" ] = db_dataset
incoming[ "chromInfo" ] = db_dataset.file_name
else:
- incoming[ "chromInfo" ] = os.path.join( trans.app.config.tool_data_path, 'shared','ucsc','chrom', "%s.len" % input_dbkey )
+ # For custom builds, chrom info resides in converted dataset; for built-in builds, chrom info resides in tool-data/shared.
+ if trans.user and ( input_dbkey in trans.user.preferences[ 'dbkeys' ] ):
+ # Custom build.
+ custom_build_dict = from_json_string( trans.user.preferences[ 'dbkeys' ] )[ input_dbkey ]
+ build_fasta_dataset = trans.app.model.HistoryDatasetAssociation.get( custom_build_dict[ 'fasta' ] )
+ chrom_info = build_fasta_dataset.get_converted_dataset( trans, 'len' ).file_name
+ else:
+ # Default to built-in build.
+ chrom_info = os.path.join( trans.app.config.tool_data_path, 'shared','ucsc','chrom', "%s.len" % input_dbkey )
+ incoming[ "chromInfo" ] = chrom_info
inp_data.update( db_datasets )
# Determine output dataset permission/roles list

commit/galaxy-central: richard_burhans: python version of "phyloP interspecies conservation scores" tool
by Bitbucket 30 Nov '11
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/665853dfd9e1/
changeset: 665853dfd9e1
user: richard_burhans
date: 2011-11-30 18:05:59
summary: python version of "phyloP interspecies conservation scores" tool
affected #: 3 files
diff -r 36998f5a7ddb12ee93068b8001a09b84a47944f4 -r 665853dfd9e12e137f26777d1a2d2167df4b4275 tool-data/add_scores.loc.sample
--- a/tool-data/add_scores.loc.sample
+++ b/tool-data/add_scores.loc.sample
@@ -1,21 +1,20 @@
-#This is a sample file distributed with Galaxy that enables tools to use a
-#directory of gzipped genome files for use with add_scores. You will need to
-#supply these files and then create a add_scores.loc file similar to this one
-#(store it in this directory) that points to the directories in which those
-#files are stored. The add_scores.loc file has this format (white space
-#characters are TAB characters):
+#This is a sample file distributed with Galaxy that lists the BigWig files
+#available for use with the add_scores (phyloP interspecies conservation
+#scores) tool. You will need to supply these BigWig files and then create
+#an add_scores.loc file similar to this one (store it in this directory)
+#that lists their locations. The add_scores.loc file has the following
+#format (white space characters are TAB characters):
#
-#<build><file_path>
+#<build><BigWig_file_path>
#
#So, for example, if your add_scores.loc began like this:
#
-#hg18 /galaxy/data/hg18/misc/phyloP
+#hg18 /galaxy/data/hg18/misc/phyloP44way.primate.bw
#
-#then your /galaxy/data/hg18/misc/phyloP directory would need to contain
-#the following gzipped files, among others:
+#then your /galaxy/data/hg18/misc/ directory would need to contain a
+#BigWig file named phyloP44way.primate.bw, among others:
#
-#-rw-r--r-- 1 g2data g2data 161981190 2010-03-19 12:48 chr10.phyloP44way.primate.wigFix.gz
-#-rw-r--r-- 1 g2data g2data 54091 2010-03-19 12:56 chr10_random.phyloP44way.primate.wigFix.gz
-#-rw-r--r-- 1 g2data g2data 158621990 2010-03-19 12:46 chr11.phyloP44way.primate.wigFix.gz
+#-rw-r--r-- 1 g2data g2data 6057387572 Nov 23 10:11 phyloP44way.primate.bw
#
-#hg18 /galaxy/data/hg18/misc/phyloP
+#hg18 /galaxy/data/hg18/misc/phyloP44way.primate.bw
+#hg19 /galaxy/data/hg19/misc/phyloP46way.primate.bw
diff -r 36998f5a7ddb12ee93068b8001a09b84a47944f4 -r 665853dfd9e12e137f26777d1a2d2167df4b4275 tools/evolution/add_scores.py
--- /dev/null
+++ b/tools/evolution/add_scores.py
@@ -0,0 +1,120 @@
+#!/usr/bin/env python
+
+import sys
+from galaxy import eggs
+import pkg_resources
+pkg_resources.require( "bx-python" )
+pkg_resources.require( "numpy" )
+from bx.bbi.bigwig_file import BigWigFile
+import os
+
+################################################################################
+
+def die( message ):
+ print >> sys.stderr, message
+ sys.exit(1)
+
+def open_or_die( filename, mode='r', message=None ):
+ if message is None:
+ message = 'Error opening {0}'.format( filename )
+ try:
+ fh = open( filename, mode )
+ except IOError, err:
+ die( '{0}: {1}'.format( message, err.strerror ) )
+ return fh
+
+################################################################################
+
+class LocationFile( object ):
+ def __init__( self, filename, comment_chars=None, delimiter='\t', key_column=0 ):
+ self.filename = filename
+ if comment_chars is None:
+ self.comment_chars = ( '#' )
+ else:
+ self.comment_chars = tuple( comment_chars )
+ self.delimiter = delimiter
+ self.key_column = key_column
+ self._map = {}
+ self._populate_map()
+
+ def _populate_map( self ):
+ try:
+ with open( self.filename ) as fh:
+ line_number = 0
+ for line in fh:
+ line_number += 1
+ line = line.rstrip( '\r\n' )
+ if not line.startswith( self.comment_chars ):
+ elems = line.split( self.delimiter )
+ if len( elems ) <= self.key_column:
+ die( 'Location file {0} line {1}: less than {2} columns'.format( self.filename, line_number, self.key_column + 1 ) )
+ else:
+ key = elems.pop( self.key_column )
+ if key in self._map:
+ if self._map[key] != elems:
+ die( 'Location file {0} line {1}: duplicate key "{2}"'.format( self.filename, line_number, key ) )
+ else:
+ self._map[key] = elems
+ except IOError, err:
+ die( 'Error opening location file {0}: {1}'.format( self.filename, err.strerror ) )
+
+ def get_values( self, key ):
+ if key in self._map:
+ rval = self._map[key]
+ if len( rval ) == 1:
+ return rval[0]
+ else:
+ return rval
+ else:
+ die( 'key "{0}" not found in location file {1}'.format( key, self.filename ) )
+
+################################################################################
+
+def main():
+ input_filename, output_filename, loc_filename, loc_key, chrom_col, start_col = sys.argv[1:]
+
+ # open input, output, and bigwig files
+ location_file = LocationFile( loc_filename )
+ bigwig_filename = location_file.get_values( loc_key )
+ bwfh = open_or_die( bigwig_filename, message='Error opening BigWig file {0}'.format( bigwig_filename ) )
+ bw = BigWigFile( file=bwfh )
+ ifh = open_or_die( input_filename, message='Error opening input file {0}'.format( input_filename ) )
+ ofh = open_or_die( output_filename, mode='w', message='Error opening output file {0}'.format( output_filename ) )
+
+ # make column numbers 0-based
+ chrom_col = int( chrom_col ) - 1
+ start_col = int( start_col ) - 1
+ min_cols = max( chrom_col, start_col )
+
+ # add score column to input file
+ line_number = 0
+ for line in ifh:
+ line_number += 1
+ line = line.rstrip( '\r\n' )
+ elems = line.split( '\t' )
+ if len( elems ) > min_cols:
+ chrom = elems[chrom_col].strip()
+ # base-0 position in chrom
+ start = int( elems[start_col] )
+ score_list = bw.get( chrom, start, start + 1 )
+ score_list_len = len( score_list )
+ if score_list_len == 1:
+ beg, end, score = score_list[0]
+ score_val = '{0:1.3f}'.format( score )
+ elif score_list_len == 0:
+ score_val = 'NA'
+ else:
+ die( '{0} line {1}: chrom={2}, start={3}, score_list_len = {4}'.format( input_filename, line_number, chrom, start, score_list_len ) )
+ print >> ofh, '\t'.join( [line, score_val] )
+ else:
+ print >> ofh, line
+
+ bwfh.close()
+ ifh.close()
+ ofh.close()
+
+################################################################################
+
+if __name__ == "__main__":
+ main()
+
diff -r 36998f5a7ddb12ee93068b8001a09b84a47944f4 -r 665853dfd9e12e137f26777d1a2d2167df4b4275 tools/evolution/add_scores.xml
--- a/tools/evolution/add_scores.xml
+++ b/tools/evolution/add_scores.xml
@@ -1,8 +1,8 @@
<tool id="hgv_add_scores" name="phyloP" version="1.0.0"><description>interspecies conservation scores</description>
- <command>
- add_scores $input1 ${input1.metadata.dbkey} ${input1.metadata.chromCol} ${input1.metadata.startCol} ${GALAXY_DATA_INDEX_DIR}/add_scores.loc $out_file1
+ <command interpreter="python">
+ add_scores.py "$input1" "$out_file1" "${GALAXY_DATA_INDEX_DIR}/add_scores.loc" "${input1.metadata.dbkey}" "${input1.metadata.chromCol}" "${input1.metadata.startCol}"
</command><inputs>
@@ -34,7 +34,7 @@
<help>
.. class:: warningmark
-This currently works only for build hg18.
+This currently works only for builds hg18 and hg19.
-----
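The `LocationFile` class in add_scores.py above maps the key column of a tab-delimited `.loc` file to its remaining columns. A condensed Python 3 sketch of the same parsing (omitting the duplicate-key check and the single-value collapse of the original):

```python
import os
import tempfile

def read_loc(path, key_column=0, delimiter="\t", comment_chars=("#",)):
    """Map the key column to the remaining columns, skipping comment lines."""
    mapping = {}
    with open(path) as fh:
        for line in fh:
            line = line.rstrip("\r\n")
            if line and not line.startswith(comment_chars):
                elems = line.split(delimiter)
                key = elems.pop(key_column)
                mapping[key] = elems
    return mapping

# add_scores.loc format: <build><TAB><BigWig_file_path>
with tempfile.NamedTemporaryFile("w", suffix=".loc", delete=False) as tmp:
    tmp.write("#build\tpath\nhg18\t/galaxy/data/hg18/misc/phyloP44way.primate.bw\n")
path = tmp.name
assert read_loc(path)["hg18"] == ["/galaxy/data/hg18/misc/phyloP44way.primate.bw"]
os.remove(path)
```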

commit/galaxy-central: dan: Fix for frames inside of frames issue introduced in 5db0da0007fc. Seen when require_login=True.
by Bitbucket 30 Nov '11
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/36998f5a7ddb/
changeset: 36998f5a7ddb
user: dan
date: 2011-11-30 16:20:15
summary: Fix for frames inside of frames issue introduced in 5db0da0007fc. Seen when require_login=True.
affected #: 1 file
diff -r 834de698004e1306b4424d0a8dbd3e7c1938ee9c -r 36998f5a7ddb12ee93068b8001a09b84a47944f4 lib/galaxy/web/framework/__init__.py
--- a/lib/galaxy/web/framework/__init__.py
+++ b/lib/galaxy/web/framework/__init__.py
@@ -407,6 +407,7 @@
def _ensure_logged_in_user( self, environ, session_cookie ):
# The value of session_cookie can be one of
# 'galaxysession' or 'galaxycommunitysession'
+ # Currently this method does nothing unless session_cookie is 'galaxysession'
if session_cookie == 'galaxysession':
# TODO: re-engineer to eliminate the use of allowed_paths
# as maintenance overhead is far too high.
@@ -436,8 +437,8 @@
host = None
if host in UCSC_SERVERS:
return
- if self.request.path not in allowed_paths:
- self.response.send_redirect( url_for( controller='root', action='index' ) )
+ if self.request.path not in allowed_paths:
+ self.response.send_redirect( url_for( controller='root', action='index' ) )
def __create_new_session( self, prev_galaxy_session=None, user_for_new_session=None ):
"""
Create a new GalaxySession for this request, possibly with a connection

commit/galaxy-central: greg: Revert 2 changes made in 5db0da0007fc in an attempt to correct the broken behavior of rendering Galaxy correctly when using apache proxying, possibly in addition to requiring users to login.
by Bitbucket 30 Nov '11
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/834de698004e/
changeset: 834de698004e
user: greg
date: 2011-11-30 15:16:37
summary: Revert 2 changes made in 5db0da0007fc in an attempt to correct the broken behavior of rendering Galaxy correctly when using apache proxying, possibly in addition to requiring users to login.
affected #: 2 files
diff -r de03f821784bcbe04a7514ad3d8e43fb1b635e4b -r 834de698004e1306b4424d0a8dbd3e7c1938ee9c lib/galaxy/web/controllers/user.py
--- a/lib/galaxy/web/controllers/user.py
+++ b/lib/galaxy/web/controllers/user.py
@@ -358,7 +358,7 @@
else:
refresh_frames = [ 'masthead', 'history' ]
message, status, user, success = self.__validate_login( trans, webapp, **kwd )
- if success and referer and not referer.startswith( url_for( trans.request.base + url_for( controller='user', action='logout' ) ) ):
+ if success and referer and not referer.startswith( trans.request.base + url_for( controller='user', action='logout' ) ):
redirect_url = referer
elif success:
redirect_url = url_for( '/' )
diff -r de03f821784bcbe04a7514ad3d8e43fb1b635e4b -r 834de698004e1306b4424d0a8dbd3e7c1938ee9c templates/webapps/galaxy/base_panels.mako
--- a/templates/webapps/galaxy/base_panels.mako
+++ b/templates/webapps/galaxy/base_panels.mako
@@ -142,7 +142,11 @@
menu_options.append( [ _('Preferences'), h.url_for( controller='/user', action='index', cntrller='user', webapp='galaxy' ), "galaxy_main" ] )
if app.config.get_bool( 'enable_tracks', False ):
menu_options.append( [ 'Custom Builds', h.url_for( controller='/user', action='dbkeys' ), "galaxy_main" ] )
- menu_options.append( [ 'Logout', h.url_for( controller='/user', action='logout', webapp='galaxy' ), "_top" ] )
+ if app.config.require_login:
+ logout_url = h.url_for( controller='/root', action='index', m_c='user', m_a='logout', webapp='galaxy' )
+ else:
+ logout_url = h.url_for( controller='/user', action='logout', webapp='galaxy' )
+ menu_options.append( [ 'Logout', logout_url, "_top" ] )
menu_options.append( None )
menu_options.append( [ _('Saved Histories'), h.url_for( controller='/history', action='list' ), "galaxy_main" ] )
menu_options.append( [ _('Saved Datasets'), h.url_for( controller='/dataset', action='list' ), "galaxy_main" ] )