galaxy-commits
April 2013
- 1 participant
- 197 discussions
commit/galaxy-central: dannon: Close branch from Bjorn's error-message-fix pull request
by commits-noreply@bitbucket.org 09 Apr '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/4e08f4804b75/
Changeset: 4e08f4804b75
Branch: error-message-fix
User: dannon
Date: 2013-04-09 19:25:03
Summary: Close branch from Bjorn's error-message-fix pull request
Affected #: 0 files
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: dannon: Add Parsley egg prior to merging search API.
by commits-noreply@bitbucket.org 09 Apr '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/c9cbd395ed49/
Changeset: c9cbd395ed49
User: dannon
Date: 2013-04-09 19:20:29
Summary: Add Parsley egg prior to merging search API.
Affected #: 1 file
diff -r af8a76870774ee664331dc018c4ad85cfe06b871 -r c9cbd395ed49dc8e4e211648c96ed1dcb5530aa4 eggs.ini
--- a/eggs.ini
+++ b/eggs.ini
@@ -45,6 +45,7 @@
nose = 0.11.1
NoseHTML = 0.4.1
NoseTestDiff = 0.1
+Parsley = 1.1
Paste = 1.7.5.1
PasteDeploy = 1.5.0
pexpect = 2.4
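For context, Parsley is an OMeta/PEG-style parser generator for Python, presumably what the search API being merged next relies on. A minimal, hedged sketch of how a Parsley grammar is used (a toy grammar, nothing from Galaxy's code):

import parsley

# Build a tiny grammar: one rule that matches digits and converts them to an int.
# (This mirrors the library's introductory example, not Galaxy's search grammar.)
number_grammar = parsley.makeGrammar("""
number = <digit+>:ds -> int(ds)
""", {})

assert number_grammar("42").number() == 42   # parse the string "42" into the integer 42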
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: dan: Change database logging of stderr and stdout to take text from start and end of the string (instead of just the start) when the size exceeds the set character limit (32k).
by commits-noreply@bitbucket.org 09 Apr '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/af8a76870774/
Changeset: af8a76870774
User: dan
Date: 2013-04-09 19:00:47
Summary: Change database logging of stderr and stdout to take text from start and end of the string (instead of just the start) when the size exceeds the set character limit (32k).
Affected #: 2 files
diff -r 2cfc5c8223ef102c320472cb84f197ca28a064d6 -r af8a76870774ee664331dc018c4ad85cfe06b871 lib/galaxy/jobs/__init__.py
--- a/lib/galaxy/jobs/__init__.py
+++ b/lib/galaxy/jobs/__init__.py
@@ -34,6 +34,9 @@
# and should eventually become API'd
TOOL_PROVIDED_JOB_METADATA_FILE = 'galaxy.json'
+DATABASE_MAX_STRING_SIZE = 32768
+DATABASE_MAX_STRING_SIZE_PRETTY = '32K'
+
class Sleeper( object ):
"""
Provides a 'sleep' method that sleeps for a number of seconds *unless*
@@ -774,13 +777,13 @@
job.info = message
# TODO: Put setting the stdout, stderr, and exit code in one place
# (not duplicated with the finish method).
- if ( len( stdout ) > 32768 ):
- stdout = stdout[:32768]
- log.info( "stdout for job %d is greater than 32K, only first part will be logged to database" % job.id )
+ if ( len( stdout ) > DATABASE_MAX_STRING_SIZE ):
+ stdout = util.shrink_string_by_size( stdout, DATABASE_MAX_STRING_SIZE, join_by="\n..\n", left_larger=True, beginning_on_size_error=True )
+ log.info( "stdout for job %d is greater than %s, only a portion will be logged to database" % ( job.id, DATABASE_MAX_STRING_SIZE_PRETTY ) )
job.stdout = stdout
- if ( len( stderr ) > 32768 ):
- stderr = stderr[:32768]
- log.info( "stderr for job %d is greater than 32K, only first part will be logged to database" % job.id )
+ if ( len( stderr ) > DATABASE_MAX_STRING_SIZE ):
+ stderr = util.shrink_string_by_size( stderr, DATABASE_MAX_STRING_SIZE, join_by="\n..\n", left_larger=True, beginning_on_size_error=True )
+ log.info( "stderr for job %d is greater than %s, only a portion will be logged to database" % ( job.id, DATABASE_MAX_STRING_SIZE_PRETTY ) )
job.stderr = stderr
# Let the exit code be Null if one is not provided:
if ( exit_code != None ):
@@ -998,12 +1001,12 @@
# will now be seen by the user.
self.sa_session.flush()
# Save stdout and stderr
- if len( job.stdout ) > 32768:
- log.info( "stdout for job %d is greater than 32K, only first part will be logged to database" % job.id )
- job.stdout = job.stdout[:32768]
- if len( job.stderr ) > 32768:
- log.info( "stderr for job %d is greater than 32K, only first part will be logged to database" % job.id )
- job.stderr = job.stderr[:32768]
+ if len( job.stdout ) > DATABASE_MAX_STRING_SIZE:
+ log.info( "stdout for job %d is greater than %s, only a portion will be logged to database" % ( job.id, DATABASE_MAX_STRING_SIZE_PRETTY ) )
+ job.stdout = util.shrink_string_by_size( job.stdout, DATABASE_MAX_STRING_SIZE, join_by="\n..\n", left_larger=True, beginning_on_size_error=True )
+ if len( job.stderr ) > DATABASE_MAX_STRING_SIZE:
+ log.info( "stderr for job %d is greater than %s, only a portion will be logged to database" % ( job.id, DATABASE_MAX_STRING_SIZE_PRETTY ) )
+ job.stderr = util.shrink_string_by_size( job.stderr, DATABASE_MAX_STRING_SIZE, join_by="\n..\n", left_larger=True, beginning_on_size_error=True )
# The exit code will be null if there is no exit code to be set.
# This is so that we don't assign an exit code, such as 0, that
# is either incorrect or has the wrong semantics.
@@ -1652,12 +1655,12 @@
task.state = task.states.ERROR
# Save stdout and stderr
- if len( stdout ) > 32768:
- log.error( "stdout for task %d is greater than 32K, only first part will be logged to database" % task.id )
- task.stdout = stdout[:32768]
- if len( stderr ) > 32768:
- log.error( "stderr for job %d is greater than 32K, only first part will be logged to database" % task.id )
- task.stderr = stderr[:32768]
+ if len( stdout ) > DATABASE_MAX_STRING_SIZE:
+ log.error( "stdout for task %d is greater than %s, only a portion will be logged to database" % ( task.id, DATABASE_MAX_STRING_SIZE_PRETTY ) )
+ task.stdout = util.shrink_string_by_size( stdout, DATABASE_MAX_STRING_SIZE, join_by="\n..\n", left_larger=True, beginning_on_size_error=True )
+ if len( stderr ) > DATABASE_MAX_STRING_SIZE:
+ log.error( "stderr for task %d is greater than %s, only a portion will be logged to database" % ( task.id, DATABASE_MAX_STRING_SIZE_PRETTY ) )
+ task.stderr = util.shrink_string_by_size( stderr, DATABASE_MAX_STRING_SIZE, join_by="\n..\n", left_larger=True, beginning_on_size_error=True )
task.exit_code = tool_exit_code
task.command_line = self.command_line
self.sa_session.flush()
diff -r 2cfc5c8223ef102c320472cb84f197ca28a064d6 -r af8a76870774ee664331dc018c4ad85cfe06b871 lib/galaxy/util/__init__.py
--- a/lib/galaxy/util/__init__.py
+++ b/lib/galaxy/util/__init__.py
@@ -152,6 +152,25 @@
elem.tail = i + pad
return elem
+def shrink_string_by_size( value, size, join_by="..", left_larger=True, beginning_on_size_error=False, end_on_size_error=False ):
+ if len( value ) > size:
+ len_join_by = len( join_by )
+ min_size = len_join_by + 2
+ if size < min_size:
+ if beginning_on_size_error:
+ return value[:size]
+ elif end_on_size_error:
+ return value[-size:]
+ raise ValueError( 'With the provided join_by value (%s), the minimum size value is %i.' % ( join_by, min_size ) )
+ left_index = right_index = int( ( size - len_join_by ) / 2 )
+ if left_index + right_index + len_join_by < size:
+ if left_larger:
+ left_index += 1
+ else:
+ right_index += 1
+ value = "%s%s%s" % ( value[:left_index], join_by, value[-right_index:] )
+ return value
+
# characters that are valid
valid_chars = set(string.letters + string.digits + " -=_.()/+*^,:?!")
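For readers skimming the patch, here is the new helper pulled out as a standalone, runnable sketch; the function body and the 32K constant come from the diff above, while the sample stdout string is made up:

DATABASE_MAX_STRING_SIZE = 32768   # character limit used for job stdout/stderr in the diff

def shrink_string_by_size( value, size, join_by="..", left_larger=True,
                           beginning_on_size_error=False, end_on_size_error=False ):
    # Trim value to at most `size` characters, keeping text from both ends
    # and joining the two halves with `join_by`.
    if len( value ) > size:
        len_join_by = len( join_by )
        min_size = len_join_by + 2
        if size < min_size:
            if beginning_on_size_error:
                return value[:size]
            elif end_on_size_error:
                return value[-size:]
            raise ValueError( 'With the provided join_by value (%s), the minimum size value is %i.' % ( join_by, min_size ) )
        left_index = right_index = int( ( size - len_join_by ) / 2 )
        if left_index + right_index + len_join_by < size:
            if left_larger:
                left_index += 1
            else:
                right_index += 1
        value = "%s%s%s" % ( value[:left_index], join_by, value[-right_index:] )
    return value

stdout = "A" * 20000 + "B" * 20000   # fake 40,000-character job output
trimmed = shrink_string_by_size( stdout, DATABASE_MAX_STRING_SIZE, join_by="\n..\n",
                                 left_larger=True, beginning_on_size_error=True )
assert len( trimmed ) <= DATABASE_MAX_STRING_SIZE
assert trimmed.startswith( "A" ) and trimmed.endswith( "B" )   # both ends survive, the middle is dropped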
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: dan: Fix for generating stderr/stdout links in hda show_params.
by commits-noreply@bitbucket.org 09 Apr '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/2cfc5c8223ef/
Changeset: 2cfc5c8223ef
User: dan
Date: 2013-04-09 18:26:05
Summary: Fix for generating stderr/stdout links in hda show_params.
Affected #: 1 file
diff -r 95bf71620d50c6d81248eef2001a7dc156ae1088 -r 2cfc5c8223ef102c320472cb84f197ca28a064d6 templates/show_params.mako
--- a/templates/show_params.mako
+++ b/templates/show_params.mako
@@ -105,6 +105,9 @@
</th></tr></thead><tbody>
+ <%
+ encoded_hda_id = trans.security.encode_id( hda.id )
+ %>
<tr><td>Name:</td><td>${hda.name | h}</td></tr>
<tr><td>Created:</td><td>${hda.create_time.strftime("%b %d, %Y")}</td></tr>
## <tr><td>Copied from another history?</td><td>${hda.source_library_dataset}</td></tr>
@@ -113,10 +116,10 @@
<tr><td>Format:</td><td>${hda.ext | h}</td></tr>
<tr><td>Galaxy Tool Version:</td><td>${job.tool_version | h}</td></tr>
<tr><td>Tool Version:</td><td>${hda.tool_version | h}</td></tr>
- <tr><td>Tool Standard Output:</td><td><a href="${h.url_for( controller='dataset', action='stdout')}">stdout</a></td></tr>
- <tr><td>Tool Standard Error:</td><td><a href="${h.url_for( controller='dataset', action='stderr')}">stderr</a></td></tr>
+ <tr><td>Tool Standard Output:</td><td><a href="${h.url_for( controller='dataset', action='stdout', dataset_id=encoded_hda_id )}">stdout</a></td></tr>
+ <tr><td>Tool Standard Error:</td><td><a href="${h.url_for( controller='dataset', action='stderr', dataset_id=encoded_hda_id )}">stderr</a></td></tr>
<tr><td>Tool Exit Code:</td><td>${job.exit_code | h}</td></tr>
- <tr><td>API ID:</td><td>${trans.security.encode_id(hda.id)}</td></tr>
+ <tr><td>API ID:</td><td>${encoded_hda_id}</td></tr>
%if trans.user_is_admin() or trans.app.config.expose_dataset_path:
<tr><td>Full Path:</td><td>${hda.file_name | h}</td></tr>
%endif
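Editorial aside: the reason the old links were broken is that h.url_for was never told which dataset the stdout/stderr belonged to. The toy sketch below is illustrative only; the URL shapes and the encoded id are assumptions, not Galaxy's actual routing:

# Hypothetical stand-in for h.url_for, just to show the shape of the generated links.
def url_for( controller, action, **kwargs ):
    query = "&".join( "%s=%s" % item for item in sorted( kwargs.items() ) )
    return "/%s/%s%s" % ( controller, action, "?" + query if query else "" )

encoded_hda_id = "f2db41e1fa331b3e"   # made-up encoded HDA id

print( url_for( "dataset", "stdout" ) )                               # before the fix: no dataset id in the link
print( url_for( "dataset", "stdout", dataset_id=encoded_hda_id ) )    # after the fix: the dataset id is carried along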
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: james_taylor: Require Python 2.6+ and remove various compatibility patches for 2.4 and 2.5
by commits-noreply@bitbucket.org 09 Apr '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/95bf71620d50/
Changeset: 95bf71620d50
User: james_taylor
Date: 2013-04-09 17:57:38
Summary: Require Python 2.6+ and remove various compatibility patches for 2.4 and 2.5
Affected #: 20 files
diff -r 24c143157b7acad82501bf8655b587034963e4b2 -r 95bf71620d50c6d81248eef2001a7dc156ae1088 eggs.ini
--- a/eggs.ini
+++ b/eggs.ini
@@ -14,7 +14,6 @@
[eggs:platform]
bx_python = 0.7.1
Cheetah = 2.2.2
-ctypes = 1.0.2
DRMAA_python = 0.2
MarkupSafe = 0.12
mercurial = 2.2.3
diff -r 24c143157b7acad82501bf8655b587034963e4b2 -r 95bf71620d50c6d81248eef2001a7dc156ae1088 lib/fpconst.py
--- a/lib/fpconst.py
+++ /dev/null
@@ -1,163 +0,0 @@
-"""Utilities for handling IEEE 754 floating point special values
-
-This python module implements constants and functions for working with
-IEEE754 double-precision special values. It provides constants for
-Not-a-Number (NaN), Positive Infinity (PosInf), and Negative Infinity
-(NegInf), as well as functions to test for these values.
-
-The code is implemented in pure python by taking advantage of the
-'struct' standard module. Care has been taken to generate proper
-results on both big-endian and little-endian machines. Some efficiency
-could be gained by translating the core routines into C.
-
-See <http://babbage.cs.qc.edu/courses/cs341/IEEE-754references.html>
-for reference material on the IEEE 754 floating point standard.
-
-Further information on this package is available at
-<http://www.analytics.washington.edu/statcomp/projects/rzope/fpconst/>.
-
-Author: Gregory R. Warnes <gregory_r_warnes(a)groton.pfizer.com>
-Date:: 2003-04-08
-Copyright: (c) 2003, Pfizer, Inc.
-"""
-
-__version__ = "0.7.0"
-ident = "$Id: fpconst.py,v 1.12 2004/05/22 04:38:17 warnes Exp $"
-
-import struct, operator
-
-# check endianess
-_big_endian = struct.pack('i',1)[0] != '\x01'
-
-# and define appropriate constants
-if(_big_endian):
- NaN = struct.unpack('d', '\x7F\xF8\x00\x00\x00\x00\x00\x00')[0]
- PosInf = struct.unpack('d', '\x7F\xF0\x00\x00\x00\x00\x00\x00')[0]
- NegInf = -PosInf
-else:
- NaN = struct.unpack('d', '\x00\x00\x00\x00\x00\x00\xf8\xff')[0]
- PosInf = struct.unpack('d', '\x00\x00\x00\x00\x00\x00\xf0\x7f')[0]
- NegInf = -PosInf
-
-def _double_as_bytes(dval):
- "Use struct.unpack to decode a double precision float into eight bytes"
- tmp = list(struct.unpack('8B',struct.pack('d', dval)))
- if not _big_endian:
- tmp.reverse()
- return tmp
-
-##
-## Functions to extract components of the IEEE 754 floating point format
-##
-
-def _sign(dval):
- "Extract the sign bit from a double-precision floating point value"
- bb = _double_as_bytes(dval)
- return bb[0] >> 7 & 0x01
-
-def _exponent(dval):
- """Extract the exponentent bits from a double-precision floating
- point value.
-
- Note that for normalized values, the exponent bits have an offset
- of 1023. As a consequence, the actual exponentent is obtained
- by subtracting 1023 from the value returned by this function
- """
- bb = _double_as_bytes(dval)
- return (bb[0] << 4 | bb[1] >> 4) & 0x7ff
-
-def _mantissa(dval):
- """Extract the _mantissa bits from a double-precision floating
- point value."""
-
- bb = _double_as_bytes(dval)
- mantissa = bb[1] & 0x0f << 48
- mantissa += bb[2] << 40
- mantissa += bb[3] << 32
- mantissa += bb[4]
- return mantissa
-
-def _zero_mantissa(dval):
- """Determine whether the mantissa bits of the given double are all
- zero."""
- bb = _double_as_bytes(dval)
- return ((bb[1] & 0x0f) | reduce(operator.or_, bb[2:])) == 0
-
-##
-## Functions to test for IEEE 754 special values
-##
-
-def isNaN(value):
- "Determine if the argument is a IEEE 754 NaN (Not a Number) value."
- return (_exponent(value)==0x7ff and not _zero_mantissa(value))
-
-def isInf(value):
- """Determine if the argument is an infinite IEEE 754 value (positive
- or negative inifinity)"""
- return (_exponent(value)==0x7ff and _zero_mantissa(value))
-
-def isFinite(value):
- """Determine if the argument is an finite IEEE 754 value (i.e., is
- not NaN, positive or negative inifinity)"""
- return (_exponent(value)!=0x7ff)
-
-def isPosInf(value):
- "Determine if the argument is a IEEE 754 positive infinity value"
- return (_sign(value)==0 and _exponent(value)==0x7ff and \
- _zero_mantissa(value))
-
-def isNegInf(value):
- "Determine if the argument is a IEEE 754 negative infinity value"
- return (_sign(value)==1 and _exponent(value)==0x7ff and \
- _zero_mantissa(value))
-
-##
-## Functions to test public functions.
-##
-
-def test_isNaN():
- assert( not isNaN(PosInf) )
- assert( not isNaN(NegInf) )
- assert( isNaN(NaN ) )
- assert( not isNaN( 1.0) )
- assert( not isNaN( -1.0) )
-
-def test_isInf():
- assert( isInf(PosInf) )
- assert( isInf(NegInf) )
- assert( not isInf(NaN ) )
- assert( not isInf( 1.0) )
- assert( not isInf( -1.0) )
-
-def test_isFinite():
- assert( not isFinite(PosInf) )
- assert( not isFinite(NegInf) )
- assert( not isFinite(NaN ) )
- assert( isFinite( 1.0) )
- assert( isFinite( -1.0) )
-
-def test_isPosInf():
- assert( isPosInf(PosInf) )
- assert( not isPosInf(NegInf) )
- assert( not isPosInf(NaN ) )
- assert( not isPosInf( 1.0) )
- assert( not isPosInf( -1.0) )
-
-def test_isNegInf():
- assert( not isNegInf(PosInf) )
- assert( isNegInf(NegInf) )
- assert( not isNegInf(NaN ) )
- assert( not isNegInf( 1.0) )
- assert( not isNegInf( -1.0) )
-
-# overall test
-def test():
- test_isNaN()
- test_isInf()
- test_isFinite()
- test_isPosInf()
- test_isNegInf()
-
-if __name__ == "__main__":
- test()
-
diff -r 24c143157b7acad82501bf8655b587034963e4b2 -r 95bf71620d50c6d81248eef2001a7dc156ae1088 lib/galaxy/__init__.py
--- a/lib/galaxy/__init__.py
+++ b/lib/galaxy/__init__.py
@@ -95,10 +95,15 @@
pkg_resources.Distribution._insert_on = pkg_resources.Distribution.insert_on
pkg_resources.Distribution.insert_on = _insert_on
-# patch to add the NullHandler class to logging
-if sys.version_info[:2] < ( 2, 7 ):
- import logging
+# compat: BadZipFile introduced in Python 2.7
+import zipfile
+if not hasattr( zipfile, 'BadZipFile' ):
+ zipfile.BadZipFile = zipfile.error
+
+# compat: patch to add the NullHandler class to logging
+import logging
+if not hasattr( logging, 'NullHandler' ):
class NullHandler( logging.Handler ):
def emit( self, record ):
pass
- logging.NullHandler = NullHandler
+ logging.NullHandler = NullHandler
\ No newline at end of file
diff -r 24c143157b7acad82501bf8655b587034963e4b2 -r 95bf71620d50c6d81248eef2001a7dc156ae1088 lib/galaxy/datatypes/data.py
--- a/lib/galaxy/datatypes/data.py
+++ b/lib/galaxy/datatypes/data.py
@@ -17,12 +17,6 @@
eggs.require( "Paste" )
import paste
-
-if sys.version_info[:2] < ( 2, 6 ):
- zipfile.BadZipFile = zipfile.error
-if sys.version_info[:2] < ( 2, 5 ):
- zipfile.LargeZipFile = zipfile.error
-
log = logging.getLogger(__name__)
tmpd = tempfile.mkdtemp()
diff -r 24c143157b7acad82501bf8655b587034963e4b2 -r 95bf71620d50c6d81248eef2001a7dc156ae1088 lib/galaxy/eggs/__init__.py
--- a/lib/galaxy/eggs/__init__.py
+++ b/lib/galaxy/eggs/__init__.py
@@ -387,7 +387,6 @@
"guppy": lambda: self.config.get( "app:main", "use_memdump" ),
"python_openid": lambda: self.config.get( "app:main", "enable_openid" ),
"python_daemon": lambda: sys.version_info[:2] >= ( 2, 5 ),
- "ctypes": lambda: ( "drmaa" in self.config.get( "app:main", "start_job_runners" ).split(",") ) and sys.version_info[:2] == ( 2, 4 ),
"pysam": lambda: check_pysam()
}.get( egg_name, lambda: True )()
except:
diff -r 24c143157b7acad82501bf8655b587034963e4b2 -r 95bf71620d50c6d81248eef2001a7dc156ae1088 lib/galaxy/jobs/deferred/genome_transfer.py
--- a/lib/galaxy/jobs/deferred/genome_transfer.py
+++ b/lib/galaxy/jobs/deferred/genome_transfer.py
@@ -165,28 +165,18 @@
for name in z.namelist():
if name.endswith('/'):
continue
- if sys.version_info[:2] >= ( 2, 6 ):
- zipped_file = z.open( name )
- while 1:
- try:
- chunk = zipped_file.read( CHUNK_SIZE )
- except IOError:
- os.close( fd )
- log.error( 'Problem decompressing zipped data' )
- return self.app.model.DeferredJob.states.INVALID
- if not chunk:
- break
- os.write( fd, chunk )
- zipped_file.close()
- else:
+ zipped_file = z.open( name )
+ while 1:
try:
- outfile = open( fd, 'wb' )
- outfile.write( z.read( name ) )
- outfile.close()
+ chunk = zipped_file.read( CHUNK_SIZE )
except IOError:
os.close( fd )
log.error( 'Problem decompressing zipped data' )
- return
+ return self.app.model.DeferredJob.states.INVALID
+ if not chunk:
+ break
+ os.write( fd, chunk )
+ zipped_file.close()
os.close( fd )
z.close()
elif data_type == 'fasta':
diff -r 24c143157b7acad82501bf8655b587034963e4b2 -r 95bf71620d50c6d81248eef2001a7dc156ae1088 lib/galaxy/jobs/runners/drmaa.py
--- a/lib/galaxy/jobs/runners/drmaa.py
+++ b/lib/galaxy/jobs/runners/drmaa.py
@@ -15,8 +15,6 @@
from galaxy.jobs import JobDestination
from galaxy.jobs.runners import AsynchronousJobState, AsynchronousJobRunner
-if sys.version_info[:2] == ( 2, 4 ):
- eggs.require( "ctypes" )
eggs.require( "drmaa" )
# We foolishly named this file the same as the name exported by the drmaa
# library... 'import drmaa' imports itself.
diff -r 24c143157b7acad82501bf8655b587034963e4b2 -r 95bf71620d50c6d81248eef2001a7dc156ae1088 lib/galaxy/objectstore/__init__.py
--- a/lib/galaxy/objectstore/__init__.py
+++ b/lib/galaxy/objectstore/__init__.py
@@ -21,13 +21,12 @@
from sqlalchemy.orm import object_session
-if sys.version_info >= (2, 6):
- import multiprocessing
- from galaxy.objectstore.s3_multipart_upload import multipart_upload
- import boto
- from boto.s3.key import Key
- from boto.s3.connection import S3Connection
- from boto.exception import S3ResponseError
+import multiprocessing
+from galaxy.objectstore.s3_multipart_upload import multipart_upload
+import boto
+from boto.s3.key import Key
+from boto.s3.connection import S3Connection
+from boto.exception import S3ResponseError
log = logging.getLogger( __name__ )
logging.getLogger('boto').setLevel(logging.INFO) # Otherwise boto is quite noisy
@@ -381,7 +380,6 @@
Galaxy and S3.
"""
def __init__(self, config):
- assert sys.version_info >= (2, 6), 'S3 Object Store support requires Python >= 2.6'
super(S3ObjectStore, self).__init__()
self.config = config
self.staging_path = self.config.file_path
diff -r 24c143157b7acad82501bf8655b587034963e4b2 -r 95bf71620d50c6d81248eef2001a7dc156ae1088 lib/galaxy/objectstore/s3_multipart_upload.py
--- a/lib/galaxy/objectstore/s3_multipart_upload.py
+++ b/lib/galaxy/objectstore/s3_multipart_upload.py
@@ -13,10 +13,8 @@
import contextlib
import functools
-if sys.version_info >= (2, 6):
- # this is just to prevent unit tests from failing
- import multiprocessing
- from multiprocessing.pool import IMapIterator
+import multiprocessing
+from multiprocessing.pool import IMapIterator
from galaxy import eggs
eggs.require('boto')
diff -r 24c143157b7acad82501bf8655b587034963e4b2 -r 95bf71620d50c6d81248eef2001a7dc156ae1088 lib/galaxy/tools/__init__.py
--- a/lib/galaxy/tools/__init__.py
+++ b/lib/galaxy/tools/__init__.py
@@ -15,6 +15,8 @@
import types
import urllib
+from math import isinf
+
from galaxy import eggs
eggs.require( "simplejson" )
eggs.require( "MarkupSafe" ) #MarkupSafe must load before mako
@@ -46,7 +48,7 @@
from galaxy.tools.parameters.output import ToolOutputActionGroup
from galaxy.tools.parameters.validation import LateValidationError
from galaxy.tools.test import ToolTestBuilder
-from galaxy.util import isinf, listify, parse_xml, rst_to_html, string_as_bool, string_to_object, xml_text, xml_to_string
+from galaxy.util import listify, parse_xml, rst_to_html, string_as_bool, string_to_object, xml_text, xml_to_string
from galaxy.util.bunch import Bunch
from galaxy.util.expressions import ExpressionContext
from galaxy.util.hash_util import hmac_new
diff -r 24c143157b7acad82501bf8655b587034963e4b2 -r 95bf71620d50c6d81248eef2001a7dc156ae1088 lib/galaxy/util/__init__.py
--- a/lib/galaxy/util/__init__.py
+++ b/lib/galaxy/util/__init__.py
@@ -5,24 +5,8 @@
import logging, threading, random, string, re, binascii, pickle, time, datetime, math, re, os, sys, tempfile, stat, grp, smtplib, errno, shutil
from email.MIMEText import MIMEText
-# Older py compatibility
-try:
- set()
-except:
- from sets import Set as set
-
-try:
- from hashlib import md5
-except ImportError:
- from md5 import new as md5
-
-try:
- from math import isinf
-except ImportError:
- INF = float( 'inf' )
- NEG_INF = -INF
- ISINF_LIST = [ INF, NEG_INF ]
- isinf = lambda x: x in ISINF_LIST
+from os.path import relpath
+from hashlib import md5
from galaxy import eggs
import pkg_resources
@@ -543,41 +527,6 @@
print "ERROR: Unable to read builds for site file %s" %filename
return build_sites
-def relpath( path, start = None ):
- """Return a relative version of a path"""
- #modified from python 2.6.1 source code
-
- #version 2.6+ has it built in, we'll use the 'official' copy
- if sys.version_info[:2] >= ( 2, 6 ):
- if start is not None:
- return os.path.relpath( path, start )
- return os.path.relpath( path )
-
- #we need to initialize some local parameters
- curdir = os.curdir
- pardir = os.pardir
- sep = os.sep
- commonprefix = os.path.commonprefix
- join = os.path.join
- if start is None:
- start = curdir
-
- #below is the unedited (but formated) relpath() from posixpath.py of 2.6.1
- #this will likely not function properly on non-posix systems, i.e. windows
- if not path:
- raise ValueError( "no path specified" )
-
- start_list = os.path.abspath( start ).split( sep )
- path_list = os.path.abspath( path ).split( sep )
-
- # Work out how much of the filepath is shared by start and path.
- i = len( commonprefix( [ start_list, path_list ] ) )
-
- rel_list = [ pardir ] * ( len( start_list )- i ) + path_list[ i: ]
- if not rel_list:
- return curdir
- return join( *rel_list )
-
def relativize_symlinks( path, start=None, followlinks=False):
for root, dirs, files in os.walk( path, followlinks=followlinks ):
rel_start = None
diff -r 24c143157b7acad82501bf8655b587034963e4b2 -r 95bf71620d50c6d81248eef2001a7dc156ae1088 lib/galaxy/util/pastescript/serve.py
--- a/lib/galaxy/util/pastescript/serve.py
+++ b/lib/galaxy/util/pastescript/serve.py
@@ -101,13 +101,6 @@
# (c) 2005 Ian Bicking and contributors; written for Paste (http://pythonpaste.org)
# Licensed under the MIT license: http://www.opensource.org/licenses/mit-license.php
-# if sys.version_info >= (2, 6):
-# from logging.config import fileConfig
-# else:
-# # Use our custom fileConfig -- 2.5.1's with a custom Formatter class
-# # and less strict whitespace (which were incorporated into 2.6's)
-# from paste.script.util.logging_config import fileConfig
-
class BadCommand(Exception):
def __init__(self, message, exit_code=2):
diff -r 24c143157b7acad82501bf8655b587034963e4b2 -r 95bf71620d50c6d81248eef2001a7dc156ae1088 lib/galaxy/visualization/data_providers/genome.py
--- a/lib/galaxy/visualization/data_providers/genome.py
+++ b/lib/galaxy/visualization/data_providers/genome.py
@@ -6,8 +6,6 @@
from math import ceil, log
import pkg_resources
pkg_resources.require( "bx-python" )
-if sys.version_info[:2] == (2, 4):
- pkg_resources.require( "ctypes" )
pkg_resources.require( "pysam" )
pkg_resources.require( "numpy" )
import numpy
diff -r 24c143157b7acad82501bf8655b587034963e4b2 -r 95bf71620d50c6d81248eef2001a7dc156ae1088 lib/galaxy/webapps/galaxy/buildapp.py
--- a/lib/galaxy/webapps/galaxy/buildapp.py
+++ b/lib/galaxy/webapps/galaxy/buildapp.py
@@ -285,12 +285,7 @@
log.debug( "Enabling 'eval exceptions' middleware" )
else:
# Not in interactive debug mode, just use the regular error middleware
- if sys.version_info[:2] >= ( 2, 6 ):
- warnings.filterwarnings( 'ignore', '.*', DeprecationWarning, '.*serial_number_generator', 11, True )
- import galaxy.web.framework.middleware.error
- warnings.filters.pop()
- else:
- import galaxy.web.framework.middleware.error
+ import galaxy.web.framework.middleware.error
app = galaxy.web.framework.middleware.error.ErrorMiddleware( app, conf )
log.debug( "Enabling 'error' middleware" )
# Transaction logging (apache access.log style)
diff -r 24c143157b7acad82501bf8655b587034963e4b2 -r 95bf71620d50c6d81248eef2001a7dc156ae1088 lib/galaxy/webapps/galaxy/controllers/dataset.py
--- a/lib/galaxy/webapps/galaxy/controllers/dataset.py
+++ b/lib/galaxy/webapps/galaxy/controllers/dataset.py
@@ -20,11 +20,6 @@
pkg_resources.require( "Paste" )
import paste.httpexceptions
-if sys.version_info[:2] < ( 2, 6 ):
- zipfile.BadZipFile = zipfile.error
-if sys.version_info[:2] < ( 2, 5 ):
- zipfile.LargeZipFile = zipfile.error
-
tmpd = tempfile.mkdtemp()
comptypes=[]
ziptype = '32'
diff -r 24c143157b7acad82501bf8655b587034963e4b2 -r 95bf71620d50c6d81248eef2001a7dc156ae1088 lib/galaxy/webapps/galaxy/controllers/library_common.py
--- a/lib/galaxy/webapps/galaxy/controllers/library_common.py
+++ b/lib/galaxy/webapps/galaxy/controllers/library_common.py
@@ -27,11 +27,6 @@
whoosh_search_enabled = False
schema = None
-if sys.version_info[:2] < ( 2, 6 ):
- zipfile.BadZipFile = zipfile.error
-if sys.version_info[:2] < ( 2, 5 ):
- zipfile.LargeZipFile = zipfile.error
-
log = logging.getLogger( __name__ )
# Test for available compression types
diff -r 24c143157b7acad82501bf8655b587034963e4b2 -r 95bf71620d50c6d81248eef2001a7dc156ae1088 lib/tool_shed/tool_shed_registry.py
--- a/lib/tool_shed/tool_shed_registry.py
+++ b/lib/tool_shed/tool_shed_registry.py
@@ -4,12 +4,7 @@
log = logging.getLogger( __name__ )
-if sys.version_info[:2] == ( 2, 4 ):
- from galaxy import eggs
- eggs.require( 'ElementTree' )
- from elementtree import ElementTree
-else:
- from xml.etree import ElementTree
+from xml.etree import ElementTree
class Registry( object ):
def __init__( self, root_dir=None, config=None ):
diff -r 24c143157b7acad82501bf8655b587034963e4b2 -r 95bf71620d50c6d81248eef2001a7dc156ae1088 scripts/check_python.py
--- a/scripts/check_python.py
+++ b/scripts/check_python.py
@@ -1,19 +1,19 @@
"""
-If the current installed python version is not 2.5 to 2.7, prints an error
+If the current installed python version is not 2.6 to 2.7, prints an error
message to stderr and returns 1
"""
import os, sys
msg = """ERROR: Your Python version is: %s
-Galaxy is currently supported on Python 2.5, 2.6 and 2.7. To run Galaxy,
+Galaxy is currently supported on Python 2.6 and 2.7. To run Galaxy,
please download and install a supported version from python.org. If a
supported version is installed but is not your default, getgalaxy.org
contains instructions on how to force Galaxy to use a different version.""" % sys.version[:3]
def check_python():
try:
- assert sys.version_info[:2] >= ( 2, 5 ) and sys.version_info[:2] <= ( 2, 7 )
+ assert sys.version_info[:2] >= ( 2, 6 ) and sys.version_info[:2] <= ( 2, 7 )
except AssertionError:
print >>sys.stderr, msg
raise
diff -r 24c143157b7acad82501bf8655b587034963e4b2 -r 95bf71620d50c6d81248eef2001a7dc156ae1088 tools/regVariation/quality_filter.py
--- a/tools/regVariation/quality_filter.py
+++ b/tools/regVariation/quality_filter.py
@@ -24,7 +24,7 @@
from bx.binned_array import BinnedArray, FileBinnedArray
from bx.bitset import *
from bx.bitset_builders import *
-from fpconst import isNaN
+from math import isnan
from bx.cookbook import doc_optparse
from galaxy.tools.exception_handling import *
import bx.align.maf
diff -r 24c143157b7acad82501bf8655b587034963e4b2 -r 95bf71620d50c6d81248eef2001a7dc156ae1088 tools/stats/aggregate_scores_in_intervals.py
--- a/tools/stats/aggregate_scores_in_intervals.py
+++ b/tools/stats/aggregate_scores_in_intervals.py
@@ -25,7 +25,7 @@
from bx.binned_array import BinnedArray, FileBinnedArray
from bx.bitset import *
from bx.bitset_builders import *
-from fpconst import isNaN
+from math import isnan
from bx.cookbook import doc_optparse
from galaxy.tools.exception_handling import *
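As a side note on the substitution made in the two tools above (and the deletion of lib/fpconst.py earlier in this changeset): on Python 2.6+ the standard library's math module already provides these checks. A small illustrative snippet, with made-up values:

from math import isnan, isinf

nan = float( "nan" )
pos_inf = float( "inf" )

assert isnan( nan ) and not isnan( 1.0 )          # replaces fpconst.isNaN
assert isinf( pos_inf ) and isinf( -pos_inf )     # replaces fpconst.isPosInf / isNegInf
assert not isinf( 1.0 ) and not isnan( pos_inf )  # finite values and infinities stay distinguishable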
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: carlfeberhard: history panel: move alternate_history.mako to history.mako
by commits-noreply@bitbucket.org 09 Apr '13
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/24c143157b7a/
Changeset: 24c143157b7a
User: carlfeberhard
Date: 2013-04-09 16:58:38
Summary: history panel: move alternate_history.mako to history.mako
Affected #: 3 files
diff -r b7a2605bd0a3c8452999b645eb3a37d876c4f2da -r 24c143157b7acad82501bf8655b587034963e4b2 lib/galaxy/webapps/galaxy/controllers/root.py
--- a/lib/galaxy/webapps/galaxy/controllers/root.py
+++ b/lib/galaxy/webapps/galaxy/controllers/root.py
@@ -164,7 +164,7 @@
}
hda_dictionaries.append( return_val )
- return trans.stream_template_mako( "root/alternate_history.mako",
+ return trans.stream_template_mako( "root/history.mako",
history_dictionary = history_dictionary,
hda_dictionaries = hda_dictionaries,
show_deleted = show_deleted,
diff -r b7a2605bd0a3c8452999b645eb3a37d876c4f2da -r 24c143157b7acad82501bf8655b587034963e4b2 templates/webapps/galaxy/root/alternate_history.mako
--- a/templates/webapps/galaxy/root/alternate_history.mako
+++ /dev/null
@@ -1,522 +0,0 @@
-<%inherit file="/base.mako"/>
-
-<%def name="title()">
- ${_('Galaxy History')}
-</%def>
-
-## ---------------------------------------------------------------------------------------------------------------------
-<%def name="create_localization_json( strings_to_localize )">
- ## converts strings_to_localize (a list of strings) into a JSON dictionary of { string : localized string }
-${ h.to_json_string( dict([ ( string, _(string) ) for string in strings_to_localize ]) ) }
-## ?? add: if string != _(string)
-</%def>
-
-<%def name="get_page_localized_strings()">
- ## a list of localized strings used in the backbone views, etc. (to be loaded and cached)
- ##! change on per page basis
- <%
- ## havent been localized
- ##[
- ## "anonymous user",
- ## "Click to rename history",
- ## "Click to see more actions",
- ## "Edit history tags",
- ## "Edit history annotation",
- ## "Tags",
- ## "Annotation",
- ## "Click to edit annotation",
- ## "You are over your disk ...w your allocated quota.",
- ## "Show deleted",
- ## "Show hidden",
- ## "View data",
- ## "Edit Attributes",
- ## "Download",
- ## "View details",
- ## "Run this job again",
- ## "Visualize",
- ## "Edit dataset tags",
- ## "Edit dataset annotation",
- ## "Trackster",
- ## "Circster",
- ## "Scatterplot",
- ## "GeneTrack",
- ## "Local",
- ## "Web",
- ## "Current",
- ## "main",
- ## "Using"
- ##]
- strings_to_localize = [
-
- # from history.mako
- # not needed?: "Galaxy History",
- 'refresh',
- 'collapse all',
- 'hide deleted',
- 'hide hidden',
- 'You are currently viewing a deleted history!',
- "Your history is empty. Click 'Get Data' on the left pane to start",
-
- # from history_common.mako
- 'Download',
- 'Display Data',
- 'View data',
- 'Edit attributes',
- 'Delete',
- 'Job is waiting to run',
- 'View Details',
- 'Run this job again',
- 'Job is currently running',
- 'View Details',
- 'Run this job again',
- 'Metadata is being Auto-Detected.',
- 'No data: ',
- 'format: ',
- 'database: ',
- #TODO localized data.dbkey??
- 'Info: ',
- #TODO localized display_app.display_name??
- # _( link_app.name )
- # localized peek...ugh
- 'Error: unknown dataset state',
- ]
- return strings_to_localize
- %>
-</%def>
-
-## ---------------------------------------------------------------------------------------------------------------------
-## all the possible history urls (primarily from web controllers at this point)
-<%def name="get_history_url_templates()">
-<%
- from urllib import unquote_plus
-
- history_class_name = 'History'
- encoded_id_template = '<%= id %>'
-
- url_dict = {
- 'rename' : h.url_for( controller="history", action="rename_async",
- id=encoded_id_template ),
- 'tag' : h.url_for( controller='tag', action='get_tagging_elt_async',
- item_class=history_class_name, item_id=encoded_id_template ),
- 'annotate' : h.url_for( controller="history", action="annotate_async",
- id=encoded_id_template )
- }
-%>
-${ unquote_plus( h.to_json_string( url_dict ) ) }
-</%def>
-
-## ---------------------------------------------------------------------------------------------------------------------
-## all the possible hda urls (primarily from web controllers at this point) - whether they should have them or not
-##TODO: unify url_for btwn web, api
-<%def name="get_hda_url_templates()">
-<%
- from urllib import unquote_plus
-
- hda_class_name = 'HistoryDatasetAssociation'
- encoded_id_template = '<%= id %>'
-
- hda_ext_template = '<%= file_ext %>'
- meta_type_template = '<%= file_type %>'
-
- display_app_name_template = '<%= name %>'
- display_app_link_template = '<%= link %>'
-
- url_dict = {
- # ................................................................ warning message links
- 'purge' : h.url_for( controller='dataset', action='purge_async',
- dataset_id=encoded_id_template ),
- #TODO: hide (via api)
- 'unhide' : h.url_for( controller='dataset', action='unhide',
- dataset_id=encoded_id_template ),
- #TODO: via api
- 'undelete' : h.url_for( controller='dataset', action='undelete',
- dataset_id=encoded_id_template ),
-
- # ................................................................ title actions (display, edit, delete),
- 'display' : h.url_for( controller='dataset', action='display',
- dataset_id=encoded_id_template, preview=True, filename='' ),
- 'edit' : h.url_for( controller='dataset', action='edit',
- dataset_id=encoded_id_template ),
-
- #TODO: via api
- 'delete' : h.url_for( controller='dataset', action='delete_async', dataset_id=encoded_id_template ),
-
- # ................................................................ download links (and associated meta files),
- 'download' : h.url_for( controller='dataset', action='display',
- dataset_id=encoded_id_template, to_ext=hda_ext_template ),
- 'meta_download' : h.url_for( controller='dataset', action='get_metadata_file',
- hda_id=encoded_id_template, metadata_name=meta_type_template ),
-
- # ................................................................ primary actions (errors, params, rerun),
- 'report_error' : h.url_for( controller='dataset', action='errors',
- id=encoded_id_template ),
- 'show_params' : h.url_for( controller='dataset', action='show_params',
- dataset_id=encoded_id_template ),
- 'rerun' : h.url_for( controller='tool_runner', action='rerun',
- id=encoded_id_template ),
- 'visualization' : h.url_for( controller='visualization', action='index' ),
-
- # ................................................................ secondary actions (tagging, annotation),
- 'tags' : {
- 'get' : h.url_for( controller='tag', action='get_tagging_elt_async',
- item_class=hda_class_name, item_id=encoded_id_template ),
- 'set' : h.url_for( controller='tag', action='retag',
- item_class=hda_class_name, item_id=encoded_id_template ),
- },
- 'annotation' : {
- 'get' : h.url_for( controller='dataset', action='get_annotation_async',
- id=encoded_id_template ),
- 'set' : h.url_for( controller='/dataset', action='annotate_async',
- id=encoded_id_template ),
- },
- }
-%>
-${ unquote_plus( h.to_json_string( url_dict ) ) }
-</%def>
-
-## -----------------------------------------------------------------------------
-<%def name="get_history_json( history )">
-<%
- try:
- return h.to_json_string( history )
- except TypeError, type_err:
- log.error( 'Could not serialize history' )
- log.debug( 'history data: %s', str( history ) )
- return '{}'
-%>
-</%def>
-
-<%def name="get_current_user()">
-<%
- user_json = trans.webapp.api_controllers[ 'users' ].show( trans, 'current' )
- return user_json
-%>
-</%def>
-
-<%def name="get_hda_json( hdas )">
-<%
- try:
- return h.to_json_string( hdas )
- except TypeError, type_err:
- log.error( 'Could not serialize hdas for history: %s', history['id'] )
- log.debug( 'hda data: %s', str( hdas ) )
- return '{}'
-%>
-</%def>
-
-
-## -----------------------------------------------------------------------------
-<%def name="javascripts()">
-${parent.javascripts()}
-
-${h.js(
- "libs/jquery/jstorage",
- "libs/jquery/jquery.autocomplete", "galaxy.autocom_tagging",
- "mvc/base-mvc",
-)}
-
-${h.templates(
- "helpers-common-templates",
- "template-warningmessagesmall",
-
- "template-history-historyPanel",
-
- "template-hda-warning-messages",
- "template-hda-titleLink",
- "template-hda-failedMetadata",
- "template-hda-hdaSummary",
- "template-hda-downloadLinks",
- "template-hda-tagArea",
- "template-hda-annotationArea",
- "template-hda-displayApps",
-
- "template-user-quotaMeter-quota",
- "template-user-quotaMeter-usage"
-)}
-
-##TODO: fix: curr hasta be _after_ h.templates bc these use those templates - move somehow
-${h.js(
- "mvc/user/user-model", "mvc/user/user-quotameter",
- "mvc/dataset/hda-model", "mvc/dataset/hda-base", "mvc/dataset/hda-edit",
- "mvc/history/history-model", "mvc/history/history-panel"
-)}
-
-<script type="text/javascript">
-function galaxyPageSetUp(){
- // moving global functions, objects into Galaxy namespace
- top.Galaxy = top.Galaxy || {};
-
- // bad idea from memleak standpoint?
- top.Galaxy.mainWindow = top.Galaxy.mainWindow || top.frames.galaxy_main;
- top.Galaxy.toolWindow = top.Galaxy.toolWindow || top.frames.galaxy_tools;
- top.Galaxy.historyWindow = top.Galaxy.historyWindow || top.frames.galaxy_history;
-
- top.Galaxy.$masthead = top.Galaxy.$masthead || $( top.document ).find( 'div#masthead' );
- top.Galaxy.$messagebox = top.Galaxy.$messagebox || $( top.document ).find( 'div#messagebox' );
- top.Galaxy.$leftPanel = top.Galaxy.$leftPanel || $( top.document ).find( 'div#left' );
- top.Galaxy.$centerPanel = top.Galaxy.$centerPanel || $( top.document ).find( 'div#center' );
- top.Galaxy.$rightPanel = top.Galaxy.$rightPanel || $( top.document ).find( 'div#right' );
-
- //modals
- top.Galaxy.show_modal = top.show_modal;
- top.Galaxy.hide_modal = top.hide_modal;
-
- // other base functions
-
- // global backbone models
- top.Galaxy.currUser = top.Galaxy.currUser;
- top.Galaxy.currHistoryPanel = top.Galaxy.currHistoryPanel;
-
- //top.Galaxy.paths = galaxy_paths;
-
- top.Galaxy.localization = GalaxyLocalization;
- window.Galaxy = top.Galaxy;
-}
-
-// set js localizable strings
-GalaxyLocalization.setLocalizedString( ${ create_localization_json( get_page_localized_strings() ) } );
-
-// add needed controller urls to GalaxyPaths
-if( !galaxy_paths ){ galaxy_paths = top.galaxy_paths || new GalaxyPaths(); }
-galaxy_paths.set( 'hda', ${get_hda_url_templates()} );
-galaxy_paths.set( 'history', ${get_history_url_templates()} );
-
-$(function(){
- galaxyPageSetUp();
-
- //NOTE: for debugging on non-local instances (main/test)
- // 1. load history panel in own tab
- // 2. from console: new PersistantStorage( '__history_panel' ).set( 'debugging', true )
- // -> history panel and hdas will display console logs in console
- var debugging = false;
- if( jQuery.jStorage.get( '__history_panel' ) ){
- debugging = new PersistantStorage( '__history_panel' ).get( 'debugging' );
- }
-
- // ostensibly, this is the App
- // LOAD INITIAL DATA IN THIS PAGE - since we're already sending it...
- // ...use mako to 'bootstrap' the models
- var page_show_deleted = ${ 'true' if show_deleted == True else ( 'null' if show_deleted == None else 'false' ) },
- page_show_hidden = ${ 'true' if show_hidden == True else ( 'null' if show_hidden == None else 'false' ) },
-
- user = ${ get_current_user() },
- history = ${ get_history_json( history_dictionary ) },
- hdas = ${ get_hda_json( hda_dictionaries ) };
-
- // add user data to history
- // i don't like this history+user relationship, but user authentication changes views/behaviour
- history.user = user;
-
- // create the history panel
- var historyPanel = new HistoryPanel({
- model : new History( history, hdas ),
- urlTemplates : galaxy_paths.attributes,
- logger : ( debugging )?( console ):( null ),
- // is page sending in show settings? if so override history's
- show_deleted : page_show_deleted,
- show_hidden : page_show_hidden
- });
- historyPanel.render();
-
- // set it up to be accessible across iframes
- //TODO:?? mem leak
- top.Galaxy.currHistoryPanel = historyPanel;
- var currUser = new User( user );
- if( !Galaxy.currUser ){ Galaxy.currUser = currUser; }
-
- // QUOTA METER is a cross-frame ui element (meter in masthead, over quota message in history)
- // create it and join them here for now (via events)
- //TODO: this really belongs in the masthead
- //TODO: and the quota message (curr. in the history panel) belongs somewhere else
-
- //window.currUser.logger = console;
- var quotaMeter = new UserQuotaMeter({
- model : currUser,
- //logger : ( debugging )?( console ):( null ),
- el : $( top.document ).find( '.quota-meter-container' )
- });
- //quotaMeter.logger = console; window.quotaMeter = quotaMeter
- quotaMeter.render();
-
- // show/hide the 'over quota message' in the history when the meter tells it to
- quotaMeter.bind( 'quota:over', historyPanel.showQuotaMessage, historyPanel );
- quotaMeter.bind( 'quota:under', historyPanel.hideQuotaMessage, historyPanel );
- // having to add this to handle re-render of hview while overquota (the above do not fire)
- historyPanel.on( 'rendered rendered:initial', function(){
- if( quotaMeter.isOverQuota() ){
- historyPanel.showQuotaMessage();
- }
- });
- //TODO: this _is_ sent to the page (over_quota)...
-
- // update the quota meter when current history changes size
- historyPanel.model.bind( 'change:nice_size', function(){
- quotaMeter.update()
- }, quotaMeter );
-
-
- //ANOTHER cross-frame element is the history-options-button...
- // in this case, we need to change the popupmenu options listed to include some functions for this history
- // these include: current (1 & 2) 'show/hide' delete and hidden functions, and (3) the collapse all option
- (function(){
- // don't try this if the history panel is in it's own window
- if( top.document === window.document ){
- return;
- }
-
- // lots of wtf here...due to infernalframes
- //TODO: this is way tooo acrobatic
- var $historyButtonWindow = $( top.document ),
- HISTORY_MENU_BUTTON_ID = 'history-options-button',
- $historyMenuButton = $historyButtonWindow.find( '#' + HISTORY_MENU_BUTTON_ID ),
- // jq data in another frame can only be accessed by the jQuery in that frame,
- // get the jQuery from the top frame (that contains the history-options-button)
- START_INSERTING_AT_INDEX = 11,
- COLLAPSE_OPTION_TEXT = _l("Collapse Expanded Datasets"),
- DELETED_OPTION_TEXT = _l("Include Deleted Datasets"),
- HIDDEN_OPTION_TEXT = _l("Include Hidden Datasets");
- windowJQ = $( top )[0].jQuery,
- popupMenu = ( windowJQ && $historyMenuButton[0] )?( windowJQ.data( $historyMenuButton[0], 'PopupMenu' ) )
- :( null );
- //console.debug(
- // '$historyButtonWindow:', $historyButtonWindow,
- // '$historyMenuButton:', $historyMenuButton,
- // 'windowJQ:', windowJQ,
- // 'popupmenu:', popupMenu
- //);
- if( !popupMenu ){ return; }
-
- // since the history frame reloads so often (compared to the main window),
- // we need to check whether these options are there already before we add them again
- // In IE, however, NOT re-adding them creates a 'cant execute from freed script' error:
- // so...we need to re-add the function in either case (just not the option itself)
- //NOTE: we use the global Galaxy.currHistoryPanel here
- // because these remain bound in the main window even if panel refreshes
- //TODO: too much boilerplate
- //TODO: ugh...(in general)
- var collapseOption = popupMenu.findItemByHtml( COLLAPSE_OPTION_TEXT );
- if( !collapseOption ){
- collapseOption = {
- html : COLLAPSE_OPTION_TEXT
- };
- popupMenu.addItem( collapseOption, START_INSERTING_AT_INDEX )
- }
- collapseOption.func = function() {
- Galaxy.currHistoryPanel.collapseAllHdaBodies();
- };
-
- var deletedOption = popupMenu.findItemByHtml( DELETED_OPTION_TEXT );
- if( !deletedOption ){
- deletedOption = {
- html : DELETED_OPTION_TEXT
- };
- popupMenu.addItem( deletedOption, START_INSERTING_AT_INDEX + 1 )
- }
- deletedOption.func = function( clickEvent, thisMenuOption ){
- var show_deleted = Galaxy.currHistoryPanel.toggleShowDeleted();
- thisMenuOption.checked = show_deleted;
- };
- // whether was there or added, update the checked option to reflect the panel's settings on the panel render
- deletedOption.checked = Galaxy.currHistoryPanel.storage.get( 'show_deleted' );
-
- var hiddenOption = popupMenu.findItemByHtml( HIDDEN_OPTION_TEXT );
- if( !hiddenOption ){
- hiddenOption = {
- html : HIDDEN_OPTION_TEXT
- };
- popupMenu.addItem( hiddenOption, START_INSERTING_AT_INDEX + 2 )
- }
- hiddenOption.func = function( clickEvent, thisMenuOption ){
- var show_hidden = Galaxy.currHistoryPanel.toggleShowHidden();
- thisMenuOption.checked = show_hidden;
- };
- // whether was there or added, update the checked option to reflect the panel's settings on the panel render
- hiddenOption.checked = Galaxy.currHistoryPanel.storage.get( 'show_hidden' );
- })();
-
- //TODO: both the quota meter and the options-menu stuff need to be moved out when iframes are removed
-
- return;
-});
-</script>
-
-</%def>
-
-<%def name="stylesheets()">
- ${parent.stylesheets()}
- ${h.css(
- "base",
- "history",
- "autocomplete_tagging"
- )}
- <style>
- ## TODO: move to base.less
- .historyItemBody {
- display: none;
- }
-
- #history-controls {
- /*border: 1px solid white;*/
- margin-bottom: 5px;
- padding: 5px;
- }
-
- #history-title-area {
- margin: 0px 0px 5px 0px;
- /*border: 1px solid red;*/
- }
- #history-name {
- word-wrap: break-word;
- font-weight: bold;
- /*color: gray;*/
- }
- .editable-text {
- border: solid transparent 1px;
- }
- #history-name-container input {
- width: 90%;
- margin: -2px 0px -3px -4px;
- font-weight: bold;
- /*color: gray;*/
- }
-
- #quota-message-container {
- margin: 8px 0px 5px 0px;
- }
- #quota-message {
- margin: 0px;
- }
-
- #history-subtitle-area {
- /*border: 1px solid green;*/
- }
- #history-size {
- }
- #history-secondary-links {
- }
-
- /*why this is getting underlined is beyond me*/
- #history-secondary-links #history-refresh {
- text-decoration: none;
- }
- /*too tweaky*/
- #history-annotate {
- margin-right: 3px;
- }
-
- #history-tag-area, #history-annotation-area {
- margin: 10px 0px 10px 0px;
- }
-
- .historyItemTitle {
- text-decoration: underline;
- cursor: pointer;
- }
- .historyItemTitle:hover {
- text-decoration: underline;
- }
-
- </style>
-</%def>
-
-<body class="historyPage"></body>
diff -r b7a2605bd0a3c8452999b645eb3a37d876c4f2da -r 24c143157b7acad82501bf8655b587034963e4b2 templates/webapps/galaxy/root/history.mako
--- /dev/null
+++ b/templates/webapps/galaxy/root/history.mako
@@ -0,0 +1,522 @@
+<%inherit file="/base.mako"/>
+
+<%def name="title()">
+ ${_('Galaxy History')}
+</%def>
+
+## ---------------------------------------------------------------------------------------------------------------------
+<%def name="create_localization_json( strings_to_localize )">
+ ## converts strings_to_localize (a list of strings) into a JSON dictionary of { string : localized string }
+${ h.to_json_string( dict([ ( string, _(string) ) for string in strings_to_localize ]) ) }
+## ?? add: if string != _(string)
+</%def>
+
+<%def name="get_page_localized_strings()">
+ ## a list of localized strings used in the backbone views, etc. (to be loaded and cached)
+ ##! change on per page basis
+ <%
+ ## havent been localized
+ ##[
+ ## "anonymous user",
+ ## "Click to rename history",
+ ## "Click to see more actions",
+ ## "Edit history tags",
+ ## "Edit history annotation",
+ ## "Tags",
+ ## "Annotation",
+ ## "Click to edit annotation",
+ ## "You are over your disk ...w your allocated quota.",
+ ## "Show deleted",
+ ## "Show hidden",
+ ## "View data",
+ ## "Edit Attributes",
+ ## "Download",
+ ## "View details",
+ ## "Run this job again",
+ ## "Visualize",
+ ## "Edit dataset tags",
+ ## "Edit dataset annotation",
+ ## "Trackster",
+ ## "Circster",
+ ## "Scatterplot",
+ ## "GeneTrack",
+ ## "Local",
+ ## "Web",
+ ## "Current",
+ ## "main",
+ ## "Using"
+ ##]
+ strings_to_localize = [
+
+ # from history.mako
+ # not needed?: "Galaxy History",
+ 'refresh',
+ 'collapse all',
+ 'hide deleted',
+ 'hide hidden',
+ 'You are currently viewing a deleted history!',
+ "Your history is empty. Click 'Get Data' on the left pane to start",
+
+ # from history_common.mako
+ 'Download',
+ 'Display Data',
+ 'View data',
+ 'Edit attributes',
+ 'Delete',
+ 'Job is waiting to run',
+ 'View Details',
+ 'Run this job again',
+ 'Job is currently running',
+ 'View Details',
+ 'Run this job again',
+ 'Metadata is being Auto-Detected.',
+ 'No data: ',
+ 'format: ',
+ 'database: ',
+ #TODO localized data.dbkey??
+ 'Info: ',
+ #TODO localized display_app.display_name??
+ # _( link_app.name )
+ # localized peek...ugh
+ 'Error: unknown dataset state',
+ ]
+ return strings_to_localize
+ %>
+</%def>
+
+## ---------------------------------------------------------------------------------------------------------------------
+## all the possible history urls (primarily from web controllers at this point)
+<%def name="get_history_url_templates()">
+<%
+ from urllib import unquote_plus
+
+ history_class_name = 'History'
+ encoded_id_template = '<%= id %>'
+
+ url_dict = {
+ 'rename' : h.url_for( controller="history", action="rename_async",
+ id=encoded_id_template ),
+ 'tag' : h.url_for( controller='tag', action='get_tagging_elt_async',
+ item_class=history_class_name, item_id=encoded_id_template ),
+ 'annotate' : h.url_for( controller="history", action="annotate_async",
+ id=encoded_id_template )
+ }
+%>
+${ unquote_plus( h.to_json_string( url_dict ) ) }
+</%def>
+
+## ---------------------------------------------------------------------------------------------------------------------
+## all the possible hda urls (primarily from web controllers at this point) - whether they should have them or not
+##TODO: unify url_for btwn web, api
+<%def name="get_hda_url_templates()">
+<%
+ from urllib import unquote_plus
+
+ hda_class_name = 'HistoryDatasetAssociation'
+ encoded_id_template = '<%= id %>'
+
+ hda_ext_template = '<%= file_ext %>'
+ meta_type_template = '<%= file_type %>'
+
+ display_app_name_template = '<%= name %>'
+ display_app_link_template = '<%= link %>'
+
+ url_dict = {
+ # ................................................................ warning message links
+ 'purge' : h.url_for( controller='dataset', action='purge_async',
+ dataset_id=encoded_id_template ),
+ #TODO: hide (via api)
+ 'unhide' : h.url_for( controller='dataset', action='unhide',
+ dataset_id=encoded_id_template ),
+ #TODO: via api
+ 'undelete' : h.url_for( controller='dataset', action='undelete',
+ dataset_id=encoded_id_template ),
+
+ # ................................................................ title actions (display, edit, delete),
+ 'display' : h.url_for( controller='dataset', action='display',
+ dataset_id=encoded_id_template, preview=True, filename='' ),
+ 'edit' : h.url_for( controller='dataset', action='edit',
+ dataset_id=encoded_id_template ),
+
+ #TODO: via api
+ 'delete' : h.url_for( controller='dataset', action='delete_async', dataset_id=encoded_id_template ),
+
+ # ................................................................ download links (and associated meta files),
+ 'download' : h.url_for( controller='dataset', action='display',
+ dataset_id=encoded_id_template, to_ext=hda_ext_template ),
+ 'meta_download' : h.url_for( controller='dataset', action='get_metadata_file',
+ hda_id=encoded_id_template, metadata_name=meta_type_template ),
+
+ # ................................................................ primary actions (errors, params, rerun),
+ 'report_error' : h.url_for( controller='dataset', action='errors',
+ id=encoded_id_template ),
+ 'show_params' : h.url_for( controller='dataset', action='show_params',
+ dataset_id=encoded_id_template ),
+ 'rerun' : h.url_for( controller='tool_runner', action='rerun',
+ id=encoded_id_template ),
+ 'visualization' : h.url_for( controller='visualization', action='index' ),
+
+ # ................................................................ secondary actions (tagging, annotation),
+ 'tags' : {
+ 'get' : h.url_for( controller='tag', action='get_tagging_elt_async',
+ item_class=hda_class_name, item_id=encoded_id_template ),
+ 'set' : h.url_for( controller='tag', action='retag',
+ item_class=hda_class_name, item_id=encoded_id_template ),
+ },
+ 'annotation' : {
+ 'get' : h.url_for( controller='dataset', action='get_annotation_async',
+ id=encoded_id_template ),
+ 'set' : h.url_for( controller='/dataset', action='annotate_async',
+ id=encoded_id_template ),
+ },
+ }
+%>
+${ unquote_plus( h.to_json_string( url_dict ) ) }
+</%def>
+
+## -----------------------------------------------------------------------------
+<%def name="get_history_json( history )">
+<%
+ try:
+ return h.to_json_string( history )
+ except TypeError, type_err:
+ log.error( 'Could not serialize history' )
+ log.debug( 'history data: %s', str( history ) )
+ return '{}'
+%>
+</%def>
+
+<%def name="get_current_user()">
+<%
+ user_json = trans.webapp.api_controllers[ 'users' ].show( trans, 'current' )
+ return user_json
+%>
+</%def>
+
+<%def name="get_hda_json( hdas )">
+<%
+ try:
+ return h.to_json_string( hdas )
+ except TypeError, type_err:
+ log.error( 'Could not serialize hdas for history: %s', history['id'] )
+ log.debug( 'hda data: %s', str( hdas ) )
+ return '{}'
+%>
+</%def>
+
+
+## -----------------------------------------------------------------------------
+<%def name="javascripts()">
+${parent.javascripts()}
+
+${h.js(
+ "libs/jquery/jstorage",
+ "libs/jquery/jquery.autocomplete", "galaxy.autocom_tagging",
+ "mvc/base-mvc",
+)}
+
+${h.templates(
+ "helpers-common-templates",
+ "template-warningmessagesmall",
+
+ "template-history-historyPanel",
+
+ "template-hda-warning-messages",
+ "template-hda-titleLink",
+ "template-hda-failedMetadata",
+ "template-hda-hdaSummary",
+ "template-hda-downloadLinks",
+ "template-hda-tagArea",
+ "template-hda-annotationArea",
+ "template-hda-displayApps",
+
+ "template-user-quotaMeter-quota",
+ "template-user-quotaMeter-usage"
+)}
+
+##TODO: fix: currently this has to be _after_ h.templates because these scripts use those templates - move somehow
+${h.js(
+ "mvc/user/user-model", "mvc/user/user-quotameter",
+ "mvc/dataset/hda-model", "mvc/dataset/hda-base", "mvc/dataset/hda-edit",
+ "mvc/history/history-model", "mvc/history/history-panel"
+)}
+
+<script type="text/javascript">
+function galaxyPageSetUp(){
+ // moving global functions, objects into Galaxy namespace
+ top.Galaxy = top.Galaxy || {};
+
+ // bad idea from memleak standpoint?
+ top.Galaxy.mainWindow = top.Galaxy.mainWindow || top.frames.galaxy_main;
+ top.Galaxy.toolWindow = top.Galaxy.toolWindow || top.frames.galaxy_tools;
+ top.Galaxy.historyWindow = top.Galaxy.historyWindow || top.frames.galaxy_history;
+
+ top.Galaxy.$masthead = top.Galaxy.$masthead || $( top.document ).find( 'div#masthead' );
+ top.Galaxy.$messagebox = top.Galaxy.$messagebox || $( top.document ).find( 'div#messagebox' );
+ top.Galaxy.$leftPanel = top.Galaxy.$leftPanel || $( top.document ).find( 'div#left' );
+ top.Galaxy.$centerPanel = top.Galaxy.$centerPanel || $( top.document ).find( 'div#center' );
+ top.Galaxy.$rightPanel = top.Galaxy.$rightPanel || $( top.document ).find( 'div#right' );
+
+ //modals
+ top.Galaxy.show_modal = top.show_modal;
+ top.Galaxy.hide_modal = top.hide_modal;
+
+ // other base functions
+
+ // global backbone models
+ top.Galaxy.currUser = top.Galaxy.currUser;
+ top.Galaxy.currHistoryPanel = top.Galaxy.currHistoryPanel;
+
+ //top.Galaxy.paths = galaxy_paths;
+
+ top.Galaxy.localization = GalaxyLocalization;
+ window.Galaxy = top.Galaxy;
+}
+
+// set js localizable strings
+GalaxyLocalization.setLocalizedString( ${ create_localization_json( get_page_localized_strings() ) } );
+
+// add needed controller urls to GalaxyPaths
+if( !galaxy_paths ){ galaxy_paths = top.galaxy_paths || new GalaxyPaths(); }
+galaxy_paths.set( 'hda', ${get_hda_url_templates()} );
+galaxy_paths.set( 'history', ${get_history_url_templates()} );
+
+$(function(){
+ galaxyPageSetUp();
+
+ //NOTE: for debugging on non-local instances (main/test)
+ // 1. load history panel in own tab
+ // 2. from console: new PersistantStorage( '__history_panel' ).set( 'debugging', true )
+ // -> history panel and hdas will display console logs in console
+ var debugging = false;
+ if( jQuery.jStorage.get( '__history_panel' ) ){
+ debugging = new PersistantStorage( '__history_panel' ).get( 'debugging' );
+ }
+
+ // ostensibly, this is the App
+ // LOAD INITIAL DATA IN THIS PAGE - since we're already sending it...
+ // ...use mako to 'bootstrap' the models
+ var page_show_deleted = ${ 'true' if show_deleted == True else ( 'null' if show_deleted == None else 'false' ) },
+ page_show_hidden = ${ 'true' if show_hidden == True else ( 'null' if show_hidden == None else 'false' ) },
+
+ user = ${ get_current_user() },
+ history = ${ get_history_json( history_dictionary ) },
+ hdas = ${ get_hda_json( hda_dictionaries ) };
+
+ // add user data to history
+ // i don't like this history+user relationship, but user authentication changes views/behaviour
+ history.user = user;
+
+ // create the history panel
+ var historyPanel = new HistoryPanel({
+ model : new History( history, hdas ),
+ urlTemplates : galaxy_paths.attributes,
+ logger : ( debugging )?( console ):( null ),
+ // is page sending in show settings? if so override history's
+ show_deleted : page_show_deleted,
+ show_hidden : page_show_hidden
+ });
+ historyPanel.render();
+
+ // set it up to be accessible across iframes
+ //TODO:?? mem leak
+ top.Galaxy.currHistoryPanel = historyPanel;
+ var currUser = new User( user );
+ if( !Galaxy.currUser ){ Galaxy.currUser = currUser; }
+
+ // QUOTA METER is a cross-frame ui element (meter in masthead, over quota message in history)
+ // create it and join them here for now (via events)
+ //TODO: this really belongs in the masthead
+ //TODO: and the quota message (curr. in the history panel) belongs somewhere else
+
+ //window.currUser.logger = console;
+ var quotaMeter = new UserQuotaMeter({
+ model : currUser,
+ //logger : ( debugging )?( console ):( null ),
+ el : $( top.document ).find( '.quota-meter-container' )
+ });
+ //quotaMeter.logger = console; window.quotaMeter = quotaMeter
+ quotaMeter.render();
+
+ // show/hide the 'over quota message' in the history when the meter tells it to
+ quotaMeter.bind( 'quota:over', historyPanel.showQuotaMessage, historyPanel );
+ quotaMeter.bind( 'quota:under', historyPanel.hideQuotaMessage, historyPanel );
+ // having to add this to handle re-render of hview while overquota (the above do not fire)
+ historyPanel.on( 'rendered rendered:initial', function(){
+ if( quotaMeter.isOverQuota() ){
+ historyPanel.showQuotaMessage();
+ }
+ });
+ //TODO: this _is_ sent to the page (over_quota)...
+
+ // update the quota meter when current history changes size
+ historyPanel.model.bind( 'change:nice_size', function(){
+ quotaMeter.update()
+ }, quotaMeter );
+
+
+ //ANOTHER cross-frame element is the history-options-button...
+ // in this case, we need to change the popupmenu options listed to include some functions for this history
+ // these include: (1 & 2) the current 'show/hide deleted' and 'show/hide hidden' functions, and (3) the collapse all option
+ (function(){
+ // don't try this if the history panel is in its own window
+ if( top.document === window.document ){
+ return;
+ }
+
+ // lots of wtf here...due to infernalframes
+ //TODO: this is way tooo acrobatic
+ var $historyButtonWindow = $( top.document ),
+ HISTORY_MENU_BUTTON_ID = 'history-options-button',
+ $historyMenuButton = $historyButtonWindow.find( '#' + HISTORY_MENU_BUTTON_ID ),
+ // jq data in another frame can only be accessed by the jQuery in that frame,
+ // get the jQuery from the top frame (that contains the history-options-button)
+ START_INSERTING_AT_INDEX = 11,
+ COLLAPSE_OPTION_TEXT = _l("Collapse Expanded Datasets"),
+ DELETED_OPTION_TEXT = _l("Include Deleted Datasets"),
+ HIDDEN_OPTION_TEXT = _l("Include Hidden Datasets"),
+ windowJQ = $( top )[0].jQuery,
+ popupMenu = ( windowJQ && $historyMenuButton[0] )?( windowJQ.data( $historyMenuButton[0], 'PopupMenu' ) )
+ :( null );
+ //console.debug(
+ // '$historyButtonWindow:', $historyButtonWindow,
+ // '$historyMenuButton:', $historyMenuButton,
+ // 'windowJQ:', windowJQ,
+ // 'popupmenu:', popupMenu
+ //);
+ if( !popupMenu ){ return; }
+
+ // since the history frame reloads so often (compared to the main window),
+ // we need to check whether these options are there already before we add them again
+ // In IE, however, NOT re-adding them creates a 'cant execute from freed script' error:
+ // so...we need to re-add the function in either case (just not the option itself)
+ //NOTE: we use the global Galaxy.currHistoryPanel here
+ // because these remain bound in the main window even if panel refreshes
+ //TODO: too much boilerplate
+ //TODO: ugh...(in general)
+ var collapseOption = popupMenu.findItemByHtml( COLLAPSE_OPTION_TEXT );
+ if( !collapseOption ){
+ collapseOption = {
+ html : COLLAPSE_OPTION_TEXT
+ };
+ popupMenu.addItem( collapseOption, START_INSERTING_AT_INDEX )
+ }
+ collapseOption.func = function() {
+ Galaxy.currHistoryPanel.collapseAllHdaBodies();
+ };
+
+ var deletedOption = popupMenu.findItemByHtml( DELETED_OPTION_TEXT );
+ if( !deletedOption ){
+ deletedOption = {
+ html : DELETED_OPTION_TEXT
+ };
+ popupMenu.addItem( deletedOption, START_INSERTING_AT_INDEX + 1 )
+ }
+ deletedOption.func = function( clickEvent, thisMenuOption ){
+ var show_deleted = Galaxy.currHistoryPanel.toggleShowDeleted();
+ thisMenuOption.checked = show_deleted;
+ };
+ // whether it was already there or newly added, update the checked option to reflect the panel's settings on render
+ deletedOption.checked = Galaxy.currHistoryPanel.storage.get( 'show_deleted' );
+
+ var hiddenOption = popupMenu.findItemByHtml( HIDDEN_OPTION_TEXT );
+ if( !hiddenOption ){
+ hiddenOption = {
+ html : HIDDEN_OPTION_TEXT
+ };
+ popupMenu.addItem( hiddenOption, START_INSERTING_AT_INDEX + 2 )
+ }
+ hiddenOption.func = function( clickEvent, thisMenuOption ){
+ var show_hidden = Galaxy.currHistoryPanel.toggleShowHidden();
+ thisMenuOption.checked = show_hidden;
+ };
+ // whether it was already there or newly added, update the checked option to reflect the panel's settings on render
+ hiddenOption.checked = Galaxy.currHistoryPanel.storage.get( 'show_hidden' );
+ })();
+
+ //TODO: both the quota meter and the options-menu stuff need to be moved out when iframes are removed
+
+ return;
+});
+</script>
+
+</%def>
+
+<%def name="stylesheets()">
+ ${parent.stylesheets()}
+ ${h.css(
+ "base",
+ "history",
+ "autocomplete_tagging"
+ )}
+ <style>
+ ## TODO: move to base.less
+ .historyItemBody {
+ display: none;
+ }
+
+ #history-controls {
+ /*border: 1px solid white;*/
+ margin-bottom: 5px;
+ padding: 5px;
+ }
+
+ #history-title-area {
+ margin: 0px 0px 5px 0px;
+ /*border: 1px solid red;*/
+ }
+ #history-name {
+ word-wrap: break-word;
+ font-weight: bold;
+ /*color: gray;*/
+ }
+ .editable-text {
+ border: solid transparent 1px;
+ }
+ #history-name-container input {
+ width: 90%;
+ margin: -2px 0px -3px -4px;
+ font-weight: bold;
+ /*color: gray;*/
+ }
+
+ #quota-message-container {
+ margin: 8px 0px 5px 0px;
+ }
+ #quota-message {
+ margin: 0px;
+ }
+
+ #history-subtitle-area {
+ /*border: 1px solid green;*/
+ }
+ #history-size {
+ }
+ #history-secondary-links {
+ }
+
+ /*why this is getting underlined is beyond me*/
+ #history-secondary-links #history-refresh {
+ text-decoration: none;
+ }
+ /*too tweaky*/
+ #history-annotate {
+ margin-right: 3px;
+ }
+
+ #history-tag-area, #history-annotation-area {
+ margin: 10px 0px 10px 0px;
+ }
+
+ .historyItemTitle {
+ text-decoration: underline;
+ cursor: pointer;
+ }
+ .historyItemTitle:hover {
+ text-decoration: underline;
+ }
+
+ </style>
+</%def>
+
+<body class="historyPage"></body>
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
2 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/b2a5169daea4/
Changeset: b2a5169daea4
Branch: error-message-fix
User: BjoernGruening
Date: 2013-04-06 15:00:37
Summary: change error message
Affected #: 1 file
diff -r cb25513c63cd7aa2ebd472e91109c96276ed6d9d -r b2a5169daea4ddcde4eec0b70b1d97a6fd1f1321 lib/galaxy/webapps/galaxy/api/tool_shed_repositories.py
--- a/lib/galaxy/webapps/galaxy/api/tool_shed_repositories.py
+++ b/lib/galaxy/webapps/galaxy/api/tool_shed_repositories.py
@@ -141,7 +141,7 @@
response.close()
except Exception, e:
message = "Error attempting to retrieve installation information from tool shed %s for revision %s of repository %s owned by %s: %s" % \
- ( str( tool_shed_url ), str( name ), str( owner ), str( changeset_revision ), str( e ) )
+ ( str( tool_shed_url ), str( changeset_revision ), str( name ), str( owner ), str( e ) )
log.error( message, exc_info=True )
trans.response.status = 500
return dict( status='error', error=message )
@@ -347,4 +347,4 @@
elif isinstance( installed_tool_shed_repositories, list ):
all_installed_tool_shed_repositories.extend( installed_tool_shed_repositories )
return all_installed_tool_shed_repositories
-
\ No newline at end of file
+
https://bitbucket.org/galaxy/galaxy-central/commits/b7a2605bd0a3/
Changeset: b7a2605bd0a3
User: dannon
Date: 2013-04-09 15:56:12
Summary: Merged in BjoernGruening/galaxy-central-bgruening/error-message-fix (pull request #150)
change error message
Affected #: 1 file
diff -r a395caa2f36f2e4aa6c2daf24e4da3c9ede4b125 -r b7a2605bd0a3c8452999b645eb3a37d876c4f2da lib/galaxy/webapps/galaxy/api/tool_shed_repositories.py
--- a/lib/galaxy/webapps/galaxy/api/tool_shed_repositories.py
+++ b/lib/galaxy/webapps/galaxy/api/tool_shed_repositories.py
@@ -141,7 +141,7 @@
response.close()
except Exception, e:
message = "Error attempting to retrieve installation information from tool shed %s for revision %s of repository %s owned by %s: %s" % \
- ( str( tool_shed_url ), str( name ), str( owner ), str( changeset_revision ), str( e ) )
+ ( str( tool_shed_url ), str( changeset_revision ), str( name ), str( owner ), str( e ) )
log.error( message, exc_info=True )
trans.response.status = 500
return dict( status='error', error=message )
@@ -347,4 +347,4 @@
elif isinstance( installed_tool_shed_repositories, list ):
all_installed_tool_shed_repositories.extend( installed_tool_shed_repositories )
return all_installed_tool_shed_repositories
-
\ No newline at end of file
+
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
2 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/d07c62f0067a/
Changeset: d07c62f0067a
User: dannon
Date: 2013-04-09 15:34:24
Summary: Fix bug from 8333 which prevented ToolDataTable from loading .loc files from directories other than the default galaxy tool_data_path.
Affected #: 1 file
diff -r 8fc56b85e0a5353ffd6790685a4a7d8a84938409 -r d07c62f0067a4fc6cabab499d8c1191199c5fc8f lib/galaxy/tools/data/__init__.py
--- a/lib/galaxy/tools/data/__init__.py
+++ b/lib/galaxy/tools/data/__init__.py
@@ -55,7 +55,7 @@
self.data_table_elem_names.append( table_elem_name )
if from_shed_config:
self.shed_data_table_elems.append( table_elem )
- table = tool_data_table_types[ type ]( table_elem, tool_data_path )
+ table = tool_data_table_types[ type ]( table_elem, tool_data_path, from_shed_config)
if table.name not in self.data_tables:
self.data_tables[ table.name ] = table
log.debug( "Loaded tool data table '%s'", table.name )
@@ -132,7 +132,7 @@
os.chmod( full_path, 0644 )
class ToolDataTable( object ):
- def __init__( self, config_element, tool_data_path ):
+ def __init__( self, config_element, tool_data_path, from_shed_config = False):
self.name = config_element.get( 'name' )
self.comment_char = config_element.get( 'comment_char' )
self.empty_field_value = config_element.get( 'empty_field_value', '' )
@@ -164,11 +164,11 @@
type_key = 'tabular'
- def __init__( self, config_element, tool_data_path ):
- super( TabularToolDataTable, self ).__init__( config_element, tool_data_path )
- self.configure_and_load( config_element, tool_data_path )
+ def __init__( self, config_element, tool_data_path, from_shed_config = False):
+ super( TabularToolDataTable, self ).__init__( config_element, tool_data_path, from_shed_config)
+ self.configure_and_load( config_element, tool_data_path, from_shed_config)
- def configure_and_load( self, config_element, tool_data_path ):
+ def configure_and_load( self, config_element, tool_data_path, from_shed_config = False):
"""
Configure and load table from an XML element.
"""
@@ -180,7 +180,9 @@
all_rows = []
for file_element in config_element.findall( 'file' ):
found = False
- if tool_data_path:
+ if tool_data_path and from_shed_config:
+ # Must identify with from_shed_config as well, because the
+ # regular galaxy app has and uses tool_data_path.
# We're loading a tool in the tool shed, so we cannot use the Galaxy tool-data
# directory which is hard-coded into the tool_data_table_conf.xml entries.
filepath = file_element.get( 'path' )
https://bitbucket.org/galaxy/galaxy-central/commits/a395caa2f36f/
Changeset: a395caa2f36f
User: dannon
Date: 2013-04-09 15:36:23
Summary: Import cleanup, space methods, strip trailing whitespace.
Affected #: 1 file
diff -r d07c62f0067a4fc6cabab499d8c1191199c5fc8f -r a395caa2f36f2e4aa6c2daf24e4da3c9ede4b125 lib/galaxy/tools/data/__init__.py
--- a/lib/galaxy/tools/data/__init__.py
+++ b/lib/galaxy/tools/data/__init__.py
@@ -3,36 +3,47 @@
used by tools, for example in the generation of dynamic options. Tables are
loaded and stored by names which tools use to refer to them. This allows
users to configure data tables for a local Galaxy instance without needing
-to modify the tool configurations.
+to modify the tool configurations.
"""
-import logging, sys, os, os.path, tempfile, shutil
+import logging
+import os
+import os.path
+import shutil
+import tempfile
+
from galaxy import util
log = logging.getLogger( __name__ )
+
class ToolDataTableManager( object ):
"""Manages a collection of tool data tables"""
+
def __init__( self, tool_data_path, config_filename=None ):
self.tool_data_path = tool_data_path
# This stores all defined data table entries from both the tool_data_table_conf.xml file and the shed_tool_data_table_conf.xml file
# at server startup. If tool shed repositories are installed that contain a valid file named tool_data_table_conf.xml.sample, entries
# from that file are inserted into this dict at the time of installation.
- self.data_tables = {}
+ self.data_tables = {}
# Store config elements for on-the-fly persistence to the defined shed_tool_data_table_config file name.
self.shed_data_table_elems = []
self.data_table_elem_names = []
if config_filename:
self.load_from_config_file( config_filename, self.tool_data_path, from_shed_config=False )
+
def __getitem__( self, key ):
return self.data_tables.__getitem__( key )
+
def __contains__( self, key ):
return self.data_tables.__contains__( key )
+
def get( self, name, default=None ):
try:
return self[ name ]
except KeyError:
return default
+
def load_from_config_file( self, config_filename, tool_data_path, from_shed_config=False ):
"""
This method is called under 3 conditions:
@@ -65,6 +76,7 @@
if table_row not in self.data_tables[ table.name ].data:
self.data_tables[ table.name ].data.append( table_row )
return table_elems
+
def add_new_entries_from_config_file( self, config_filename, tool_data_path, shed_tool_data_table_config, persist=False ):
"""
This method is called when a tool shed repository that includes a tool_data_table_conf.xml.sample file is being
@@ -118,6 +130,7 @@
# Persist Galaxy's version of the changed tool_data_table_conf.xml file.
self.to_xml_file( shed_tool_data_table_config )
return table_elems, error_message
+
def to_xml_file( self, shed_tool_data_table_config ):
"""Write the current in-memory version of the shed_tool_data_table_conf.xml file to disk."""
full_path = os.path.abspath( shed_tool_data_table_config )
@@ -130,8 +143,10 @@
os.close( fd )
shutil.move( filename, full_path )
os.chmod( full_path, 0644 )
-
+
+
class ToolDataTable( object ):
+
def __init__( self, config_element, tool_data_path, from_shed_config = False):
self.name = config_element.get( 'name' )
self.comment_char = config_element.get( 'comment_char' )
@@ -146,14 +161,16 @@
self.tool_data_file = None
self.tool_data_path = tool_data_path
self.missing_index_file = None
+
def get_empty_field_by_name( self, name ):
return self.empty_field_values.get( name, self.empty_field_value )
-
+
+
class TabularToolDataTable( ToolDataTable ):
"""
Data stored in a tabular / separated value format on disk, allows multiple
files to be merged but all must have the same column definitions::
-
+
<table type="tabular" name="test"><column name='...' index = '...' /><file path="..." />
@@ -161,9 +178,9 @@
</table>
"""
-
+
type_key = 'tabular'
-
+
def __init__( self, config_element, tool_data_path, from_shed_config = False):
super( TabularToolDataTable, self ).__init__( config_element, tool_data_path, from_shed_config)
self.configure_and_load( config_element, tool_data_path, from_shed_config)
@@ -224,8 +241,8 @@
with a name and index (as in dynamic options config), or a shorthand
comma separated list of names in order as the text of a 'column_names'
element.
-
- A column named 'value' is required.
+
+ A column named 'value' is required.
"""
self.columns = {}
if config_element.find( 'columns' ) is not None:
@@ -254,7 +271,7 @@
def parse_file_fields( self, reader ):
"""
Parse separated lines from file and return a list of tuples.
-
+
TODO: Allow named access to fields using the column names.
"""
separator_char = (lambda c: '<TAB>' if c == '\t' else c)(self.separator)
@@ -270,10 +287,10 @@
rval.append( fields )
else:
log.warn( "Line %i in tool data table '%s' is invalid (HINT: "
- "'%s' characters must be used to separate fields):\n%s"
+ "'%s' characters must be used to separate fields):\n%s"
% ( ( i + 1 ), self.name, separator_char, line ) )
return rval
-
+
def get_column_name_list( self ):
rval = []
for i in range( self.largest_index + 1 ):
@@ -286,7 +303,7 @@
if not found_column:
rval.append( None )
return rval
-
+
def get_entry( self, query_attr, query_val, return_attr, default=None ):
"""
Returns table entry associated with a col/val pair.
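A minimal usage sketch for the manager and table classes in this file (assuming a Galaxy checkout with lib/ on sys.path); the table name 'test', the 'value'/'path' columns, and the 'hg19' lookup value mirror the docstring example and common .loc conventions, not anything in this changeset.

from galaxy.tools.data import ToolDataTableManager

manager = ToolDataTableManager( tool_data_path='tool-data',
                                config_filename='tool_data_table_conf.xml' )
table = manager.get( 'test' )                  # tables are looked up by name
if table is not None:
    print table.get_column_name_list()         # e.g. [ 'value', 'path' ]
    # the 'path' column of the row whose 'value' column equals 'hg19'
    print table.get_entry( 'value', 'hg19', 'path' )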
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
13 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/5b6306be4e4c/
Changeset: 5b6306be4e4c
User: jmchilton
Date: 2013-03-22 06:41:08
Summary: Refactor tool loading and macro preprocessing stuff into its own module, implement unit tests for basic functionality.
Affected #: 2 files
diff -r 14ab08c0fbfe3e9735e94f4e399aeabc870fe4b2 -r 5b6306be4e4c7f20b6b4e473449cbe7a02ebe921 lib/galaxy/tools/__init__.py
--- a/lib/galaxy/tools/__init__.py
+++ b/lib/galaxy/tools/__init__.py
@@ -29,7 +29,6 @@
from mako.template import Template
from paste import httpexceptions
from sqlalchemy import and_
-from copy import deepcopy
from galaxy import jobs, model
from galaxy.datatypes.metadata import JobExternalOutputMetadataWrapper
@@ -58,6 +57,7 @@
from galaxy.web import url_for
from galaxy.web.form_builder import SelectField
from tool_shed.util import shed_util_common
+from .loader import load_tool
log = logging.getLogger( __name__ )
@@ -577,7 +577,7 @@
def load_tool( self, config_file, guid=None, **kwds ):
"""Load a single tool from the file named by `config_file` and return an instance of `Tool`."""
# Parse XML configuration file and get the root element
- tree = self._load_and_preprocess_tool_xml( config_file )
+ tree = load_tool( config_file )
root = tree.getroot()
# Allow specifying a different tool subclass to instantiate
if root.find( "type" ) is not None:
@@ -762,102 +762,6 @@
return rval
- def _load_and_preprocess_tool_xml(self, config_file):
- tree = parse_xml(config_file)
- root = tree.getroot()
- macros_el = root.find('macros')
- if not macros_el:
- return tree
- tool_dir = os.path.dirname(config_file)
- macros = self._load_macros(macros_el, tool_dir)
-
- self._expand_macros([root], macros)
- return tree
-
- def _expand_macros(self, elements, macros):
- for element in elements:
- # HACK for elementtree, newer implementations (etree/lxml) won't
- # require this parent_map data structure but elementtree does not
- # track parents or recognize .find('..').
- parent_map = dict((c, p) for p in element.getiterator() for c in p)
- for expand_el in element.findall('.//expand'):
- macro_name = expand_el.get('macro')
- macro_def = deepcopy(macros[macro_name]) # deepcopy needed?
-
- yield_els = [yield_el for macro_def_el in macro_def for yield_el in macro_def_el.findall('.//yield')]
-
- expand_el_children = expand_el.getchildren()
- macro_def_parent_map = \
- dict((c, p) for macro_def_el in macro_def for p in macro_def_el.getiterator() for c in p)
-
- for yield_el in yield_els:
- self._xml_replace(yield_el, expand_el_children, macro_def_parent_map)
-
- # Recursively expand contained macros.
- self._expand_macros(macro_def, macros)
- self._xml_replace(expand_el, macro_def, parent_map)
-
- def _load_macros(self, macros_el, tool_dir):
- macros = {}
- # Import macros from external files.
- macros.update(self._load_imported_macros(macros_el, tool_dir))
- # Load all directly defined macros.
- macros.update(self._load_embedded_macros(macros_el, tool_dir))
- return macros
-
- def _load_embedded_macros(self, macros_el, tool_dir):
- macros = {}
-
- macro_els = []
- if macros_el:
- macro_els = macros_el.findall("macro")
- for macro in macro_els:
- macro_name = macro.get("name")
- macros[macro_name] = self._load_macro_def(macro)
-
- return macros
-
- def _load_imported_macros(self, macros_el, tool_dir):
- macros = {}
-
- macro_import_els = []
- if macros_el:
- macro_import_els = macros_el.findall("import")
- for macro_import_el in macro_import_els:
- raw_import_path = macro_import_el.text
- tool_relative_import_path = \
- os.path.basename(raw_import_path) # Sanitize this
- import_path = \
- os.path.join(tool_dir, tool_relative_import_path)
- file_macros = self._load_macro_file(import_path, tool_dir)
- macros.update(file_macros)
-
- return macros
-
- def _load_macro_file(self, path, tool_dir):
- tree = parse_xml(path)
- root = tree.getroot()
- return self._load_macros(root, tool_dir)
-
- def _load_macro_def(self, macro):
- return list(macro.getchildren())
-
- def _xml_replace(self, query, targets, parent_map):
- #parent_el = query.find('..') ## Something like this would be better with newer xml library
- parent_el = parent_map[query]
- matching_index = -1
- #for index, el in enumerate(parent_el.iter('.')): ## Something like this for newer implementation
- for index, el in enumerate(parent_el.getchildren()):
- if el == query:
- matching_index = index
- break
- assert matching_index >= 0
- current_index = matching_index
- for target in targets:
- current_index += 1
- parent_el.insert(current_index, deepcopy(target))
- parent_el.remove(query)
-
class ToolSection( object ):
"""
diff -r 14ab08c0fbfe3e9735e94f4e399aeabc870fe4b2 -r 5b6306be4e4c7f20b6b4e473449cbe7a02ebe921 lib/galaxy/tools/loader.py
--- /dev/null
+++ b/lib/galaxy/tools/loader.py
@@ -0,0 +1,223 @@
+from __future__ import with_statement
+
+from copy import deepcopy
+import os
+
+from galaxy.util import parse_xml
+
+
+def load_tool(path):
+ """
+ Loads tool from file system and preprocesses tool macros.
+ """
+ tree = parse_xml(path)
+ root = tree.getroot()
+ macros_el = root.find('macros')
+ if not macros_el:
+ return tree
+ tool_dir = os.path.dirname(path)
+ macros = _load_macros(macros_el, tool_dir)
+
+ _expand_macros([root], macros)
+ return tree
+
+
+def _expand_macros(elements, macros):
+ for element in elements:
+ # HACK for elementtree, newer implementations (etree/lxml) won't
+ # require this parent_map data structure but elementtree does not
+ # track parents or recognize .find('..').
+ parent_map = dict((c, p) for p in element.getiterator() for c in p)
+ for expand_el in element.findall('.//expand'):
+ macro_name = expand_el.get('macro')
+ macro_def = deepcopy(macros[macro_name]) # deepcopy needed?
+
+ yield_els = [yield_el for macro_def_el in macro_def for yield_el in macro_def_el.findall('.//yield')]
+
+ expand_el_children = expand_el.getchildren()
+ macro_def_parent_map = \
+ dict((c, p) for macro_def_el in macro_def for p in macro_def_el.getiterator() for c in p)
+
+ for yield_el in yield_els:
+ _xml_replace(yield_el, expand_el_children, macro_def_parent_map)
+
+ # Recursively expand contained macros.
+ _expand_macros(macro_def, macros)
+ _xml_replace(expand_el, macro_def, parent_map)
+
+
+def _load_macros(macros_el, tool_dir):
+ macros = {}
+ # Import macros from external files.
+ macros.update(_load_imported_macros(macros_el, tool_dir))
+ # Load all directly defined macros.
+ macros.update(_load_embedded_macros(macros_el, tool_dir))
+ return macros
+
+
+def _load_embedded_macros(macros_el, tool_dir):
+ macros = {}
+
+ macro_els = []
+ if macros_el:
+ macro_els = macros_el.findall("macro")
+ for macro in macro_els:
+ macro_name = macro.get("name")
+ macros[macro_name] = _load_macro_def(macro)
+
+ return macros
+
+
+def _load_imported_macros(macros_el, tool_dir):
+ macros = {}
+
+ macro_import_els = []
+ if macros_el:
+ macro_import_els = macros_el.findall("import")
+ for macro_import_el in macro_import_els:
+ raw_import_path = macro_import_el.text
+ tool_relative_import_path = \
+ os.path.basename(raw_import_path) # Sanitize this
+ import_path = \
+ os.path.join(tool_dir, tool_relative_import_path)
+ file_macros = _load_macro_file(import_path, tool_dir)
+ macros.update(file_macros)
+
+ return macros
+
+
+def _load_macro_file(path, tool_dir):
+ tree = parse_xml(path)
+ root = tree.getroot()
+ return _load_macros(root, tool_dir)
+
+
+def _load_macro_def(macro):
+ return list(macro.getchildren())
+
+
+def _xml_replace(query, targets, parent_map):
+ #parent_el = query.find('..') ## Something like this would be better with newer xml library
+ parent_el = parent_map[query]
+ matching_index = -1
+ #for index, el in enumerate(parent_el.iter('.')): ## Something like this for newer implementation
+ for index, el in enumerate(parent_el.getchildren()):
+ if el == query:
+ matching_index = index
+ break
+ assert matching_index >= 0
+ current_index = matching_index
+ for target in targets:
+ current_index += 1
+ parent_el.insert(current_index, deepcopy(target))
+ parent_el.remove(query)
+
+
+def test_loader():
+ """
+ Function to test this module. Galaxy doesn't seem to have a
+ place to put unit tests that are not doctests. These tests can
+ be run with nosetests via the following command:
+
+ % nosetests --with-doctest lib/galaxy/tools/loader.py
+
+ """
+ from tempfile import mkdtemp
+ from shutil import rmtree
+
+ class TestToolDirectory(object):
+ def __init__(self):
+ self.temp_directory = mkdtemp()
+
+ def __enter__(self):
+ return self
+
+ def __exit__(self, type, value, tb):
+ rmtree(self.temp_directory)
+
+ def write(self, contents, name="tool.xml"):
+ open(os.path.join(self.temp_directory, name), "w").write(contents)
+
+ def load(self, name="tool.xml", preprocess=True):
+ if preprocess:
+ loader = load_tool
+ else:
+ loader = parse_xml
+ return loader(os.path.join(self.temp_directory, name))
+
+ ## Test simple macro replacement.
+ with TestToolDirectory() as tool_dir:
+ tool_dir.write('''
+<tool>
+ <expand macro="inputs" />
+ <macros>
+ <macro name="inputs">
+ <inputs />
+ </macro>
+ </macros>
+</tool>''')
+ xml = tool_dir.load(preprocess=False)
+ assert xml.find("inputs") is None
+ xml = tool_dir.load(preprocess=True)
+ assert xml.find("inputs") is not None
+
+ # Test importing macros from external files
+ with TestToolDirectory() as tool_dir:
+ tool_dir.write('''
+<tool>
+ <expand macro="inputs" />
+ <macros>
+ <import>external.xml</import>
+ </macros>
+</tool>''')
+
+ tool_dir.write('''
+<macros>
+ <macro name="inputs">
+ <inputs />
+ </macro>
+</macros>''', name="external.xml")
+ xml = tool_dir.load(preprocess=False)
+ assert xml.find("inputs") is None
+ xml = tool_dir.load(preprocess=True)
+ assert xml.find("inputs") is not None
+
+ # Test macros with unnamed yield statements.
+ with TestToolDirectory() as tool_dir:
+ tool_dir.write('''
+<tool>
+ <expand macro="inputs">
+ <input name="first_input" />
+ </expand>
+ <macros>
+ <macro name="inputs">
+ <inputs>
+ <yield />
+ </inputs>
+ </macro>
+ </macros>
+</tool>''')
+ xml = tool_dir.load()
+ assert xml.find("inputs").find("input").get("name") == "first_input"
+
+ # Test recursive macro applications.
+ with TestToolDirectory() as tool_dir:
+ tool_dir.write('''
+<tool>
+ <expand macro="inputs">
+ <input name="first_input" />
+ <expand macro="second" />
+ </expand>
+ <macros>
+ <macro name="inputs">
+ <inputs>
+ <yield />
+ </inputs>
+ </macro>
+ <macro name="second">
+ <input name="second_input" />
+ </macro>
+ </macros>
+</tool>''')
+ xml = tool_dir.load()
+ assert xml.find("inputs").findall("input")[1].get("name") == "second_input"
https://bitbucket.org/galaxy/galaxy-central/commits/8400b75d0a68/
Changeset: 8400b75d0a68
User: jmchilton
Date: 2013-03-22 06:41:08
Summary: Refactor macro stuff to allow multiple macro types (default type is xml, <xml> will be short for <macro type="xml">), improves ordering when interleaving imported and embedded macros.
Affected #: 1 file
diff -r 5b6306be4e4c7f20b6b4e473449cbe7a02ebe921 -r 8400b75d0a688ec4a6a07e2e5aa98ef9dcbb193c lib/galaxy/tools/loader.py
--- a/lib/galaxy/tools/loader.py
+++ b/lib/galaxy/tools/loader.py
@@ -13,12 +13,17 @@
tree = parse_xml(path)
root = tree.getroot()
macros_el = root.find('macros')
- if not macros_el:
- return tree
tool_dir = os.path.dirname(path)
- macros = _load_macros(macros_el, tool_dir)
- _expand_macros([root], macros)
+ if macros_el:
+ macro_els = _load_macros(macros_el, tool_dir)
+ _xml_set_children(macros_el, macro_els)
+
+ macro_dict = dict([(macro_el.get("name"), list(macro_el.getchildren())) \
+ for macro_el in macro_els \
+ if macro_el.get('type') == 'xml'])
+ _expand_macros([root], macro_dict)
+
return tree
@@ -30,6 +35,7 @@
parent_map = dict((c, p) for p in element.getiterator() for c in p)
for expand_el in element.findall('.//expand'):
macro_name = expand_el.get('macro')
+ print macros.keys()
macro_def = deepcopy(macros[macro_name]) # deepcopy needed?
yield_els = [yield_el for macro_def_el in macro_def for yield_el in macro_def_el.findall('.//yield')]
@@ -47,29 +53,43 @@
def _load_macros(macros_el, tool_dir):
- macros = {}
+ macros = []
# Import macros from external files.
- macros.update(_load_imported_macros(macros_el, tool_dir))
+ macros.extend(_load_imported_macros(macros_el, tool_dir))
# Load all directly defined macros.
- macros.update(_load_embedded_macros(macros_el, tool_dir))
+ macros.extend(_load_embedded_macros(macros_el, tool_dir))
return macros
def _load_embedded_macros(macros_el, tool_dir):
- macros = {}
+ macros = []
macro_els = []
+ # attribute typed macro
if macros_el:
macro_els = macros_el.findall("macro")
for macro in macro_els:
- macro_name = macro.get("name")
- macros[macro_name] = _load_macro_def(macro)
+ if 'type' not in macro.attrib:
+ macro.attrib['type'] = 'xml'
+ macros.append(macro)
+
+ # type shortcuts (<xml> is a shortcut for <macro type="xml",
+ # likewise for <template>.
+ typed_tag = ['xml']
+ for tag in typed_tag:
+ macro_els = []
+ if macros_el:
+ macro_els = macros_el.findall(tag)
+ for macro_el in macro_els:
+ macro_el.attrib['type'] = tag
+ macro_el.tag = 'macro'
+ macros.append(macro_el)
return macros
def _load_imported_macros(macros_el, tool_dir):
- macros = {}
+ macros = []
macro_import_els = []
if macros_el:
@@ -81,7 +101,7 @@
import_path = \
os.path.join(tool_dir, tool_relative_import_path)
file_macros = _load_macro_file(import_path, tool_dir)
- macros.update(file_macros)
+ macros.extend(file_macros)
return macros
@@ -96,6 +116,13 @@
return list(macro.getchildren())
+def _xml_set_children(element, new_children):
+ for old_child in element.getchildren():
+ element.remove(old_child)
+ for i, new_child in enumerate(new_children):
+ element.insert(i, new_child)
+
+
def _xml_replace(query, targets, parent_map):
#parent_el = query.find('..') ## Something like this would be better with newer xml library
parent_el = parent_map[query]
@@ -119,7 +146,7 @@
place to put unit tests that are not doctests. These tests can
be run with nosetests via the following command:
- % nosetests --with-doctest lib/galaxy/tools/loader.py
+ % nosetests lib/galaxy/tools/loader.py
"""
from tempfile import mkdtemp
@@ -221,3 +248,17 @@
</tool>''')
xml = tool_dir.load()
assert xml.find("inputs").findall("input")[1].get("name") == "second_input"
+
+ # Test <xml> is shortcut for macro type="xml"
+ with TestToolDirectory() as tool_dir:
+ tool_dir.write('''
+<tool>
+ <expand macro="inputs" />
+ <macros>
+ <xml name="inputs">
+ <inputs />
+ </xml>
+ </macros>
+</tool>''')
+ xml = tool_dir.load(preprocess=True)
+ assert xml.find("inputs") is not None
https://bitbucket.org/galaxy/galaxy-central/commits/96aae9b33613/
Changeset: 96aae9b33613
User: jmchilton
Date: 2013-03-22 06:41:08
Summary: Implement 'template' macros that define variables which are made available to cheetah templates.
Affected #: 2 files
diff -r 8400b75d0a688ec4a6a07e2e5aa98ef9dcbb193c -r 96aae9b336130398250524311b4b69c432a44966 lib/galaxy/tools/__init__.py
--- a/lib/galaxy/tools/__init__.py
+++ b/lib/galaxy/tools/__init__.py
@@ -57,7 +57,7 @@
from galaxy.web import url_for
from galaxy.web.form_builder import SelectField
from tool_shed.util import shed_util_common
-from .loader import load_tool
+from .loader import load_tool, template_macro_params
log = logging.getLogger( __name__ )
@@ -1246,6 +1246,7 @@
# thus hardcoded) FIXME: hidden parameters aren't
# parameters at all really, and should be passed in a different
# way, making this check easier.
+ self.template_macro_params = template_macro_params(root)
for param in self.inputs.values():
if not isinstance( param, ( HiddenToolParameter, BaseURLToolParameter ) ):
self.input_required = True
@@ -2367,7 +2368,7 @@
`to_param_dict_string` method of the associated input.
"""
param_dict = dict()
-
+ param_dict.update(self.template_macro_params)
# All parameters go into the param_dict
param_dict.update( incoming )
diff -r 8400b75d0a688ec4a6a07e2e5aa98ef9dcbb193c -r 96aae9b336130398250524311b4b69c432a44966 lib/galaxy/tools/loader.py
--- a/lib/galaxy/tools/loader.py
+++ b/lib/galaxy/tools/loader.py
@@ -12,22 +12,51 @@
"""
tree = parse_xml(path)
root = tree.getroot()
+
+ _import_macros(root, path)
+
+ # Expand xml macros
+ macro_dict = _macros_of_type(root, 'xml', lambda el: list(el.getchildren()))
+ _expand_macros([root], macro_dict)
+
+ return tree
+
+
+def template_macro_params(root):
+ """
+ Look for template macros and populate param_dict (for cheetah)
+ with these.
+ """
+ param_dict = {}
+ macro_dict = _macros_of_type(root, 'template', lambda el: el.text)
+ for key, value in macro_dict.iteritems():
+ param_dict[key] = value
+ return param_dict
+
+
+def _import_macros(root, path):
+ tool_dir = os.path.dirname(path)
macros_el = root.find('macros')
- tool_dir = os.path.dirname(path)
-
if macros_el:
macro_els = _load_macros(macros_el, tool_dir)
_xml_set_children(macros_el, macro_els)
- macro_dict = dict([(macro_el.get("name"), list(macro_el.getchildren())) \
+
+def _macros_of_type(root, type, el_func):
+ macros_el = root.find('macros')
+ macro_dict = {}
+ if macros_el:
+ macro_els = macros_el.findall('macro')
+ macro_dict = dict([(macro_el.get("name"), el_func(macro_el)) \
for macro_el in macro_els \
- if macro_el.get('type') == 'xml'])
- _expand_macros([root], macro_dict)
-
- return tree
+ if macro_el.get('type') == type])
+ return macro_dict
def _expand_macros(elements, macros):
+ if not macros:
+ return
+
for element in elements:
# HACK for elementtree, newer implementations (etree/lxml) won't
# require this parent_map data structure but elementtree does not
@@ -35,7 +64,6 @@
parent_map = dict((c, p) for p in element.getiterator() for c in p)
for expand_el in element.findall('.//expand'):
macro_name = expand_el.get('macro')
- print macros.keys()
macro_def = deepcopy(macros[macro_name]) # deepcopy needed?
yield_els = [yield_el for macro_def_el in macro_def for yield_el in macro_def_el.findall('.//yield')]
@@ -75,7 +103,7 @@
# type shortcuts (<xml> is a shortcut for <macro type="xml",
# likewise for <template>.
- typed_tag = ['xml']
+ typed_tag = ['template', 'xml']
for tag in typed_tag:
macro_els = []
if macros_el:
@@ -260,5 +288,20 @@
</xml>
</macros>
</tool>''')
- xml = tool_dir.load(preprocess=True)
+ xml = tool_dir.load()
assert xml.find("inputs") is not None
+
+ with TestToolDirectory() as tool_dir:
+ tool_dir.write('''
+<tool>
+ <command interpreter="python">tool_wrapper.py
+ #include source=$tool_params
+ </command>
+ <macros>
+ <template name="tool_params">-a 1 -b 2</template>
+ </macros>
+</tool>
+''')
+ xml = tool_dir.load()
+ params_dict = template_macro_params(xml.getroot())
+ assert params_dict['tool_params'] == "-a 1 -b 2"
https://bitbucket.org/galaxy/galaxy-central/commits/dd8e910f6d30/
Changeset: dd8e910f6d30
User: jmchilton
Date: 2013-03-22 06:41:08
Summary: Add GATK macros file and import in each wrapper.
Affected #: 17 files
diff -r 96aae9b336130398250524311b4b69c432a44966 -r dd8e910f6d30086bb8fcacdc400fe2ee2305d991 tools/gatk/analyze_covariates.xml
--- a/tools/gatk/analyze_covariates.xml
+++ b/tools/gatk/analyze_covariates.xml
@@ -3,6 +3,9 @@
<requirements><requirement type="package" version="1.4">gatk</requirement></requirements>
+ <macros>
+ <import>gatk_macros.xml</import>
+ </macros>
<command interpreter="python">gatk_wrapper.py
--max_jvm_heap_fraction "1"
--stdout "${output_log}"
diff -r 96aae9b336130398250524311b4b69c432a44966 -r dd8e910f6d30086bb8fcacdc400fe2ee2305d991 tools/gatk/count_covariates.xml
--- a/tools/gatk/count_covariates.xml
+++ b/tools/gatk/count_covariates.xml
@@ -4,6 +4,9 @@
<requirement type="package" version="1.4">gatk</requirement><requirement type="package">samtools</requirement></requirements>
+ <macros>
+ <import>gatk_macros.xml</import>
+ </macros>
<command interpreter="python">gatk_wrapper.py
--max_jvm_heap_fraction "1"
--stdout "${output_log}"
diff -r 96aae9b336130398250524311b4b69c432a44966 -r dd8e910f6d30086bb8fcacdc400fe2ee2305d991 tools/gatk/depth_of_coverage.xml
--- a/tools/gatk/depth_of_coverage.xml
+++ b/tools/gatk/depth_of_coverage.xml
@@ -4,6 +4,9 @@
<requirement type="package" version="1.4">gatk</requirement><requirement type="package">samtools</requirement></requirements>
+ <macros>
+ <import>gatk_macros.xml</import>
+ </macros>
<command interpreter="python">gatk_wrapper.py
--max_jvm_heap_fraction "1"
--stdout "${output_log}"
diff -r 96aae9b336130398250524311b4b69c432a44966 -r dd8e910f6d30086bb8fcacdc400fe2ee2305d991 tools/gatk/gatk_macros.xml
--- /dev/null
+++ b/tools/gatk/gatk_macros.xml
@@ -0,0 +1,2 @@
+<macros>
+</macros>
\ No newline at end of file
diff -r 96aae9b336130398250524311b4b69c432a44966 -r dd8e910f6d30086bb8fcacdc400fe2ee2305d991 tools/gatk/indel_realigner.xml
--- a/tools/gatk/indel_realigner.xml
+++ b/tools/gatk/indel_realigner.xml
@@ -4,6 +4,9 @@
<requirement type="package" version="1.4">gatk</requirement><requirement type="package">samtools</requirement></requirements>
+ <macros>
+ <import>gatk_macros.xml</import>
+ </macros>
<command interpreter="python">gatk_wrapper.py
--max_jvm_heap_fraction "1"
--stdout "${output_log}"
diff -r 96aae9b336130398250524311b4b69c432a44966 -r dd8e910f6d30086bb8fcacdc400fe2ee2305d991 tools/gatk/print_reads.xml
--- a/tools/gatk/print_reads.xml
+++ b/tools/gatk/print_reads.xml
@@ -4,6 +4,9 @@
<requirement type="package" version="1.4">gatk</requirement><requirement type="package">samtools</requirement></requirements>
+ <macros>
+ <import>gatk_macros.xml</import>
+ </macros>
<command interpreter="python">gatk_wrapper.py
--max_jvm_heap_fraction "1"
--stdout "${output_log}"
diff -r 96aae9b336130398250524311b4b69c432a44966 -r dd8e910f6d30086bb8fcacdc400fe2ee2305d991 tools/gatk/realigner_target_creator.xml
--- a/tools/gatk/realigner_target_creator.xml
+++ b/tools/gatk/realigner_target_creator.xml
@@ -4,6 +4,9 @@
<requirement type="package" version="1.3">gatk</requirement><requirement type="package">samtools</requirement></requirements>
+ <macros>
+ <import>gatk_macros.xml</import>
+ </macros>
<command interpreter="python">gatk_wrapper.py
--max_jvm_heap_fraction "1"
--stdout "${output_log}"
diff -r 96aae9b336130398250524311b4b69c432a44966 -r dd8e910f6d30086bb8fcacdc400fe2ee2305d991 tools/gatk/table_recalibration.xml
--- a/tools/gatk/table_recalibration.xml
+++ b/tools/gatk/table_recalibration.xml
@@ -4,6 +4,9 @@
<requirement type="package" version="1.4">gatk</requirement><requirement type="package">samtools</requirement></requirements>
+ <macros>
+ <import>gatk_macros.xml</import>
+ </macros>
<command interpreter="python">gatk_wrapper.py
--max_jvm_heap_fraction "1"
--stdout "${output_log}"
diff -r 96aae9b336130398250524311b4b69c432a44966 -r dd8e910f6d30086bb8fcacdc400fe2ee2305d991 tools/gatk/unified_genotyper.xml
--- a/tools/gatk/unified_genotyper.xml
+++ b/tools/gatk/unified_genotyper.xml
@@ -4,6 +4,9 @@
<requirement type="package" version="1.4">gatk</requirement><requirement type="package">samtools</requirement></requirements>
+ <macros>
+ <import>gatk_macros.xml</import>
+ </macros>
<command interpreter="python">gatk_wrapper.py
--max_jvm_heap_fraction "1"
--stdout "${output_log}"
diff -r 96aae9b336130398250524311b4b69c432a44966 -r dd8e910f6d30086bb8fcacdc400fe2ee2305d991 tools/gatk/variant_annotator.xml
--- a/tools/gatk/variant_annotator.xml
+++ b/tools/gatk/variant_annotator.xml
@@ -4,6 +4,9 @@
<requirement type="package" version="1.4">gatk</requirement><requirement type="package">samtools</requirement></requirements>
+ <macros>
+ <import>gatk_macros.xml</import>
+ </macros>
<command interpreter="python">gatk_wrapper.py
--max_jvm_heap_fraction "1"
--stdout "${output_log}"
diff -r 96aae9b336130398250524311b4b69c432a44966 -r dd8e910f6d30086bb8fcacdc400fe2ee2305d991 tools/gatk/variant_apply_recalibration.xml
--- a/tools/gatk/variant_apply_recalibration.xml
+++ b/tools/gatk/variant_apply_recalibration.xml
@@ -3,6 +3,9 @@
<requirements><requirement type="package" version="1.4">gatk</requirement></requirements>
+ <macros>
+ <import>gatk_macros.xml</import>
+ </macros>
<command interpreter="python">gatk_wrapper.py
--max_jvm_heap_fraction "1"
--stdout "${output_log}"
diff -r 96aae9b336130398250524311b4b69c432a44966 -r dd8e910f6d30086bb8fcacdc400fe2ee2305d991 tools/gatk/variant_combine.xml
--- a/tools/gatk/variant_combine.xml
+++ b/tools/gatk/variant_combine.xml
@@ -3,6 +3,9 @@
<requirements><requirement type="package" version="1.4">gatk</requirement></requirements>
+ <macros>
+ <import>gatk_macros.xml</import>
+ </macros>
<command interpreter="python">gatk_wrapper.py
--max_jvm_heap_fraction "1"
--stdout "${output_log}"
diff -r 96aae9b336130398250524311b4b69c432a44966 -r dd8e910f6d30086bb8fcacdc400fe2ee2305d991 tools/gatk/variant_eval.xml
--- a/tools/gatk/variant_eval.xml
+++ b/tools/gatk/variant_eval.xml
@@ -3,6 +3,9 @@
<requirements><requirement type="package" version="1.4">gatk</requirement></requirements>
+ <macros>
+ <import>gatk_macros.xml</import>
+ </macros>
<command interpreter="python">gatk_wrapper.py
#from binascii import hexlify
--max_jvm_heap_fraction "1"
diff -r 96aae9b336130398250524311b4b69c432a44966 -r dd8e910f6d30086bb8fcacdc400fe2ee2305d991 tools/gatk/variant_filtration.xml
--- a/tools/gatk/variant_filtration.xml
+++ b/tools/gatk/variant_filtration.xml
@@ -3,6 +3,9 @@
<requirements><requirement type="package" version="1.4">gatk</requirement></requirements>
+ <macros>
+ <import>gatk_macros.xml</import>
+ </macros>
<command interpreter="python">gatk_wrapper.py
#from binascii import hexlify
--max_jvm_heap_fraction "1"
diff -r 96aae9b336130398250524311b4b69c432a44966 -r dd8e910f6d30086bb8fcacdc400fe2ee2305d991 tools/gatk/variant_recalibrator.xml
--- a/tools/gatk/variant_recalibrator.xml
+++ b/tools/gatk/variant_recalibrator.xml
@@ -3,6 +3,9 @@
<requirements><requirement type="package" version="1.4">gatk</requirement></requirements>
+ <macros>
+ <import>gatk_macros.xml</import>
+ </macros>
<command interpreter="python">gatk_wrapper.py
--max_jvm_heap_fraction "1"
--stdout "${output_log}"
diff -r 96aae9b336130398250524311b4b69c432a44966 -r dd8e910f6d30086bb8fcacdc400fe2ee2305d991 tools/gatk/variant_select.xml
--- a/tools/gatk/variant_select.xml
+++ b/tools/gatk/variant_select.xml
@@ -3,6 +3,9 @@
<requirements><requirement type="package" version="1.4">gatk</requirement></requirements>
+ <macros>
+ <import>gatk_macros.xml</import>
+ </macros>
<command interpreter="python">gatk_wrapper.py
#from binascii import hexlify
--max_jvm_heap_fraction "1"
diff -r 96aae9b336130398250524311b4b69c432a44966 -r dd8e910f6d30086bb8fcacdc400fe2ee2305d991 tools/gatk/variants_validate.xml
--- a/tools/gatk/variants_validate.xml
+++ b/tools/gatk/variants_validate.xml
@@ -3,6 +3,9 @@
<requirements><requirement type="package" version="1.4">gatk</requirement></requirements>
+ <macros>
+ <import>gatk_macros.xml</import>
+ </macros>
<command interpreter="python">gatk_wrapper.py
--max_jvm_heap_fraction "1"
--stdout "${output_log}"
https://bitbucket.org/galaxy/galaxy-central/commits/1857f97b6772/
Changeset: 1857f97b6772
User: jmchilton
Date: 2013-03-22 06:41:08
Summary: Create shared cheetah template variable for gatk standard options.
Affected #: 16 files
Diff not available.
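Since the diff is not shown here, the following is only a sketch of what a shared Cheetah template variable for the GATK wrappers might look like, reusing the <template> macro support and template_macro_params() introduced above; the macro name gatk_standard_options, the temporary-directory setup, and the option text are illustrative assumptions, not taken from this commit.

import os, tempfile
from galaxy.tools.loader import load_tool, template_macro_params

tool_dir = tempfile.mkdtemp()
with open( os.path.join( tool_dir, 'gatk_macros.xml' ), 'w' ) as macro_file:
    macro_file.write( '''
<macros>
    <template name="gatk_standard_options">--max_jvm_heap_fraction "1" --stdout "${output_log}"</template>
</macros>''' )
with open( os.path.join( tool_dir, 'tool.xml' ), 'w' ) as tool_file:
    tool_file.write( '''
<tool>
    <command interpreter="python">gatk_wrapper.py
    #include source=$gatk_standard_options
    </command>
    <macros>
        <import>gatk_macros.xml</import>
    </macros>
</tool>''' )

tree = load_tool( os.path.join( tool_dir, 'tool.xml' ) )
params = template_macro_params( tree.getroot() )
# the text Cheetah would receive via '#include source=$gatk_standard_options'
print params[ 'gatk_standard_options' ]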
https://bitbucket.org/galaxy/galaxy-central/commits/0be733a1029e/
Changeset: 0be733a1029e
User: jmchilton
Date: 2013-03-22 06:41:08
Summary: Refactor big GATK parameter type conditional into shared macro for 15 GATK tools.
Affected #: 16 files
Diff not available.
https://bitbucket.org/galaxy/galaxy-central/commits/83ff3052f4b6/
Changeset: 83ff3052f4b6
User: jmchilton
Date: 2013-03-22 06:41:08
Summary: Implement another macro type (token) that performs direct string substitution on element text.
Affected #: 1 file
Diff not available.
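No diff is attached, so the sketch below only guesses at the token syntax from the summary (and from how token macros behave in later Galaxy releases); the <token> tag, the @THREADS@ naming convention, wrapper.py, and the expected output in the comment are all assumptions.

import os, tempfile
from galaxy.tools.loader import load_tool

tool_dir = tempfile.mkdtemp()
with open( os.path.join( tool_dir, 'tool.xml' ), 'w' ) as tool_file:
    tool_file.write( '''
<tool>
    <command interpreter="python">wrapper.py --threads @THREADS@</command>
    <macros>
        <token name="@THREADS@">4</token>
    </macros>
</tool>''' )

tree = load_tool( os.path.join( tool_dir, 'tool.xml' ) )
# if token expansion works as the summary describes, the token name is
# replaced directly in the element text
print tree.getroot().find( 'command' ).text   # expected: 'wrapper.py --threads 4'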
https://bitbucket.org/galaxy/galaxy-central/commits/6ec03dde4504/
Changeset: 6ec03dde4504
User: jmchilton
Date: 2013-03-22 06:41:08
Summary: Extend token macro expansion to include XML attributes. Optimize expansion performance slightly.
Affected #: 1 file
Diff not available.
https://bitbucket.org/galaxy/galaxy-central/commits/ca5333e3daaf/
Changeset: ca5333e3daaf
User: jmchilton
Date: 2013-03-22 06:41:08
Summary: Refactor GATK wrappers to use token macros to eliminate duplication in citation section.
Affected #: 17 files
Diff not available.
https://bitbucket.org/galaxy/galaxy-central/commits/c28e4c6eced2/
Changeset: c28e4c6eced2
User: jmchilton
Date: 2013-03-22 06:41:08
Summary: Convert more duplicated GATK wrapper code to macros.
Affected #: 16 files
Diff not available.
https://bitbucket.org/galaxy/galaxy-central/commits/71caf4b3d201/
Changeset: 71caf4b3d201
User: jmchilton
Date: 2013-03-22 06:41:08
Summary: Remove unused code in lib/galaxy/tools/loader.py
Affected #: 1 file
Diff not available.
https://bitbucket.org/galaxy/galaxy-central/commits/2c39e7cb976f/
Changeset: 2c39e7cb976f
User: jmchilton
Date: 2013-03-31 19:39:20
Summary: Tool Macros: Added test case for doubly recursive macros. Implement bug fix and refactor to simplify things and make problem/fix more obvious.
Affected #: 1 file
Diff not available.
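The diff is not included, but a doubly recursive case presumably looks like the earlier test_loader examples, with one macro expanding a second macro that itself expands a third; the macro names below are made up for illustration.

import os, tempfile
from galaxy.tools.loader import load_tool

tool_dir = tempfile.mkdtemp()
with open( os.path.join( tool_dir, 'tool.xml' ), 'w' ) as tool_file:
    tool_file.write( '''
<tool>
    <expand macro="outer" />
    <macros>
        <xml name="outer"><expand macro="middle" /></xml>
        <xml name="middle"><expand macro="inner" /></xml>
        <xml name="inner"><inputs /></xml>
    </macros>
</tool>''' )

xml = load_tool( os.path.join( tool_dir, 'tool.xml' ) )
# with the recursion fix, the innermost content should survive both expansions
assert xml.find( 'inputs' ) is not None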
https://bitbucket.org/galaxy/galaxy-central/commits/8fc56b85e0a5/
Changeset: 8fc56b85e0a5
User: jgoecks
Date: 2013-04-08 23:10:21
Summary: Merged in galaxyp/galaxy-central-parallelism-refactorings (pull request #140)
Improvements to Tool XML Macroing System
Affected #: 19 files
Diff not available.
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
2 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/commits/d7f37a2fe690/
Changeset: d7f37a2fe690
Branch: stable
User: natefoo
Date: 2013-04-08 18:28:46
Summary: Added tag security_2013.04.08 for changeset 2cc8d10988e0
Affected #: 1 file
diff -r 04c85ce163d3fb330c80fe61db4755cc446dd244 -r d7f37a2fe69089c97c4d121287736c5c15f37795 .hgtags
--- a/.hgtags
+++ b/.hgtags
@@ -1,3 +1,4 @@
a4113cc1cb5eaa68091c9a73375f00555b66dd11 release_2013.01.13
1c717491139269651bb59687563da9410b84c65d release_2013.02.08
75f09617abaadbc8cc732bb8ee519decaeb56ea7 release_2013.04.01
+2cc8d10988e03257dc7b97f8bb332c7df745d1dd security_2013.04.08
https://bitbucket.org/galaxy/galaxy-central/commits/788cd3d06541/
Changeset: 788cd3d06541
User: natefoo
Date: 2013-04-08 18:30:08
Summary: Merged stable.
Affected #: 1 file
diff -r 19f6e62bd372dc44e0d6b906fa2817122a5a57e4 -r 788cd3d065413b2611d82375a5d0d562775ea529 .hgtags
--- a/.hgtags
+++ b/.hgtags
@@ -1,3 +1,4 @@
a4113cc1cb5eaa68091c9a73375f00555b66dd11 release_2013.01.13
1c717491139269651bb59687563da9410b84c65d release_2013.02.08
75f09617abaadbc8cc732bb8ee519decaeb56ea7 release_2013.04.01
+2cc8d10988e03257dc7b97f8bb332c7df745d1dd security_2013.04.08
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.