galaxy-commits
3 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/489e80f86187/
changeset: 489e80f86187
user: jmchilton
date: 2012-09-26 04:49:22
summary: Ease tool development by reducing the clicking/keystrokes required of developers to repeatedly reload the same tool via the admin console.
affected #: 2 files
diff -r 20144ac9ffa7d15a9738cee2a5c8cb2b4cf1c401 -r 489e80f8618766c5ea3d655adaaea5e6b13d3bc2 lib/galaxy/web/base/controller.py
--- a/lib/galaxy/web/base/controller.py
+++ b/lib/galaxy/web/base/controller.py
@@ -1578,10 +1578,12 @@
message = util.restore_text( params.get( 'message', '' ) )
status = params.get( 'status', 'done' )
toolbox = self.app.toolbox
+ tool_id = None
if params.get( 'reload_tool_button', False ):
tool_id = params.tool_id
message, status = toolbox.reload_tool_by_id( tool_id )
return trans.fill_template( '/admin/reload_tool.mako',
+ tool_id=tool_id,
toolbox=toolbox,
message=message,
status=status )
diff -r 20144ac9ffa7d15a9738cee2a5c8cb2b4cf1c401 -r 489e80f8618766c5ea3d655adaaea5e6b13d3bc2 templates/admin/reload_tool.mako
--- a/templates/admin/reload_tool.mako
+++ b/templates/admin/reload_tool.mako
@@ -1,6 +1,17 @@
<%inherit file="/base.mako"/><%namespace file="/message.mako" import="render_msg" />
+<script type="text/javascript">
+$().ready(function() {
+%if tool_id:
+ var focus_el = $("input[name=reload_tool_button]");
+%else:
+ var focus_el = $("select[name=tool_id]");
+%endif
+ focus_el.focus();
+});
+</script>
+
%if message:
${render_msg( message, status )}
%endif
@@ -22,7 +33,11 @@
<% section = val %>
%for section_key, section_val in section.elems.items():
%if section_key.startswith( 'tool' ):
- <option value="${section_val.id}">${section_val.name}</option>
+ <% selected_str = "" %>
+ %if section_val.id == tool_id:
+ <% selected_str = " selected=\"selected\"" %>
+ %endif
+ <option value="${section_val.id}"${selected_str}>${section_val.name}</option>
%endif
%endfor
%endif
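The controller half of this change also guards against a template error: `tool_id` is initialized to `None` before the button check, so the Mako template can always reference it, whether or not the reload button was pressed. A minimal sketch of that pattern in plain Python (the function name and the returned dict are illustrative, not Galaxy's actual API; the real handler renders a Mako template):

```python
def reload_tool_view(params, toolbox):
    """Sketch of the admin reload-tool handler after this changeset.

    Initializing tool_id to None up front means the template can always
    reference it, even on the initial GET when no button was pressed.
    """
    message, status = "", "done"
    tool_id = None  # always defined, so the template never sees a NameError
    if params.get("reload_tool_button"):
        tool_id = params["tool_id"]
        message, status = toolbox.reload_tool_by_id(tool_id)
    # The template uses tool_id to preselect the reloaded tool and to
    # decide whether to focus the reload button or the tool select box.
    return {"tool_id": tool_id, "message": message, "status": status}
```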
https://bitbucket.org/galaxy/galaxy-central/changeset/a0b1af3a160f/
changeset: a0b1af3a160f
user: jmchilton
date: 2012-09-29 03:15:56
summary: Merge with galaxy-central changes.
affected #: 232 files
Diff too large to display.
https://bitbucket.org/galaxy/galaxy-central/changeset/869bcca8756a/
changeset: 869bcca8756a
user: dannon
date: 2012-10-24 16:54:37
summary: Merged in jmchilton/galaxy-central-reload-tool-improvements (pull request #72)
affected #: 3 files
diff -r 10342b1b167a6f7290b8f3dc1d00c2d2b77e046f -r 869bcca8756a32b0b2c482f1ea67623cce110051 lib/galaxy/web/base/controllers/admin.py
--- a/lib/galaxy/web/base/controllers/admin.py
+++ b/lib/galaxy/web/base/controllers/admin.py
@@ -56,10 +56,12 @@
message = util.restore_text( params.get( 'message', '' ) )
status = params.get( 'status', 'done' )
toolbox = self.app.toolbox
+ tool_id = None
if params.get( 'reload_tool_button', False ):
tool_id = params.tool_id
message, status = toolbox.reload_tool_by_id( tool_id )
return trans.fill_template( '/admin/reload_tool.mako',
+ tool_id=tool_id,
toolbox=toolbox,
message=message,
status=status )
@@ -1110,4 +1112,4 @@
# Load user from database
id = trans.security.decode_id( id )
quota = trans.sa_session.query( trans.model.Quota ).get( id )
- return quota
\ No newline at end of file
+ return quota
diff -r 10342b1b167a6f7290b8f3dc1d00c2d2b77e046f -r 869bcca8756a32b0b2c482f1ea67623cce110051 templates/admin/reload_tool.mako
--- a/templates/admin/reload_tool.mako
+++ b/templates/admin/reload_tool.mako
@@ -1,6 +1,17 @@
<%inherit file="/base.mako"/><%namespace file="/message.mako" import="render_msg" />
+<script type="text/javascript">
+$().ready(function() {
+%if tool_id:
+ var focus_el = $("input[name=reload_tool_button]");
+%else:
+ var focus_el = $("select[name=tool_id]");
+%endif
+ focus_el.focus();
+});
+</script>
+
%if message:
${render_msg( message, status )}
%endif
@@ -22,7 +33,11 @@
<% section = val %>
%for section_key, section_val in section.elems.items():
%if section_key.startswith( 'tool' ):
- <option value="${section_val.id}">${section_val.name}</option>
+ <% selected_str = "" %>
+ %if section_val.id == tool_id:
+ <% selected_str = " selected=\"selected\"" %>
+ %endif
+ <option value="${section_val.id}"${selected_str}>${section_val.name}</option>
%endif
%endfor
%endif
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
11 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/8220718b1eeb/
changeset: 8220718b1eeb
user: jmchilton
date: 2012-10-08 01:22:49
summary: Fix running tools with multi-data parameters. (Extracting and running workflows still broken.)
affected #: 2 files
diff -r 05fc04a70a3bcbfaeedfcf6f2ec16a7e38fc7c94 -r 8220718b1eebdcb1257a9520f9aa9d23f7b2d3e4 lib/galaxy/tools/__init__.py
--- a/lib/galaxy/tools/__init__.py
+++ b/lib/galaxy/tools/__init__.py
@@ -2224,6 +2224,13 @@
values = input_values[ input.name ]
current = values["__current_case__"]
wrap_values( input.cases[current].inputs, values )
+ elif isinstance( input, DataToolParameter ) and input.multiple:
+ values = input_values[ input.name ]
+ input_values[ input.name ] = \
+ [DatasetFilenameWrapper( value,
+ datatypes_registry = self.app.datatypes_registry,
+ tool = self,
+ name = input.name ) for value in values]
elif isinstance( input, DataToolParameter ):
## FIXME: We're populating param_dict with conversions when
## wrapping values, this should happen as a separate
diff -r 05fc04a70a3bcbfaeedfcf6f2ec16a7e38fc7c94 -r 8220718b1eebdcb1257a9520f9aa9d23f7b2d3e4 lib/galaxy/tools/parameters/basic.py
--- a/lib/galaxy/tools/parameters/basic.py
+++ b/lib/galaxy/tools/parameters/basic.py
@@ -1545,9 +1545,30 @@
else:
return trans.sa_session.query( trans.app.model.HistoryDatasetAssociation ).get( value )
+
+ # TODO: Determine if this needs to be overridden. -John
+ def value_from_basic( self, value, app, ignore_errors=False ):
+ # HACK: Some things don't deal with unicode well, psycopg problem?
+ if type( value ) == unicode:
+ value = str( value )
+ # Handle Runtime values (valid for any parameter?)
+ if isinstance( value, dict ) and '__class__' in value and value['__class__'] == "RuntimeValue":
+ return RuntimeValue()
+ # Delegate to the 'to_python' method
+ if ignore_errors:
+ try:
+ return self.to_python( value, app )
+ except:
+ return value
+ else:
+ return self.to_python( value, app )
+
+
def to_string( self, value, app ):
if value is None or isinstance( value, str ):
return value
+ elif isinstance( value, list ):
+ return ",".join( [ val if isinstance( val, str ) else str(val.id) for val in value] )
elif isinstance( value, DummyDataset ):
return None
return value.id
@@ -1557,6 +1578,10 @@
# indicates that the dataset is optional, while '' indicates that it is not.
if value is None or value == '' or value == 'None':
return value
+ if isinstance(value, str) and value.find(",") > -1:
+ values = value.split(",")
+ # TODO: Optimize. -John
+ return [app.model.context.query( app.model.HistoryDatasetAssociation ).get( int( val ) ) for val in values]
return app.model.context.query( app.model.HistoryDatasetAssociation ).get( int( value ) )
def to_param_dict_string( self, value, other_values={} ):
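The `to_string`/`to_python` additions above round-trip a multi-input value through saved tool state by joining dataset ids with commas. A simplified sketch of that serialization, using bare integer ids in place of HistoryDatasetAssociation objects and omitting the database lookup:

```python
def to_string(value):
    # Serialize tool state: a list of dataset ids becomes "3,7,9".
    if value is None:
        return value
    if isinstance(value, list):
        return ",".join(str(v) for v in value)
    return str(value)

def to_python(value):
    # Restore tool state: a comma marks a multi-input parameter, so the
    # value is split back into a list of ids.
    if value is None or value in ("", "None"):
        return value
    if "," in value:
        return [int(v) for v in value.split(",")]
    return int(value)
```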
https://bitbucket.org/galaxy/galaxy-central/changeset/4a3d4de46132/
changeset: 4a3d4de46132
user: jmchilton
date: 2012-10-08 01:29:54
summary: Fix extracting workflows containing tools with multi-input parameters.
affected #: 1 file
diff -r 8220718b1eebdcb1257a9520f9aa9d23f7b2d3e4 -r 4a3d4de46132de8e353b48640512fdefab7ea7ca lib/galaxy/webapps/galaxy/controllers/workflow.py
--- a/lib/galaxy/webapps/galaxy/controllers/workflow.py
+++ b/lib/galaxy/webapps/galaxy/controllers/workflow.py
@@ -2054,7 +2054,11 @@
# still need to clean them up so we can serialize
# if not( prefix ):
if tmp: #this is false for a non-set optional dataset
- associations.append( ( tmp.hid, prefix + key ) )
+ if not isinstance(tmp, list):
+ associations.append( ( tmp.hid, prefix + key ) )
+ else:
+ associations.extend( [ (t.hid, prefix + key) for t in tmp] )
+
# Cleanup the other deprecated crap associated with datasets
# as well. Worse, for nested datasets all the metadata is
# being pushed into the root. FIXME: MUST REMOVE SOON
https://bitbucket.org/galaxy/galaxy-central/changeset/65024c60d545/
changeset: 65024c60d545
user: jmchilton
date: 2012-10-08 02:36:46
summary: Progress toward multi-inputs: the initial run-workflow input page renders; workflows still do not execute; standard workflows seem to continue to work.
affected #: 3 files
diff -r 4a3d4de46132de8e353b48640512fdefab7ea7ca -r 65024c60d5452cad3d410b41046a472ba3109e41 lib/galaxy/tools/parameters/basic.py
--- a/lib/galaxy/tools/parameters/basic.py
+++ b/lib/galaxy/tools/parameters/basic.py
@@ -1567,10 +1567,12 @@
def to_string( self, value, app ):
if value is None or isinstance( value, str ):
return value
+ elif isinstance( value, DummyDataset ):
+ return None
+ elif isinstance( value, list) and len(value) > 0 and isinstance( value[0], DummyDataset):
+ return None
elif isinstance( value, list ):
return ",".join( [ val if isinstance( val, str ) else str(val.id) for val in value] )
- elif isinstance( value, DummyDataset ):
- return None
return value.id
def to_python( self, value, app ):
diff -r 4a3d4de46132de8e353b48640512fdefab7ea7ca -r 65024c60d5452cad3d410b41046a472ba3109e41 lib/galaxy/webapps/galaxy/controllers/workflow.py
--- a/lib/galaxy/webapps/galaxy/controllers/workflow.py
+++ b/lib/galaxy/webapps/galaxy/controllers/workflow.py
@@ -1362,6 +1362,13 @@
# Connections by input name
step.input_connections_by_name = \
dict( ( conn.input_name, conn ) for conn in step.input_connections )
+ input_connections_by_name = {}
+ for conn in step.input_connections:
+ input_name = conn.input_name
+ if not input_name in input_connections_by_name:
+ input_connections_by_name[input_name] = []
+ input_connections_by_name[input_name].append(conn)
+ step.input_connections_by_name = input_connections_by_name
# Extract just the arguments for this step by prefix
p = "%s|" % step.id
l = len(p)
@@ -1419,10 +1426,15 @@
tool = trans.app.toolbox.get_tool( step.tool_id )
# Connect up
def callback( input, value, prefixed_name, prefixed_label ):
+ replacement = None
if isinstance( input, DataToolParameter ):
if prefixed_name in step.input_connections_by_name:
conn = step.input_connections_by_name[ prefixed_name ]
- return outputs[ conn.output_step.id ][ conn.output_name ]
+ if input.multiple:
+ replacements = [outputs[ c.output_step.id ][ c.output_name ] for c in conn]
+ else:
+ replacement = outputs[ conn[0].output_step.id ][ conn[0].output_name ]
+ return replacement
try:
visit_input_values( tool.inputs, step.state.inputs, callback )
except KeyError, k:
diff -r 4a3d4de46132de8e353b48640512fdefab7ea7ca -r 65024c60d5452cad3d410b41046a472ba3109e41 lib/galaxy/workflow/modules.py
--- a/lib/galaxy/workflow/modules.py
+++ b/lib/galaxy/workflow/modules.py
@@ -336,9 +336,14 @@
# Any connected input needs to have value DummyDataset (these
# are not persisted so we need to do it every time)
def callback( input, value, prefixed_name, prefixed_label ):
+ replacement = None
if isinstance( input, DataToolParameter ):
if connections is None or prefixed_name in input_connections_by_name:
- return DummyDataset()
+ if input.multiple:
+ replacement = [DummyDataset() for conn in connections]
+ else:
+ replacement = DummyDataset()
+ return replacement
visit_input_values( self.tool.inputs, self.state.inputs, callback )
class WorkflowModuleFactory( object ):
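The `input_connections_by_name` change above replaces a name-to-connection dict with a name-to-list-of-connections dict, so a multi-input terminal can carry several incoming connections under one name. The same grouping, sketched with `collections.defaultdict` (the `Conn` record here is an illustrative stand-in for Galaxy's WorkflowStepConnection):

```python
from collections import defaultdict, namedtuple

# Illustrative stand-in for a workflow step connection.
Conn = namedtuple("Conn", "input_name output_name")

def group_connections(connections):
    # Group every connection under its input name; a multi-input
    # parameter ends up with a list of connections instead of one.
    by_name = defaultdict(list)
    for conn in connections:
        by_name[conn.input_name].append(conn)
    return dict(by_name)
```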
https://bitbucket.org/galaxy/galaxy-central/changeset/565510047933/
changeset: 565510047933
user: jmchilton
date: 2012-10-08 02:49:02
summary: Cleanup some artifacts from last few multi-input commits.
affected #: 2 files
diff -r 65024c60d5452cad3d410b41046a472ba3109e41 -r 56551004793378fb618fcfc0214669f307dea594 lib/galaxy/tools/parameters/basic.py
--- a/lib/galaxy/tools/parameters/basic.py
+++ b/lib/galaxy/tools/parameters/basic.py
@@ -1545,25 +1545,6 @@
else:
return trans.sa_session.query( trans.app.model.HistoryDatasetAssociation ).get( value )
-
- # TODO: Determine if this needs to be overridden. -John
- def value_from_basic( self, value, app, ignore_errors=False ):
- # HACK: Some things don't deal with unicode well, psycopg problem?
- if type( value ) == unicode:
- value = str( value )
- # Handle Runtime values (valid for any parameter?)
- if isinstance( value, dict ) and '__class__' in value and value['__class__'] == "RuntimeValue":
- return RuntimeValue()
- # Delegate to the 'to_python' method
- if ignore_errors:
- try:
- return self.to_python( value, app )
- except:
- return value
- else:
- return self.to_python( value, app )
-
-
def to_string( self, value, app ):
if value is None or isinstance( value, str ):
return value
diff -r 65024c60d5452cad3d410b41046a472ba3109e41 -r 56551004793378fb618fcfc0214669f307dea594 lib/galaxy/webapps/galaxy/controllers/workflow.py
--- a/lib/galaxy/webapps/galaxy/controllers/workflow.py
+++ b/lib/galaxy/webapps/galaxy/controllers/workflow.py
@@ -1360,8 +1360,6 @@
for step in workflow.steps:
step.upgrade_messages = {}
# Connections by input name
- step.input_connections_by_name = \
- dict( ( conn.input_name, conn ) for conn in step.input_connections )
input_connections_by_name = {}
for conn in step.input_connections:
input_name = conn.input_name
https://bitbucket.org/galaxy/galaxy-central/changeset/709ce65ca6b2/
changeset: 709ce65ca6b2
user: jmchilton
date: 2012-10-08 03:19:40
summary: Fix a typo; workflows containing multi-dataset inputs now run.
affected #: 1 file
diff -r 56551004793378fb618fcfc0214669f307dea594 -r 709ce65ca6b29b15e89a0913546ffd67e1d3c19d lib/galaxy/webapps/galaxy/controllers/workflow.py
--- a/lib/galaxy/webapps/galaxy/controllers/workflow.py
+++ b/lib/galaxy/webapps/galaxy/controllers/workflow.py
@@ -1429,11 +1429,12 @@
if prefixed_name in step.input_connections_by_name:
conn = step.input_connections_by_name[ prefixed_name ]
if input.multiple:
- replacements = [outputs[ c.output_step.id ][ c.output_name ] for c in conn]
+ replacement = [outputs[ c.output_step.id ][ c.output_name ] for c in conn]
else:
replacement = outputs[ conn[0].output_step.id ][ conn[0].output_name ]
return replacement
try:
+ # Replace DummyDatasets with historydatasetassociations
visit_input_values( tool.inputs, step.state.inputs, callback )
except KeyError, k:
error( "Error due to input mapping of '%s' in '%s'. A common cause of this is conditional outputs that cannot be determined until runtime, please review your workflow." % (tool.name, k.message))
https://bitbucket.org/galaxy/galaxy-central/changeset/bbeb3f333beb/
changeset: bbeb3f333beb
user: jmchilton
date: 2012-10-12 19:01:06
summary: Fix from Jim Johnson for rerunning jobs with multi-input data parameters.
affected #: 1 file
diff -r 709ce65ca6b29b15e89a0913546ffd67e1d3c19d -r bbeb3f333beb6f09c7a32ef9299c362f999ff79b lib/galaxy/webapps/galaxy/controllers/tool_runner.py
--- a/lib/galaxy/webapps/galaxy/controllers/tool_runner.py
+++ b/lib/galaxy/webapps/galaxy/controllers/tool_runner.py
@@ -199,6 +199,12 @@
if isinstance( value, UnvalidatedValue ):
return str( value )
if isinstance( input, DataToolParameter ):
+ if isinstance(value,list):
+ values = []
+ for val in value:
+ if val not in history.datasets and val in hda_source_dict:
+ values.append( hda_source_dict[ val ])
+ return values
if value not in history.datasets and value in hda_source_dict:
return hda_source_dict[ value ]
visit_input_values( tool.inputs, params_objects, rerun_callback )
https://bitbucket.org/galaxy/galaxy-central/changeset/db6cabd6760c/
changeset: db6cabd6760c
user: jmchilton
date: 2012-10-16 16:13:32
summary: Initial work from JJ on enhancing the workflow editor to allow multiple input data parameters. Workflows containing such connections will now display properly, though they cannot be modified.
affected #: 3 files
diff -r bbeb3f333beb6f09c7a32ef9299c362f999ff79b -r db6cabd6760c09f9a5b49aed7e38e4ef991fee97 lib/galaxy/webapps/galaxy/controllers/workflow.py
--- a/lib/galaxy/webapps/galaxy/controllers/workflow.py
+++ b/lib/galaxy/webapps/galaxy/controllers/workflow.py
@@ -741,12 +741,14 @@
}
# Connections
input_connections = step.input_connections
+ multiple_input = {} # Boolean value indicating if this can be mutliple
if step.type is None or step.type == 'tool':
# Determine full (prefixed) names of valid input datasets
data_input_names = {}
def callback( input, value, prefixed_name, prefixed_label ):
if isinstance( input, DataToolParameter ):
data_input_names[ prefixed_name ] = True
+ multiple_input[input.name] = input.multiple
visit_input_values( module.tool.inputs, module.state.inputs, callback )
# Filter
# FIXME: this removes connection without displaying a message currently!
@@ -766,8 +768,14 @@
# Encode input connections as dictionary
input_conn_dict = {}
for conn in input_connections:
- input_conn_dict[ conn.input_name ] = \
- dict( id=conn.output_step.order_index, output_name=conn.output_name )
+ conn_dict = dict( id=conn.output_step.order_index, output_name=conn.output_name )
+ if conn.input_name in multiple_input:
+ if conn.input_name in input_conn_dict:
+ input_conn_dict[ conn.input_name ].append( conn_dict )
+ else:
+ input_conn_dict[ conn.input_name ] = [ conn_dict ]
+ else:
+ input_conn_dict[ conn.input_name ] = conn_dict
step_dict['input_connections'] = input_conn_dict
# Position
step_dict['position'] = step.position
@@ -832,13 +840,15 @@
# Second pass to deal with connections between steps
for step in steps:
# Input connections
- for input_name, conn_dict in step.temp_input_connections.iteritems():
- if conn_dict:
- conn = model.WorkflowStepConnection()
- conn.input_step = step
- conn.input_name = input_name
- conn.output_name = conn_dict['output_name']
- conn.output_step = steps_by_external_id[ conn_dict['id'] ]
+ for input_name, conns in step.temp_input_connections.iteritems():
+ if conns:
+ conn_dicts = conns if isinstance(conns,list) else [conns]
+ for conn_dict in conn_dicts:
+ conn = model.WorkflowStepConnection()
+ conn.input_step = step
+ conn.input_name = input_name
+ conn.output_name = conn_dict['output_name']
+ conn.output_step = steps_by_external_id[ conn_dict['id'] ]
del step.temp_input_connections
# Order the steps if possible
attach_ordered_steps( workflow, steps )
diff -r bbeb3f333beb6f09c7a32ef9299c362f999ff79b -r db6cabd6760c09f9a5b49aed7e38e4ef991fee97 lib/galaxy/workflow/modules.py
--- a/lib/galaxy/workflow/modules.py
+++ b/lib/galaxy/workflow/modules.py
@@ -263,6 +263,7 @@
data_inputs.append( dict(
name=prefixed_name,
label=prefixed_label,
+ multiple=input.multiple,
extensions=input.extensions ) )
visit_input_values( self.tool.inputs, self.state.inputs, callback )
return data_inputs
@@ -340,7 +341,7 @@
if isinstance( input, DataToolParameter ):
if connections is None or prefixed_name in input_connections_by_name:
if input.multiple:
- replacement = [DummyDataset() for conn in connections]
+ replacement = [] if not connections else [DummyDataset() for conn in connections]
else:
replacement = DummyDataset()
return replacement
diff -r bbeb3f333beb6f09c7a32ef9299c362f999ff79b -r db6cabd6760c09f9a5b49aed7e38e4ef991fee97 static/scripts/galaxy.workflow_editor.canvas.js
--- a/static/scripts/galaxy.workflow_editor.canvas.js
+++ b/static/scripts/galaxy.workflow_editor.canvas.js
@@ -34,16 +34,17 @@
OutputTerminal.prototype = new Terminal();
-function InputTerminal( element, datatypes ) {
+function InputTerminal( element, datatypes, multiple ) {
Terminal.call( this, element );
this.datatypes = datatypes;
+ this.multiple = multiple
}
InputTerminal.prototype = new Terminal();
$.extend( InputTerminal.prototype, {
can_accept: function ( other ) {
- if ( this.connectors.length < 1 ) {
+ if ( this.connectors.length < 1 || this.multiple) {
for ( var t in this.datatypes ) {
var cat_outputs = new Array();
cat_outputs = cat_outputs.concat(other.datatypes);
@@ -163,10 +164,10 @@
this.tool_errors = {};
}
$.extend( Node.prototype, {
- enable_input_terminal : function( elements, name, types ) {
+ enable_input_terminal : function( elements, name, types, multiple ) {
var node = this;
$(elements).each( function() {
- var terminal = this.terminal = new InputTerminal( this, types );
+ var terminal = this.terminal = new InputTerminal( this, types, multiple );
terminal.node = node;
terminal.name = name;
$(this).bind( "dropinit", function( e, d ) {
@@ -304,7 +305,7 @@
var ibox = $("<div class='inputs'></div>").appendTo( b );
$.each( data.data_inputs, function( i, input ) {
var t = $("<div class='terminal input-terminal'></div>");
- node.enable_input_terminal( t, input.name, input.extensions );
+ node.enable_input_terminal( t, input.name, input.extensions, input.multiple );
var ib = $("<div class='form-row dataRow input-data-row' name='" + input.name + "'>" + input.label + "</div>" );
ib.css({ position:'absolute',
left: -1000,
@@ -407,7 +408,7 @@
var old = old_body.find( "div.input-data-row");
$.each( data.data_inputs, function( i, input ) {
var t = $("<div class='terminal input-terminal'></div>");
- node.enable_input_terminal( t, input.name, input.extensions );
+ node.enable_input_terminal( t, input.name, input.extensions, input.multiple );
// If already connected save old connection
old_body.find( "div[name='" + input.name + "']" ).each( function() {
$(this).find( ".input-terminal" ).each( function() {
@@ -545,8 +546,10 @@
input_connections[ t.name ] = null;
// There should only be 0 or 1 connectors, so this is
// really a sneaky if statement
+ var cons = []
$.each( t.connectors, function ( i, c ) {
- input_connections[ t.name ] = { id: c.handle1.node.id, output_name: c.handle1.name };
+ cons[i] = { id: c.handle1.node.id, output_name: c.handle1.name };
+ input_connections[ t.name ] = cons;
});
});
var post_job_actions = {};
@@ -617,11 +620,21 @@
var node = wf.nodes[id];
$.each( step.input_connections, function( k, v ) {
if ( v ) {
- var other_node = wf.nodes[ v.id ];
- var c = new Connector();
- c.connect( other_node.output_terminals[ v.output_name ],
- node.input_terminals[ k ] );
- c.redraw();
+ if ($.isArray(v)) {
+ $.each( v, function (l,x ) {
+ var other_node = wf.nodes[ x.id ];
+ var c = new Connector();
+ c.connect( other_node.output_terminals[ x.output_name ],
+ node.input_terminals[ k ] );
+ c.redraw();
+ });
+ } else {
+ var other_node = wf.nodes[ v.id ];
+ var c = new Connector();
+ c.connect( other_node.output_terminals[ v.output_name ],
+ node.input_terminals[ k ] );
+ c.redraw();
+ }
}
});
if(using_workflow_outputs && node.type === 'tool'){
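On the editor side, the key behavioral change is in `can_accept`: an input terminal normally accepts a connection only when it has none, but a multi-input terminal keeps accepting. Restated as a small Python predicate (the datatype-compatibility loop from the JavaScript is omitted from this sketch):

```python
def can_accept(existing_connectors, multiple):
    """Decide whether an input terminal may take another connection.

    Mirrors the InputTerminal.can_accept guard above: a plain terminal
    is single-slot, a multi-input terminal is unlimited. Datatype
    compatibility checks are left out of this sketch.
    """
    return len(existing_connectors) < 1 or multiple
```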
https://bitbucket.org/galaxy/galaxy-central/changeset/b9477d8f7050/
changeset: b9477d8f7050
user: jjohnson
date: 2012-10-16 18:47:57
summary: Fix workflow_editor exceptions for multi-input when dragging
affected #: 1 file
diff -r db6cabd6760c09f9a5b49aed7e38e4ef991fee97 -r b9477d8f7050cf90786e749ce096667a7cfc4725 static/scripts/galaxy.workflow_editor.canvas.js
--- a/static/scripts/galaxy.workflow_editor.canvas.js
+++ b/static/scripts/galaxy.workflow_editor.canvas.js
@@ -112,6 +112,9 @@
var relativeTop = function( e ) {
return $(e).offset().top - canvas_container.offset().top;
};
+ if (!this.handle1 || !this.handle2) {
+ return;
+ }
// Find the position of each handle
var start_x = relativeLeft( this.handle1.element ) + 5;
var start_y = relativeTop( this.handle1.element ) + 5;
@@ -175,9 +178,13 @@
// compatible type
return $(d.drag).hasClass( "output-terminal" ) && terminal.can_accept( d.drag.terminal );
}).bind( "dropstart", function( e, d ) {
- d.proxy.terminal.connectors[0].inner_color = "#BBFFBB";
+ if (d.proxy.terminal) {
+ d.proxy.terminal.connectors[0].inner_color = "#BBFFBB";
+ }
}).bind( "dropend", function ( e, d ) {
- d.proxy.terminal.connectors[0].inner_color = "#FFFFFF";
+ if (d.proxy.terminal) {
+ d.proxy.terminal.connectors[0].inner_color = "#FFFFFF";
+ }
}).bind( "drop", function( e, d ) {
( new Connector( d.drag.terminal, terminal ) ).redraw();
}).bind( "hover", function() {
@@ -191,7 +198,9 @@
$("<div class='buttons'></div>").append(
$("<img/>").attr("src", galaxy_paths.attributes.image_path + '/delete_icon.png').click( function() {
$.each( terminal.connectors, function( _, x ) {
- x.destroy();
+ if (x) {
+ x.destroy();
+ }
});
t.remove();
})))
https://bitbucket.org/galaxy/galaxy-central/changeset/eec42e84733b/
changeset: eec42e84733b
user: jmchilton
date: 2012-10-17 16:53:15
summary: Fix workflow imports/exports for multiple input data parameters.
affected #: 1 file
diff -r b9477d8f7050cf90786e749ce096667a7cfc4725 -r eec42e84733b60c98c388ebf3291636fe0cfb15a lib/galaxy/webapps/galaxy/controllers/workflow.py
--- a/lib/galaxy/webapps/galaxy/controllers/workflow.py
+++ b/lib/galaxy/webapps/galaxy/controllers/workflow.py
@@ -1761,9 +1761,22 @@
input_connections = [ conn for conn in input_connections if conn.input_name in data_input_names ]
# Encode input connections as dictionary
input_conn_dict = {}
- for conn in input_connections:
- input_conn_dict[ conn.input_name ] = \
- dict( id=conn.output_step.order_index, output_name=conn.output_name )
+ unique_input_names = set( [conn.input_name for conn in input_connections] )
+ for input_name in unique_input_names:
+ input_conn_dict[ input_name ] = \
+ [ dict( id=conn.output_step.order_index, output_name=conn.output_name ) for conn in input_connections if conn.input_name == input_name ]
+ # Preserve backward compatability. Previously Galaxy
+ # assumed input connections would be dictionaries not
+ # lists of dictionaries, so replace any singleton list
+ # with just the dictionary so that workflows exported from
+ # newer Galaxy instances can be used with older Galaxy
+ # instances if they do no include multiple input
+ # tools. This should be removed at some point. Mirrored
+ # hack in _workflow_from_dict should never be removed so
+ # existing workflow exports continue to function.
+ for input_name, input_conn in dict(input_conn_dict).iteritems():
+ if len(input_conn) == 1:
+ input_conn_dict[input_name] = input_conn[0]
step_dict['input_connections'] = input_conn_dict
# Position
step_dict['position'] = step.position
@@ -1828,8 +1841,12 @@
# Second pass to deal with connections between steps
for step in steps:
# Input connections
- for input_name, conn_dict in step.temp_input_connections.iteritems():
- if conn_dict:
+ for input_name, conn_list in step.temp_input_connections.iteritems():
+ if not conn_list:
+ continue
+ if not isinstance(conn_list, list): # Older style singleton connection
+ conn_list = [conn_list]
+ for conn_dict in conn_list:
conn = model.WorkflowStepConnection()
conn.input_step = step
conn.input_name = input_name
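The export/import hack above keeps workflow files loadable by older Galaxy instances: a single connection is written as a bare dict, multiple connections as a list, and the importer normalizes both shapes. A minimal sketch of that normalization (function names are illustrative; Galaxy builds WorkflowStepConnection objects rather than yielding tuples):

```python
def export_input_connections(connections):
    # connections: list of (input_name, conn_dict) pairs.
    out = {}
    for input_name, conn_dict in connections:
        out.setdefault(input_name, []).append(conn_dict)
    # Backward compatibility: collapse singleton lists to a bare dict so
    # older importers that expect one dict per input still work.
    for input_name, conn_list in out.items():
        if len(conn_list) == 1:
            out[input_name] = conn_list[0]
    return out

def import_input_connections(conn_dict):
    # Accept both shapes: a bare dict (older exports) or a list (newer
    # multi-input exports), yielding one (input_name, conn) pair each.
    for input_name, conns in conn_dict.items():
        conn_list = conns if isinstance(conns, list) else [conns]
        for conn in conn_list:
            yield input_name, conn
```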
https://bitbucket.org/galaxy/galaxy-central/changeset/ba74363c65be/
changeset: ba74363c65be
user: dannon
date: 2012-10-24 15:00:30
summary: Merge
affected #: 6 files
diff -r 33db1925e701664785735160f3738dfde97fc2e6 -r ba74363c65bee1e90a4ae6346cbd7f2d3ff9a6b3 lib/galaxy/tools/__init__.py
--- a/lib/galaxy/tools/__init__.py
+++ b/lib/galaxy/tools/__init__.py
@@ -2245,6 +2245,13 @@
values = input_values[ input.name ]
current = values["__current_case__"]
wrap_values( input.cases[current].inputs, values )
+ elif isinstance( input, DataToolParameter ) and input.multiple:
+ values = input_values[ input.name ]
+ input_values[ input.name ] = \
+ [DatasetFilenameWrapper( value,
+ datatypes_registry = self.app.datatypes_registry,
+ tool = self,
+ name = input.name ) for value in values]
elif isinstance( input, DataToolParameter ):
## FIXME: We're populating param_dict with conversions when
## wrapping values, this should happen as a separate
diff -r 33db1925e701664785735160f3738dfde97fc2e6 -r ba74363c65bee1e90a4ae6346cbd7f2d3ff9a6b3 lib/galaxy/tools/parameters/basic.py
--- a/lib/galaxy/tools/parameters/basic.py
+++ b/lib/galaxy/tools/parameters/basic.py
@@ -1550,6 +1550,10 @@
return value
elif isinstance( value, DummyDataset ):
return None
+ elif isinstance( value, list) and len(value) > 0 and isinstance( value[0], DummyDataset):
+ return None
+ elif isinstance( value, list ):
+ return ",".join( [ val if isinstance( val, str ) else str(val.id) for val in value] )
return value.id
def to_python( self, value, app ):
@@ -1557,6 +1561,10 @@
# indicates that the dataset is optional, while '' indicates that it is not.
if value is None or value == '' or value == 'None':
return value
+ if isinstance(value, str) and value.find(",") > -1:
+ values = value.split(",")
+ # TODO: Optimize. -John
+ return [app.model.context.query( app.model.HistoryDatasetAssociation ).get( int( val ) ) for val in values]
return app.model.context.query( app.model.HistoryDatasetAssociation ).get( int( value ) )
def to_param_dict_string( self, value, other_values={} ):
diff -r 33db1925e701664785735160f3738dfde97fc2e6 -r ba74363c65bee1e90a4ae6346cbd7f2d3ff9a6b3 lib/galaxy/webapps/galaxy/controllers/tool_runner.py
--- a/lib/galaxy/webapps/galaxy/controllers/tool_runner.py
+++ b/lib/galaxy/webapps/galaxy/controllers/tool_runner.py
@@ -189,6 +189,12 @@
if isinstance( value, UnvalidatedValue ):
return str( value )
if isinstance( input, DataToolParameter ):
+ if isinstance(value,list):
+ values = []
+ for val in value:
+ if val not in history.datasets and val in hda_source_dict:
+ values.append( hda_source_dict[ val ])
+ return values
if value not in history.datasets and value in hda_source_dict:
return hda_source_dict[ value ]
visit_input_values( tool.inputs, params_objects, rerun_callback )
diff -r 33db1925e701664785735160f3738dfde97fc2e6 -r ba74363c65bee1e90a4ae6346cbd7f2d3ff9a6b3 lib/galaxy/webapps/galaxy/controllers/workflow.py
--- a/lib/galaxy/webapps/galaxy/controllers/workflow.py
+++ b/lib/galaxy/webapps/galaxy/controllers/workflow.py
@@ -741,12 +741,14 @@
}
# Connections
input_connections = step.input_connections
+ multiple_input = {} # Boolean value indicating if this can be mutliple
if step.type is None or step.type == 'tool':
# Determine full (prefixed) names of valid input datasets
data_input_names = {}
def callback( input, value, prefixed_name, prefixed_label ):
if isinstance( input, DataToolParameter ):
data_input_names[ prefixed_name ] = True
+ multiple_input[input.name] = input.multiple
visit_input_values( module.tool.inputs, module.state.inputs, callback )
# Filter
# FIXME: this removes connection without displaying a message currently!
@@ -766,8 +768,14 @@
# Encode input connections as dictionary
input_conn_dict = {}
for conn in input_connections:
- input_conn_dict[ conn.input_name ] = \
- dict( id=conn.output_step.order_index, output_name=conn.output_name )
+ conn_dict = dict( id=conn.output_step.order_index, output_name=conn.output_name )
+ if conn.input_name in multiple_input:
+ if conn.input_name in input_conn_dict:
+ input_conn_dict[ conn.input_name ].append( conn_dict )
+ else:
+ input_conn_dict[ conn.input_name ] = [ conn_dict ]
+ else:
+ input_conn_dict[ conn.input_name ] = conn_dict
step_dict['input_connections'] = input_conn_dict
# Position
step_dict['position'] = step.position
@@ -832,13 +840,15 @@
# Second pass to deal with connections between steps
for step in steps:
# Input connections
- for input_name, conn_dict in step.temp_input_connections.iteritems():
- if conn_dict:
- conn = model.WorkflowStepConnection()
- conn.input_step = step
- conn.input_name = input_name
- conn.output_name = conn_dict['output_name']
- conn.output_step = steps_by_external_id[ conn_dict['id'] ]
+ for input_name, conns in step.temp_input_connections.iteritems():
+ if conns:
+ conn_dicts = conns if isinstance(conns,list) else [conns]
+ for conn_dict in conn_dicts:
+ conn = model.WorkflowStepConnection()
+ conn.input_step = step
+ conn.input_name = input_name
+ conn.output_name = conn_dict['output_name']
+ conn.output_step = steps_by_external_id[ conn_dict['id'] ]
del step.temp_input_connections
# Order the steps if possible
attach_ordered_steps( workflow, steps )
@@ -1346,8 +1356,13 @@
for step in workflow.steps:
step.upgrade_messages = {}
# Connections by input name
- step.input_connections_by_name = \
- dict( ( conn.input_name, conn ) for conn in step.input_connections )
+ input_connections_by_name = {}
+ for conn in step.input_connections:
+ input_name = conn.input_name
+ if input_name not in input_connections_by_name:
+ input_connections_by_name[input_name] = []
+ input_connections_by_name[input_name].append(conn)
+ step.input_connections_by_name = input_connections_by_name
# Extract just the arguments for this step by prefix
p = "%s|" % step.id
l = len(p)
@@ -1405,11 +1420,17 @@
tool = trans.app.toolbox.get_tool( step.tool_id )
# Connect up
def callback( input, value, prefixed_name, prefixed_label ):
+ replacement = None
if isinstance( input, DataToolParameter ):
if prefixed_name in step.input_connections_by_name:
conn = step.input_connections_by_name[ prefixed_name ]
- return outputs[ conn.output_step.id ][ conn.output_name ]
+ if input.multiple:
+ replacement = [outputs[ c.output_step.id ][ c.output_name ] for c in conn]
+ else:
+ replacement = outputs[ conn[0].output_step.id ][ conn[0].output_name ]
+ return replacement
try:
+ # Replace DummyDatasets with HistoryDatasetAssociations
visit_input_values( tool.inputs, step.state.inputs, callback )
except KeyError, k:
error( "Error due to input mapping of '%s' in '%s'. A common cause of this is conditional outputs that cannot be determined until runtime, please review your workflow." % (tool.name, k.message))
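The hunk above first groups each step's connections by input name and then, in the callback, maps them to upstream step outputs, returning a list when the parameter is a multiple input. A rough standalone sketch of that resolution logic (the plain tuples and dicts here are placeholders for Galaxy's connection and output objects):

```python
def group_by_input_name(connections):
    # connections: iterable of (input_name, output_step_id, output_name).
    # Each input name now maps to a *list* of connections so that
    # multiple-input parameters can receive several datasets.
    by_name = {}
    for input_name, step_id, output_name in connections:
        by_name.setdefault(input_name, []).append((step_id, output_name))
    return by_name

def resolve(multiple, conns, outputs):
    # outputs: {step_id: {output_name: dataset}}. A multiple input gets
    # every connected dataset; a single input gets just the first one.
    if multiple:
        return [outputs[sid][name] for sid, name in conns]
    sid, name = conns[0]
    return outputs[sid][name]

outputs = {0: {"out1": "dataset-A"}, 1: {"out1": "dataset-B"}}
grouped = group_by_input_name([("input1", 0, "out1"), ("input1", 1, "out1")])
many = resolve(True, grouped["input1"], outputs)
one = resolve(False, grouped["input1"], outputs)
```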
@@ -1726,9 +1747,22 @@
input_connections = [ conn for conn in input_connections if conn.input_name in data_input_names ]
# Encode input connections as dictionary
input_conn_dict = {}
- for conn in input_connections:
- input_conn_dict[ conn.input_name ] = \
- dict( id=conn.output_step.order_index, output_name=conn.output_name )
+ unique_input_names = set( [conn.input_name for conn in input_connections] )
+ for input_name in unique_input_names:
+ input_conn_dict[ input_name ] = \
+ [ dict( id=conn.output_step.order_index, output_name=conn.output_name ) for conn in input_connections if conn.input_name == input_name ]
+ # Preserve backward compatibility. Previously Galaxy
+ # assumed input connections would be dictionaries, not
+ # lists of dictionaries, so replace any singleton list
+ # with just the dictionary so that workflows exported from
+ # newer Galaxy instances can be used with older Galaxy
+ # instances if they do not include multiple-input
+ # tools. This should be removed at some point. The mirrored
+ # hack in _workflow_from_dict should never be removed so
+ # existing workflow exports continue to function.
+ for input_name, input_conn in dict(input_conn_dict).iteritems():
+ if len(input_conn) == 1:
+ input_conn_dict[input_name] = input_conn[0]
step_dict['input_connections'] = input_conn_dict
# Position
step_dict['position'] = step.position
@@ -1793,8 +1827,12 @@
# Second pass to deal with connections between steps
for step in steps:
# Input connections
- for input_name, conn_dict in step.temp_input_connections.iteritems():
- if conn_dict:
+ for input_name, conn_list in step.temp_input_connections.iteritems():
+ if not conn_list:
+ continue
+ if not isinstance(conn_list, list): # Older style singleton connection
+ conn_list = [conn_list]
+ for conn_dict in conn_list:
conn = model.WorkflowStepConnection()
conn.input_step = step
conn.input_name = input_name
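The export/import pair above keeps old and new Galaxy instances interoperable: on export, a singleton connection list is collapsed back to a bare dict; on import, either shape is accepted. A hedged sketch of that normalization (plain dicts stand in for Galaxy's connection objects):

```python
def export_connections(conns):
    # conns: iterable of (input_name, conn_dict). Group connections by
    # input name, then collapse singleton lists to a bare dict so older
    # Galaxy instances can still read the export.
    by_name = {}
    for input_name, conn_dict in conns:
        by_name.setdefault(input_name, []).append(conn_dict)
    for name, conn_list in by_name.items():
        if len(conn_list) == 1:
            by_name[name] = conn_list[0]
    return by_name

def import_connections(conn_dict):
    # Accept both the old singleton-dict form and the new list form.
    for name, conns in conn_dict.items():
        if not isinstance(conns, list):
            conns = [conns]
        for conn in conns:
            yield name, conn

exported = export_connections(
    [("a", {"id": 0}), ("b", {"id": 1}), ("b", {"id": 2})]
)
imported = sorted(name for name, _ in import_connections(exported))
```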
@@ -2040,7 +2078,11 @@
# still need to clean them up so we can serialize
# if not( prefix ):
if tmp: #this is false for a non-set optional dataset
- associations.append( ( tmp.hid, prefix + key ) )
+ if not isinstance(tmp, list):
+ associations.append( ( tmp.hid, prefix + key ) )
+ else:
+ associations.extend( [ (t.hid, prefix + key) for t in tmp] )
+
# Cleanup the other deprecated crap associated with datasets
# as well. Worse, for nested datasets all the metadata is
# being pushed into the root. FIXME: MUST REMOVE SOON
diff -r 33db1925e701664785735160f3738dfde97fc2e6 -r ba74363c65bee1e90a4ae6346cbd7f2d3ff9a6b3 lib/galaxy/workflow/modules.py
--- a/lib/galaxy/workflow/modules.py
+++ b/lib/galaxy/workflow/modules.py
@@ -263,6 +263,7 @@
data_inputs.append( dict(
name=prefixed_name,
label=prefixed_label,
+ multiple=input.multiple,
extensions=input.extensions ) )
visit_input_values( self.tool.inputs, self.state.inputs, callback )
return data_inputs
@@ -336,9 +337,14 @@
# Any connected input needs to have value DummyDataset (these
# are not persisted so we need to do it every time)
def callback( input, value, prefixed_name, prefixed_label ):
+ replacement = None
if isinstance( input, DataToolParameter ):
if connections is None or prefixed_name in input_connections_by_name:
- return DummyDataset()
+ if input.multiple:
+ replacement = [] if not connections else [DummyDataset() for conn in connections]
+ else:
+ replacement = DummyDataset()
+ return replacement
visit_input_values( self.tool.inputs, self.state.inputs, callback )
class WorkflowModuleFactory( object ):
diff -r 33db1925e701664785735160f3738dfde97fc2e6 -r ba74363c65bee1e90a4ae6346cbd7f2d3ff9a6b3 static/scripts/galaxy.workflow_editor.canvas.js
--- a/static/scripts/galaxy.workflow_editor.canvas.js
+++ b/static/scripts/galaxy.workflow_editor.canvas.js
@@ -34,16 +34,17 @@
OutputTerminal.prototype = new Terminal();
-function InputTerminal( element, datatypes ) {
+function InputTerminal( element, datatypes, multiple ) {
Terminal.call( this, element );
this.datatypes = datatypes;
+ this.multiple = multiple;
}
InputTerminal.prototype = new Terminal();
$.extend( InputTerminal.prototype, {
can_accept: function ( other ) {
- if ( this.connectors.length < 1 ) {
+ if ( this.connectors.length < 1 || this.multiple) {
for ( var t in this.datatypes ) {
var cat_outputs = new Array();
cat_outputs = cat_outputs.concat(other.datatypes);
@@ -111,6 +112,9 @@
var relativeTop = function( e ) {
return $(e).offset().top - canvas_container.offset().top;
};
+ if (!this.handle1 || !this.handle2) {
+ return;
+ }
// Find the position of each handle
var start_x = relativeLeft( this.handle1.element ) + 5;
var start_y = relativeTop( this.handle1.element ) + 5;
@@ -163,10 +167,10 @@
this.tool_errors = {};
}
$.extend( Node.prototype, {
- enable_input_terminal : function( elements, name, types ) {
+ enable_input_terminal : function( elements, name, types, multiple ) {
var node = this;
$(elements).each( function() {
- var terminal = this.terminal = new InputTerminal( this, types );
+ var terminal = this.terminal = new InputTerminal( this, types, multiple );
terminal.node = node;
terminal.name = name;
$(this).bind( "dropinit", function( e, d ) {
@@ -174,9 +178,13 @@
// compatible type
return $(d.drag).hasClass( "output-terminal" ) && terminal.can_accept( d.drag.terminal );
}).bind( "dropstart", function( e, d ) {
- d.proxy.terminal.connectors[0].inner_color = "#BBFFBB";
+ if (d.proxy.terminal) {
+ d.proxy.terminal.connectors[0].inner_color = "#BBFFBB";
+ }
}).bind( "dropend", function ( e, d ) {
- d.proxy.terminal.connectors[0].inner_color = "#FFFFFF";
+ if (d.proxy.terminal) {
+ d.proxy.terminal.connectors[0].inner_color = "#FFFFFF";
+ }
}).bind( "drop", function( e, d ) {
( new Connector( d.drag.terminal, terminal ) ).redraw();
}).bind( "hover", function() {
@@ -190,7 +198,9 @@
$("<div class='buttons'></div>").append(
$("<img/>").attr("src", galaxy_paths.attributes.image_path + '/delete_icon.png').click( function() {
$.each( terminal.connectors, function( _, x ) {
- x.destroy();
+ if (x) {
+ x.destroy();
+ }
});
t.remove();
})))
@@ -304,7 +314,7 @@
var ibox = $("<div class='inputs'></div>").appendTo( b );
$.each( data.data_inputs, function( i, input ) {
var t = $("<div class='terminal input-terminal'></div>");
- node.enable_input_terminal( t, input.name, input.extensions );
+ node.enable_input_terminal( t, input.name, input.extensions, input.multiple );
var ib = $("<div class='form-row dataRow input-data-row' name='" + input.name + "'>" + input.label + "</div>" );
ib.css({ position:'absolute',
left: -1000,
@@ -407,7 +417,7 @@
var old = old_body.find( "div.input-data-row");
$.each( data.data_inputs, function( i, input ) {
var t = $("<div class='terminal input-terminal'></div>");
- node.enable_input_terminal( t, input.name, input.extensions );
+ node.enable_input_terminal( t, input.name, input.extensions, input.multiple );
// If already connected save old connection
old_body.find( "div[name='" + input.name + "']" ).each( function() {
$(this).find( ".input-terminal" ).each( function() {
@@ -545,8 +555,10 @@
input_connections[ t.name ] = null;
// There should only be 0 or 1 connectors, so this is
// really a sneaky if statement
+ var cons = [];
$.each( t.connectors, function ( i, c ) {
- input_connections[ t.name ] = { id: c.handle1.node.id, output_name: c.handle1.name };
+ cons[i] = { id: c.handle1.node.id, output_name: c.handle1.name };
+ input_connections[ t.name ] = cons;
});
});
var post_job_actions = {};
@@ -617,11 +629,21 @@
var node = wf.nodes[id];
$.each( step.input_connections, function( k, v ) {
if ( v ) {
- var other_node = wf.nodes[ v.id ];
- var c = new Connector();
- c.connect( other_node.output_terminals[ v.output_name ],
- node.input_terminals[ k ] );
- c.redraw();
+ if ($.isArray(v)) {
+ $.each( v, function (l,x ) {
+ var other_node = wf.nodes[ x.id ];
+ var c = new Connector();
+ c.connect( other_node.output_terminals[ x.output_name ],
+ node.input_terminals[ k ] );
+ c.redraw();
+ });
+ } else {
+ var other_node = wf.nodes[ v.id ];
+ var c = new Connector();
+ c.connect( other_node.output_terminals[ v.output_name ],
+ node.input_terminals[ k ] );
+ c.redraw();
+ }
}
});
if(using_workflow_outputs && node.type === 'tool'){
https://bitbucket.org/galaxy/galaxy-central/changeset/10342b1b167a/
changeset: 10342b1b167a
user: dannon
date: 2012-10-24 16:31:43
summary: Merge
affected #: 4 files
diff -r ba74363c65bee1e90a4ae6346cbd7f2d3ff9a6b3 -r 10342b1b167a6f7290b8f3dc1d00c2d2b77e046f lib/galaxy/web/framework/helpers/grids.py
--- a/lib/galaxy/web/framework/helpers/grids.py
+++ b/lib/galaxy/web/framework/helpers/grids.py
@@ -418,6 +418,13 @@
def sort( self, trans, query, ascending, column_name=None ):
"""Sort query using this column."""
return GridColumn.sort( self, trans, query, ascending, column_name=column_name )
+ def get_single_filter( self, user, a_filter ):
+ if self.key.find( '.' ) > -1:
+ a_key = self.key.split( '.' )[1]
+ else:
+ a_key = self.key
+ model_class_key_field = getattr( self.model_class, a_key )
+ return model_class_key_field == a_filter
class IntegerColumn( TextColumn ):
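get_single_filter above builds an equality clause after stripping an optional relation prefix from the column key ("User.username" → "username"). A small sketch of just that key handling (FakeModel is a placeholder; the real code compares a SQLAlchemy column attribute):

```python
class FakeModel:
    # Placeholder model with one column-like attribute.
    username = "username_column"

def build_equality_filter(model_class, key, value):
    # Keys like "User.username" refer to a joined table; only the part
    # after the dot names the attribute on the model class.
    attr = key.split(".")[1] if "." in key else key
    column = getattr(model_class, attr)
    return (column, value)  # stands in for `column == value`

clause = build_equality_filter(FakeModel, "User.username", "greg")
```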
"""
diff -r ba74363c65bee1e90a4ae6346cbd7f2d3ff9a6b3 -r 10342b1b167a6f7290b8f3dc1d00c2d2b77e046f lib/galaxy/webapps/community/controllers/admin.py
--- a/lib/galaxy/webapps/community/controllers/admin.py
+++ b/lib/galaxy/webapps/community/controllers/admin.py
@@ -292,28 +292,25 @@
]
class AdminRepositoryGrid( RepositoryGrid ):
+ class DeletedColumn( grids.BooleanColumn ):
+ def get_value( self, trans, grid, repository ):
+ if repository.deleted:
+ return 'yes'
+ return ''
columns = [ RepositoryGrid.NameColumn( "Name",
key="name",
link=( lambda item: dict( operation="view_or_manage_repository", id=item.id ) ),
attach_popup=True ),
- RepositoryGrid.DescriptionColumn( "Synopsis",
- key="description",
- attach_popup=False ),
- RepositoryGrid.MetadataRevisionColumn( "Metadata Revisions" ),
RepositoryGrid.UserColumn( "Owner",
model_class=model.User,
link=( lambda item: dict( operation="repositories_by_user", id=item.id ) ),
attach_popup=False,
key="User.username" ),
- RepositoryGrid.EmailAlertsColumn( "Alert", attach_popup=False ),
- RepositoryGrid.DeprecatedColumn( "Deprecated", attach_popup=False ),
+ RepositoryGrid.DeprecatedColumn( "Deprecated", key="deprecated", attach_popup=False ),
# Columns that are valid for filtering but are not visible.
- grids.DeletedColumn( "Deleted",
- key="deleted",
- visible=False,
- filterable="advanced" ) ]
- columns.append( grids.MulticolFilterColumn( "Search repository name, description",
- cols_to_filter=[ columns[0], columns[1] ],
+ DeletedColumn( "Deleted", key="deleted", attach_popup=False ) ]
+ columns.append( grids.MulticolFilterColumn( "Search repository name",
+ cols_to_filter=[ columns[0] ],
key="free-text-search",
visible=False,
filterable="standard" ) )
@@ -327,6 +324,10 @@
condition=( lambda item: item.deleted ),
async_compatible=False ) )
standard_filters = []
+ default_filter = {}
+ def build_initial_query( self, trans, **kwd ):
+ return trans.sa_session.query( model.Repository ) \
+ .join( model.User.table )
class RepositoryMetadataGrid( grids.Grid ):
class IdColumn( grids.IntegerColumn ):
@@ -335,6 +336,9 @@
class NameColumn( grids.TextColumn ):
def get_value( self, trans, grid, repository_metadata ):
return repository_metadata.repository.name
+ class OwnerColumn( grids.TextColumn ):
+ def get_value( self, trans, grid, repository_metadata ):
+ return repository_metadata.repository.user.username
class RevisionColumn( grids.TextColumn ):
def get_value( self, trans, grid, repository_metadata ):
repository = repository_metadata.repository
@@ -343,42 +347,60 @@
return "%s:%s" % ( str( ctx.rev() ), repository_metadata.changeset_revision )
class ToolsColumn( grids.TextColumn ):
def get_value( self, trans, grid, repository_metadata ):
- tools_str = ''
+ tools_str = '0'
if repository_metadata:
metadata = repository_metadata.metadata
if metadata:
if 'tools' in metadata:
- for tool_metadata_dict in metadata[ 'tools' ]:
- tools_str += '%s <b>%s</b><br/>' % ( tool_metadata_dict[ 'id' ], tool_metadata_dict[ 'version' ] )
+ # We used to display the following, but the grid was too cluttered.
+ #for tool_metadata_dict in metadata[ 'tools' ]:
+ # tools_str += '%s <b>%s</b><br/>' % ( tool_metadata_dict[ 'id' ], tool_metadata_dict[ 'version' ] )
+ return '%d' % len( metadata[ 'tools' ] )
return tools_str
class DatatypesColumn( grids.TextColumn ):
def get_value( self, trans, grid, repository_metadata ):
- datatypes_str = ''
+ datatypes_str = '0'
if repository_metadata:
metadata = repository_metadata.metadata
if metadata:
if 'datatypes' in metadata:
- for datatype_metadata_dict in metadata[ 'datatypes' ]:
- datatypes_str += '%s<br/>' % datatype_metadata_dict[ 'extension' ]
+ # We used to display the following, but the grid was too cluttered.
+ #for datatype_metadata_dict in metadata[ 'datatypes' ]:
+ # datatypes_str += '%s<br/>' % datatype_metadata_dict[ 'extension' ]
+ return '%d' % len( metadata[ 'datatypes' ] )
return datatypes_str
class WorkflowsColumn( grids.TextColumn ):
def get_value( self, trans, grid, repository_metadata ):
- workflows_str = ''
+ workflows_str = '0'
if repository_metadata:
metadata = repository_metadata.metadata
if metadata:
if 'workflows' in metadata:
- workflows_str += '<b>Workflows:</b><br/>'
+ # We used to display the following, but the grid was too cluttered.
+ #workflows_str += '<b>Workflows:</b><br/>'
# metadata[ 'workflows' ] is a list of tuples where each contained tuple is
# [ <relative path to the .ga file in the repository>, <exported workflow dict> ]
- workflow_tups = metadata[ 'workflows' ]
- workflow_metadata_dicts = [ workflow_tup[1] for workflow_tup in workflow_tups ]
- for workflow_metadata_dict in workflow_metadata_dicts:
- workflows_str += '%s<br/>' % workflow_metadata_dict[ 'name' ]
+ #workflow_tups = metadata[ 'workflows' ]
+ #workflow_metadata_dicts = [ workflow_tup[1] for workflow_tup in workflow_tups ]
+ #for workflow_metadata_dict in workflow_metadata_dicts:
+ # workflows_str += '%s<br/>' % workflow_metadata_dict[ 'name' ]
+ return '%d' % len( metadata[ 'workflows' ] )
return workflows_str
+ class DeletedColumn( grids.BooleanColumn ):
+ def get_value( self, trans, grid, repository_metadata ):
+ if repository_metadata.repository.deleted:
+ return 'yes'
+ return ''
+ class DeprecatedColumn( grids.BooleanColumn ):
+ def get_value( self, trans, grid, repository_metadata ):
+ if repository_metadata.repository.deprecated:
+ return 'yes'
+ return ''
class MaliciousColumn( grids.BooleanColumn ):
def get_value( self, trans, grid, repository_metadata ):
- return repository_metadata.malicious
+ if repository_metadata.malicious:
+ return 'yes'
+ return ''
# Grid definition
title = "Repository Metadata"
model_class = model.RepositoryMetadata
@@ -393,16 +415,14 @@
model_class=model.Repository,
link=( lambda item: dict( operation="view_or_manage_repository_revision", id=item.id ) ),
attach_popup=True ),
- RevisionColumn( "Revision",
- attach_popup=False ),
- ToolsColumn( "Tools",
- attach_popup=False ),
- DatatypesColumn( "Datatypes",
- attach_popup=False ),
- WorkflowsColumn( "Workflows",
- attach_popup=False ),
- MaliciousColumn( "Malicious",
- attach_popup=False )
+ OwnerColumn( "Owner", attach_popup=False ),
+ RevisionColumn( "Revision", attach_popup=False ),
+ ToolsColumn( "Tools", attach_popup=False ),
+ DatatypesColumn( "Datatypes", attach_popup=False ),
+ WorkflowsColumn( "Workflows", attach_popup=False ),
+ DeletedColumn( "Deleted", attach_popup=False ),
+ DeprecatedColumn( "Deprecated", attach_popup=False ),
+ MaliciousColumn( "Malicious", attach_popup=False )
]
operations = [ grids.GridOperation( "Delete",
allow_multiple=False,
@@ -416,8 +436,7 @@
use_paging = True
def build_initial_query( self, trans, **kwd ):
return trans.sa_session.query( model.RepositoryMetadata ) \
- .join( model.Repository.table ) \
- .filter( model.Repository.table.c.deprecated == False )
+ .join( model.Repository.table )
class AdminController( BaseUIController, Admin ):
diff -r ba74363c65bee1e90a4ae6346cbd7f2d3ff9a6b3 -r 10342b1b167a6f7290b8f3dc1d00c2d2b77e046f lib/galaxy/webapps/community/controllers/repository.py
--- a/lib/galaxy/webapps/community/controllers/repository.py
+++ b/lib/galaxy/webapps/community/controllers/repository.py
@@ -39,7 +39,8 @@
if category.repositories:
viewable_repositories = 0
for rca in category.repositories:
- viewable_repositories += 1
+ if not rca.repository.deleted and not rca.repository.deprecated:
+ viewable_repositories += 1
return viewable_repositories
return 0
title = "Categories"
@@ -191,8 +192,6 @@
link=( lambda item: dict( operation="repositories_by_user", id=item.id ) ),
attach_popup=False,
key="User.username" ),
- grids.CommunityRatingColumn( "Average Rating", key="rating" ),
- EmailAlertsColumn( "Alert", attach_popup=False ),
# Columns that are valid for filtering but are not visible.
EmailColumn( "Email",
model_class=model.User,
@@ -201,11 +200,7 @@
RepositoryCategoryColumn( "Category",
model_class=model.Category,
key="Category.name",
- visible=False ),
- grids.DeletedColumn( "Deleted",
- key="deleted",
- visible=False,
- filterable="advanced" )
+ visible=False )
]
columns.append( grids.MulticolFilterColumn( "Search repository name, description",
cols_to_filter=[ columns[0], columns[1] ],
@@ -222,7 +217,9 @@
preserve_state = False
use_paging = True
def build_initial_query( self, trans, **kwd ):
- return trans.sa_session.query( self.model_class ) \
+ return trans.sa_session.query( model.Repository ) \
+ .filter( and_( model.Repository.table.c.deleted == False,
+ model.Repository.table.c.deprecated == False ) ) \
.join( model.User.table ) \
.outerjoin( model.RepositoryCategoryAssociation.table ) \
.outerjoin( model.Category.table )
@@ -257,7 +254,8 @@
async_compatible=False ) ]
def build_initial_query( self, trans, **kwd ):
return trans.sa_session.query( model.Repository ) \
- .filter( model.Repository.table.c.user_id == trans.user.id ) \
+ .filter( and_( model.Repository.table.c.deleted == False,
+ model.Repository.table.c.user_id == trans.user.id ) ) \
.join( model.User.table ) \
.outerjoin( model.RepositoryCategoryAssociation.table ) \
.outerjoin( model.Category.table )
@@ -283,7 +281,8 @@
filterable="standard" ) )
def build_initial_query( self, trans, **kwd ):
return trans.sa_session.query( model.Repository ) \
- .filter( and_( model.Repository.table.c.user_id == trans.user.id,
+ .filter( and_( model.Repository.table.c.deleted == False,
+ model.Repository.table.c.user_id == trans.user.id,
model.Repository.table.c.deprecated == True ) ) \
.join( model.User.table ) \
.outerjoin( model.RepositoryCategoryAssociation.table ) \
@@ -341,11 +340,7 @@
RepositoryGrid.RepositoryCategoryColumn( "Category",
model_class=model.Category,
key="Category.name",
- visible=False ),
- grids.DeletedColumn( "Deleted",
- key="deleted",
- visible=False,
- filterable="advanced" )
+ visible=False )
]
columns.append( grids.MulticolFilterColumn( "Search repository name",
cols_to_filter=[ columns[0] ],
@@ -375,7 +370,8 @@
.outerjoin( model.RepositoryCategoryAssociation.table ) \
.outerjoin( model.Category.table )
# Return an empty query.
- return []
+ return trans.sa_session.query( model.Repository ) \
+ .filter( model.Repository.table.c.id < 0 )
class ValidRepositoryGrid( RepositoryGrid ):
# This grid filters out repositories that have been marked as deprecated.
@@ -435,7 +431,8 @@
if 'id' in kwd:
# The user is browsing categories of valid repositories, so filter the request by the received id, which is a category id.
return trans.sa_session.query( model.Repository ) \
- .filter( model.Repository.table.c.deprecated == False ) \
+ .filter( and_( model.Repository.table.c.deleted == False,
+ model.Repository.table.c.deprecated == False ) ) \
.join( model.RepositoryMetadata.table ) \
.join( model.User.table ) \
.join( model.RepositoryCategoryAssociation.table ) \
@@ -444,7 +441,8 @@
model.RepositoryMetadata.table.c.downloadable == True ) )
# The user performed a free text search on the ValidCategoryGrid.
return trans.sa_session.query( model.Repository ) \
- .filter( model.Repository.table.c.deprecated == False ) \
+ .filter( and_( model.Repository.table.c.deleted == False,
+ model.Repository.table.c.deprecated == False ) ) \
.join( model.RepositoryMetadata.table ) \
.join( model.User.table ) \
.outerjoin( model.RepositoryCategoryAssociation.table ) \
@@ -501,7 +499,8 @@
changeset_revision ) )
return trans.sa_session.query( model.RepositoryMetadata ) \
.join( model.Repository ) \
- .filter( model.Repository.table.c.deprecated == False ) \
+ .filter( and_( model.Repository.table.c.deleted == False,
+ model.Repository.table.c.deprecated == False ) ) \
.join( model.User.table ) \
.filter( or_( *clause_list ) ) \
.order_by( model.Repository.name )
@@ -556,7 +555,8 @@
# RepositoryGrid on the grid.mako template for the CategoryGrid. See ~/templates/webapps/community/category/grid.mako. Since we
# are searching repositories and not categories, redirect to browse_repositories().
if 'id' in kwd and 'f-free-text-search' in kwd and kwd[ 'id' ] == kwd[ 'f-free-text-search' ]:
- # The value of 'id' has been set to the search string, which is a repository name. We'll try to get the desired encoded repository id to pass on.
+ # The value of 'id' has been set to the search string, which is a repository name. We'll try to get the desired encoded repository
+ # id to pass on.
try:
repository = get_repository_by_name( trans, kwd[ 'id' ] )
kwd[ 'id' ] = trans.security.encode_id( repository.id )
diff -r ba74363c65bee1e90a4ae6346cbd7f2d3ff9a6b3 -r 10342b1b167a6f7290b8f3dc1d00c2d2b77e046f lib/galaxy/webapps/community/model/mapping.py
--- a/lib/galaxy/webapps/community/model/mapping.py
+++ b/lib/galaxy/webapps/community/model/mapping.py
@@ -222,7 +222,7 @@
properties=dict( repositories=relation( RepositoryCategoryAssociation,
secondary=Repository.table,
primaryjoin=( Category.table.c.id == RepositoryCategoryAssociation.table.c.category_id ),
- secondaryjoin=( ( RepositoryCategoryAssociation.table.c.repository_id == Repository.table.c.id ) & ( Repository.table.c.deprecated == False ) ) ) ) )
+ secondaryjoin=( RepositoryCategoryAssociation.table.c.repository_id == Repository.table.c.id ) ) ) )
assign_mapper( context, Repository, Repository.table,
properties = dict(
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
24 Oct '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/f56ec1622b27/
changeset: f56ec1622b27
user: greg
date: 2012-10-24 16:24:11
summary: Fix for the Category.repositories mapper.
affected #: 1 file
diff -r 25ff502e7faa7e6030545a3035b5e98ee584ab5a -r f56ec1622b278da8d759805e70d3a7ac61596328 lib/galaxy/webapps/community/model/mapping.py
--- a/lib/galaxy/webapps/community/model/mapping.py
+++ b/lib/galaxy/webapps/community/model/mapping.py
@@ -222,7 +222,7 @@
properties=dict( repositories=relation( RepositoryCategoryAssociation,
secondary=Repository.table,
primaryjoin=( Category.table.c.id == RepositoryCategoryAssociation.table.c.category_id ),
- secondaryjoin=( ( RepositoryCategoryAssociation.table.c.repository_id == Repository.table.c.id ) & ( Repository.table.c.deprecated == False ) ) ) ) )
+ secondaryjoin=( RepositoryCategoryAssociation.table.c.repository_id == Repository.table.c.id ) ) ) )
assign_mapper( context, Repository, Repository.table,
properties = dict(
commit/galaxy-central: greg: Grid cleanup in the tool shed repository controller.
by Bitbucket 24 Oct '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/25ff502e7faa/
changeset: 25ff502e7faa
user: greg
date: 2012-10-24 16:23:08
summary: Grid cleanup in the tool shed repository controller.
affected #: 1 file
diff -r 2350e6e0ca9c6ad885de9fac9d196dc07c45d074 -r 25ff502e7faa7e6030545a3035b5e98ee584ab5a lib/galaxy/webapps/community/controllers/repository.py
--- a/lib/galaxy/webapps/community/controllers/repository.py
+++ b/lib/galaxy/webapps/community/controllers/repository.py
@@ -39,7 +39,8 @@
if category.repositories:
viewable_repositories = 0
for rca in category.repositories:
- viewable_repositories += 1
+ if not rca.repository.deleted and not rca.repository.deprecated:
+ viewable_repositories += 1
return viewable_repositories
return 0
title = "Categories"
@@ -191,8 +192,6 @@
link=( lambda item: dict( operation="repositories_by_user", id=item.id ) ),
attach_popup=False,
key="User.username" ),
- grids.CommunityRatingColumn( "Average Rating", key="rating" ),
- EmailAlertsColumn( "Alert", attach_popup=False ),
# Columns that are valid for filtering but are not visible.
EmailColumn( "Email",
model_class=model.User,
@@ -201,11 +200,7 @@
RepositoryCategoryColumn( "Category",
model_class=model.Category,
key="Category.name",
- visible=False ),
- grids.DeletedColumn( "Deleted",
- key="deleted",
- visible=False,
- filterable="advanced" )
+ visible=False )
]
columns.append( grids.MulticolFilterColumn( "Search repository name, description",
cols_to_filter=[ columns[0], columns[1] ],
@@ -222,7 +217,9 @@
preserve_state = False
use_paging = True
def build_initial_query( self, trans, **kwd ):
- return trans.sa_session.query( self.model_class ) \
+ return trans.sa_session.query( model.Repository ) \
+ .filter( and_( model.Repository.table.c.deleted == False,
+ model.Repository.table.c.deprecated == False ) ) \
.join( model.User.table ) \
.outerjoin( model.RepositoryCategoryAssociation.table ) \
.outerjoin( model.Category.table )
@@ -257,7 +254,8 @@
async_compatible=False ) ]
def build_initial_query( self, trans, **kwd ):
return trans.sa_session.query( model.Repository ) \
- .filter( model.Repository.table.c.user_id == trans.user.id ) \
+ .filter( and_( model.Repository.table.c.deleted == False,
+ model.Repository.table.c.user_id == trans.user.id ) ) \
.join( model.User.table ) \
.outerjoin( model.RepositoryCategoryAssociation.table ) \
.outerjoin( model.Category.table )
@@ -283,7 +281,8 @@
filterable="standard" ) )
def build_initial_query( self, trans, **kwd ):
return trans.sa_session.query( model.Repository ) \
- .filter( and_( model.Repository.table.c.user_id == trans.user.id,
+ .filter( and_( model.Repository.table.c.deleted == False,
+ model.Repository.table.c.user_id == trans.user.id,
model.Repository.table.c.deprecated == True ) ) \
.join( model.User.table ) \
.outerjoin( model.RepositoryCategoryAssociation.table ) \
@@ -341,11 +340,7 @@
RepositoryGrid.RepositoryCategoryColumn( "Category",
model_class=model.Category,
key="Category.name",
- visible=False ),
- grids.DeletedColumn( "Deleted",
- key="deleted",
- visible=False,
- filterable="advanced" )
+ visible=False )
]
columns.append( grids.MulticolFilterColumn( "Search repository name",
cols_to_filter=[ columns[0] ],
@@ -375,7 +370,8 @@
.outerjoin( model.RepositoryCategoryAssociation.table ) \
.outerjoin( model.Category.table )
# Return an empty query.
- return []
+ return trans.sa_session.query( model.Repository ) \
+ .filter( model.Repository.table.c.id < 0 )
class ValidRepositoryGrid( RepositoryGrid ):
# This grid filters out repositories that have been marked as deprecated.
@@ -435,7 +431,8 @@
if 'id' in kwd:
# The user is browsing categories of valid repositories, so filter the request by the received id, which is a category id.
return trans.sa_session.query( model.Repository ) \
- .filter( model.Repository.table.c.deprecated == False ) \
+ .filter( and_( model.Repository.table.c.deleted == False,
+ model.Repository.table.c.deprecated == False ) ) \
.join( model.RepositoryMetadata.table ) \
.join( model.User.table ) \
.join( model.RepositoryCategoryAssociation.table ) \
@@ -444,7 +441,8 @@
model.RepositoryMetadata.table.c.downloadable == True ) )
# The user performed a free text search on the ValidCategoryGrid.
return trans.sa_session.query( model.Repository ) \
- .filter( model.Repository.table.c.deprecated == False ) \
+ .filter( and_( model.Repository.table.c.deleted == False,
+ model.Repository.table.c.deprecated == False ) ) \
.join( model.RepositoryMetadata.table ) \
.join( model.User.table ) \
.outerjoin( model.RepositoryCategoryAssociation.table ) \
@@ -501,7 +499,8 @@
changeset_revision ) )
return trans.sa_session.query( model.RepositoryMetadata ) \
.join( model.Repository ) \
- .filter( model.Repository.table.c.deprecated == False ) \
+ .filter( and_( model.Repository.table.c.deleted == False,
+ model.Repository.table.c.deprecated == False ) ) \
.join( model.User.table ) \
.filter( or_( *clause_list ) ) \
.order_by( model.Repository.name )
@@ -556,7 +555,8 @@
# RepositoryGrid on the grid.mako template for the CategoryGrid. See ~/templates/webapps/community/category/grid.mako. Since we
# are searching repositories and not categories, redirect to browse_repositories().
if 'id' in kwd and 'f-free-text-search' in kwd and kwd[ 'id' ] == kwd[ 'f-free-text-search' ]:
- # The value of 'id' has been set to the search string, which is a repository name. We'll try to get the desired encoded repository id to pass on.
+ # The value of 'id' has been set to the search string, which is a repository name. We'll try to get the desired encoded repository
+ # id to pass on.
try:
repository = get_repository_by_name( trans, kwd[ 'id' ] )
kwd[ 'id' ] = trans.security.encode_id( repository.id )
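The empty-query replacement in the hunk above (an impossible `id < 0` filter instead of returning a bare `[]`) keeps the return type a query object, so the grid framework can keep chaining filters onto it. A minimal pure-Python stand-in for that idea (this `Query` class is illustrative, not SQLAlchemy itself):

```python
class Query:
    """Minimal stand-in for an ORM query: filters compose lazily."""
    def __init__(self, rows, predicates=None):
        self.rows = rows
        self.predicates = predicates or []

    def filter(self, predicate):
        # Each .filter() returns a new Query, like SQLAlchemy's API.
        return Query(self.rows, self.predicates + [predicate])

    def all(self):
        return [r for r in self.rows
                if all(p(r) for p in self.predicates)]

repos = [{'id': 1, 'deprecated': False}, {'id': 2, 'deprecated': True}]

# Returning a plain [] would break callers that chain .filter();
# an always-false predicate keeps the Query interface intact.
empty = Query(repos).filter(lambda r: r['id'] < 0)
still_empty = empty.filter(lambda r: not r['deprecated'])  # chaining still works
print(still_empty.all())  # []
```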
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.

commit/galaxy-central: greg: Grid cleanup in the tool shed admin controller.
by Bitbucket 24 Oct '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/2350e6e0ca9c/
changeset: 2350e6e0ca9c
user: greg
date: 2012-10-24 16:21:45
summary: Grid cleanup in the tool shed admin controller.
affected #: 1 file
diff -r 25a93da96e03431557e1418fd4cf70ab7b6ec427 -r 2350e6e0ca9c6ad885de9fac9d196dc07c45d074 lib/galaxy/webapps/community/controllers/admin.py
--- a/lib/galaxy/webapps/community/controllers/admin.py
+++ b/lib/galaxy/webapps/community/controllers/admin.py
@@ -292,28 +292,25 @@
]
class AdminRepositoryGrid( RepositoryGrid ):
+ class DeletedColumn( grids.BooleanColumn ):
+ def get_value( self, trans, grid, repository ):
+ if repository.deleted:
+ return 'yes'
+ return ''
columns = [ RepositoryGrid.NameColumn( "Name",
key="name",
link=( lambda item: dict( operation="view_or_manage_repository", id=item.id ) ),
attach_popup=True ),
- RepositoryGrid.DescriptionColumn( "Synopsis",
- key="description",
- attach_popup=False ),
- RepositoryGrid.MetadataRevisionColumn( "Metadata Revisions" ),
RepositoryGrid.UserColumn( "Owner",
model_class=model.User,
link=( lambda item: dict( operation="repositories_by_user", id=item.id ) ),
attach_popup=False,
key="User.username" ),
- RepositoryGrid.EmailAlertsColumn( "Alert", attach_popup=False ),
- RepositoryGrid.DeprecatedColumn( "Deprecated", attach_popup=False ),
+ RepositoryGrid.DeprecatedColumn( "Deprecated", key="deprecated", attach_popup=False ),
# Columns that are valid for filtering but are not visible.
- grids.DeletedColumn( "Deleted",
- key="deleted",
- visible=False,
- filterable="advanced" ) ]
- columns.append( grids.MulticolFilterColumn( "Search repository name, description",
- cols_to_filter=[ columns[0], columns[1] ],
+ DeletedColumn( "Deleted", key="deleted", attach_popup=False ) ]
+ columns.append( grids.MulticolFilterColumn( "Search repository name",
+ cols_to_filter=[ columns[0] ],
key="free-text-search",
visible=False,
filterable="standard" ) )
@@ -327,6 +324,10 @@
condition=( lambda item: item.deleted ),
async_compatible=False ) )
standard_filters = []
+ default_filter = {}
+ def build_initial_query( self, trans, **kwd ):
+ return trans.sa_session.query( model.Repository ) \
+ .join( model.User.table )
class RepositoryMetadataGrid( grids.Grid ):
class IdColumn( grids.IntegerColumn ):
@@ -335,6 +336,9 @@
class NameColumn( grids.TextColumn ):
def get_value( self, trans, grid, repository_metadata ):
return repository_metadata.repository.name
+ class OwnerColumn( grids.TextColumn ):
+ def get_value( self, trans, grid, repository_metadata ):
+ return repository_metadata.repository.user.username
class RevisionColumn( grids.TextColumn ):
def get_value( self, trans, grid, repository_metadata ):
repository = repository_metadata.repository
@@ -343,42 +347,60 @@
return "%s:%s" % ( str( ctx.rev() ), repository_metadata.changeset_revision )
class ToolsColumn( grids.TextColumn ):
def get_value( self, trans, grid, repository_metadata ):
- tools_str = ''
+ tools_str = '0'
if repository_metadata:
metadata = repository_metadata.metadata
if metadata:
if 'tools' in metadata:
- for tool_metadata_dict in metadata[ 'tools' ]:
- tools_str += '%s <b>%s</b><br/>' % ( tool_metadata_dict[ 'id' ], tool_metadata_dict[ 'version' ] )
+ # We used to display the following, but grid was too cluttered.
+ #for tool_metadata_dict in metadata[ 'tools' ]:
+ # tools_str += '%s <b>%s</b><br/>' % ( tool_metadata_dict[ 'id' ], tool_metadata_dict[ 'version' ] )
+ return '%d' % len( metadata[ 'tools' ] )
return tools_str
class DatatypesColumn( grids.TextColumn ):
def get_value( self, trans, grid, repository_metadata ):
- datatypes_str = ''
+ datatypes_str = '0'
if repository_metadata:
metadata = repository_metadata.metadata
if metadata:
if 'datatypes' in metadata:
- for datatype_metadata_dict in metadata[ 'datatypes' ]:
- datatypes_str += '%s<br/>' % datatype_metadata_dict[ 'extension' ]
+ # We used to display the following, but grid was too cluttered.
+ #for datatype_metadata_dict in metadata[ 'datatypes' ]:
+ # datatypes_str += '%s<br/>' % datatype_metadata_dict[ 'extension' ]
+ return '%d' % len( metadata[ 'datatypes' ] )
return datatypes_str
class WorkflowsColumn( grids.TextColumn ):
def get_value( self, trans, grid, repository_metadata ):
- workflows_str = ''
+ workflows_str = '0'
if repository_metadata:
metadata = repository_metadata.metadata
if metadata:
if 'workflows' in metadata:
- workflows_str += '<b>Workflows:</b><br/>'
+ # We used to display the following, but grid was too cluttered.
+ #workflows_str += '<b>Workflows:</b><br/>'
# metadata[ 'workflows' ] is a list of tuples where each contained tuple is
# [ <relative path to the .ga file in the repository>, <exported workflow dict> ]
- workflow_tups = metadata[ 'workflows' ]
- workflow_metadata_dicts = [ workflow_tup[1] for workflow_tup in workflow_tups ]
- for workflow_metadata_dict in workflow_metadata_dicts:
- workflows_str += '%s<br/>' % workflow_metadata_dict[ 'name' ]
+ #workflow_tups = metadata[ 'workflows' ]
+ #workflow_metadata_dicts = [ workflow_tup[1] for workflow_tup in workflow_tups ]
+ #for workflow_metadata_dict in workflow_metadata_dicts:
+ # workflows_str += '%s<br/>' % workflow_metadata_dict[ 'name' ]
+ return '%d' % len( metadata[ 'workflows' ] )
return workflows_str
+ class DeletedColumn( grids.BooleanColumn ):
+ def get_value( self, trans, grid, repository_metadata ):
+ if repository_metadata.repository.deleted:
+ return 'yes'
+ return ''
+ class DeprecatedColumn( grids.BooleanColumn ):
+ def get_value( self, trans, grid, repository_metadata ):
+ if repository_metadata.repository.deprecated:
+ return 'yes'
+ return ''
class MaliciousColumn( grids.BooleanColumn ):
def get_value( self, trans, grid, repository_metadata ):
- return repository_metadata.malicious
+ if repository_metadata.malicious:
+ return 'yes'
+ return ''
# Grid definition
title = "Repository Metadata"
model_class = model.RepositoryMetadata
@@ -393,16 +415,14 @@
model_class=model.Repository,
link=( lambda item: dict( operation="view_or_manage_repository_revision", id=item.id ) ),
attach_popup=True ),
- RevisionColumn( "Revision",
- attach_popup=False ),
- ToolsColumn( "Tools",
- attach_popup=False ),
- DatatypesColumn( "Datatypes",
- attach_popup=False ),
- WorkflowsColumn( "Workflows",
- attach_popup=False ),
- MaliciousColumn( "Malicious",
- attach_popup=False )
+ OwnerColumn( "Owner", attach_popup=False ),
+ RevisionColumn( "Revision", attach_popup=False ),
+ ToolsColumn( "Tools", attach_popup=False ),
+ DatatypesColumn( "Datatypes", attach_popup=False ),
+ WorkflowsColumn( "Workflows", attach_popup=False ),
+ DeletedColumn( "Deleted", attach_popup=False ),
+ DeprecatedColumn( "Deprecated", attach_popup=False ),
+ MaliciousColumn( "Malicious", attach_popup=False )
]
operations = [ grids.GridOperation( "Delete",
allow_multiple=False,
@@ -416,8 +436,7 @@
use_paging = True
def build_initial_query( self, trans, **kwd ):
return trans.sa_session.query( model.RepositoryMetadata ) \
- .join( model.Repository.table ) \
- .filter( model.Repository.table.c.deprecated == False )
+ .join( model.Repository.table )
class AdminController( BaseUIController, Admin ):
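The Tools/Datatypes/Workflows columns above replace per-item HTML lists with a simple count to declutter the grid. That pattern can be sketched independently of the grid framework; the metadata dict shape is taken from the comments in the diff:

```python
def count_column_value(metadata, key):
    """Render a metadata collection as a count string for a grid cell.

    Returns '0' when the key is absent, matching the columns above
    that default tools_str/datatypes_str/workflows_str to '0'.
    """
    if metadata and key in metadata:
        return '%d' % len(metadata[key])
    return '0'

# Example metadata resembling what repository_metadata.metadata holds:
# 'workflows' is a list of (relative .ga path, exported workflow dict) tuples.
metadata = {'tools': [{'id': 'fastqc', 'version': '0.1'}],
            'workflows': [('wf.ga', {'name': 'QC'})]}
print(count_column_value(metadata, 'tools'))      # 1
print(count_column_value(metadata, 'datatypes'))  # 0
```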
Repository URL: https://bitbucket.org/galaxy/galaxy-central/

commit/galaxy-central: greg: Override the grid's TextColumn.get_single_filter() method for the BooleanColumn so the BooleanColumn can be used in grids.
by Bitbucket 24 Oct '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/25a93da96e03/
changeset: 25a93da96e03
user: greg
date: 2012-10-24 15:51:17
summary: Override the grid's TextColumn.get_single_filter() method for the BooleanColumn so the BooleanColumn can be used in grids.
affected #: 1 file
diff -r 33db1925e701664785735160f3738dfde97fc2e6 -r 25a93da96e03431557e1418fd4cf70ab7b6ec427 lib/galaxy/web/framework/helpers/grids.py
--- a/lib/galaxy/web/framework/helpers/grids.py
+++ b/lib/galaxy/web/framework/helpers/grids.py
@@ -418,6 +418,13 @@
def sort( self, trans, query, ascending, column_name=None ):
"""Sort query using this column."""
return GridColumn.sort( self, trans, query, ascending, column_name=column_name )
+ def get_single_filter( self, user, a_filter ):
+ if self.key.find( '.' ) > -1:
+ a_key = self.key.split( '.' )[1]
+ else:
+ a_key = self.key
+ model_class_key_field = getattr( self.model_class, a_key )
+ return model_class_key_field == a_filter
class IntegerColumn( TextColumn ):
"""
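The `get_single_filter()` override above resolves a possibly dotted column key (e.g. `User.username`, where the column belongs to a joined model) to a model attribute and builds an equality criterion. The key-resolution logic in isolation, with a plain class standing in for a mapped class (with real SQLAlchemy columns, `==` would build a SQL clause rather than a bool):

```python
class Repository:
    # Stand-in for a mapped class; the attribute plays the column role.
    deleted = False

class BooleanColumn:
    def __init__(self, key, model_class):
        self.key = key
        self.model_class = model_class

    def get_single_filter(self, user, a_filter):
        # Keys may be dotted ('Repository.deleted') when the column
        # belongs to a joined model; use only the attribute part.
        a_key = self.key.split('.')[1] if '.' in self.key else self.key
        model_class_key_field = getattr(self.model_class, a_key)
        return model_class_key_field == a_filter

col = BooleanColumn('Repository.deleted', Repository)
print(col.get_single_filter(None, False))  # True
print(col.get_single_filter(None, True))   # False
```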
Repository URL: https://bitbucket.org/galaxy/galaxy-central/

commit/galaxy-central: greg: Handle weak Python 2.5 unicode string handling in the tool shed.
by Bitbucket 23 Oct '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/33db1925e701/
changeset: 33db1925e701
user: greg
date: 2012-10-23 22:52:25
summary: Handle weak Python 2.5 unicode string handling in the tool shed.
affected #: 2 files
diff -r 70ae646e6892e7bb5a05500c99b4cfcc22495a37 -r 33db1925e701664785735160f3738dfde97fc2e6 lib/galaxy/tool_shed/install_manager.py
--- a/lib/galaxy/tool_shed/install_manager.py
+++ b/lib/galaxy/tool_shed/install_manager.py
@@ -50,7 +50,7 @@
break
if found:
break
- full_path = os.path.abspath( os.path.join( root, name ) )
+ full_path = str( os.path.abspath( os.path.join( root, name ) ) )
tool = self.toolbox.load_tool( full_path )
return generate_tool_guid( repository_clone_url, tool )
def get_proprietary_tool_panel_elems( self, latest_tool_migration_script_number ):
diff -r 70ae646e6892e7bb5a05500c99b4cfcc22495a37 -r 33db1925e701664785735160f3738dfde97fc2e6 lib/galaxy/util/shed_util.py
--- a/lib/galaxy/util/shed_util.py
+++ b/lib/galaxy/util/shed_util.py
@@ -685,7 +685,7 @@
metadata_dict[ 'readme' ] = relative_path_to_readme
# See if we have a tool config.
elif name not in NOT_TOOL_CONFIGS and name.endswith( '.xml' ):
- full_path = os.path.abspath( os.path.join( root, name ) )
+ full_path = str( os.path.abspath( os.path.join( root, name ) ) )
if os.path.getsize( full_path ) > 0:
if not ( check_binary( full_path ) or check_image( full_path ) or check_gzip( full_path )[ 0 ]
or check_bz2( full_path )[ 0 ] or check_zip( full_path ) ):
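The `str()` wrapper in both hunks above works around Python 2.5 code paths that mishandled unicode paths: `os.path.join()` yields a unicode result as soon as any component is unicode, so coercing the final path back to `str` kept downstream 2.5 consumers happy. A sketch of the pattern (on Python 3 the `str()` call is a no-op since all strings are unicode; this only illustrates the shape of the fix):

```python
import os

def tool_config_path(root, name):
    # Mirror the fix: normalize the path, then coerce to str so
    # Python 2.5 consumers that mishandled unicode received a byte
    # string. Under Python 3 the coercion changes nothing.
    return str(os.path.abspath(os.path.join(root, name)))

path = tool_config_path(os.sep + 'repo', 'tool.xml')
print(path)
```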
Repository URL: https://bitbucket.org/galaxy/galaxy-central/

commit/galaxy-central: greg: Eliminate the problematic Repository.ReviewedRevisions mapper from the tool shed model.
by Bitbucket 23 Oct '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/70ae646e6892/
changeset: 70ae646e6892
user: greg
date: 2012-10-23 22:20:05
summary: Eliminate the problematic Repository.ReviewedRevisions mapper from the tool shed model.
affected #: 3 files
diff -r d6ce5e2a1cdb8645f3490043b151f2b271f4b2b4 -r 70ae646e6892e7bb5a05500c99b4cfcc22495a37 lib/galaxy/webapps/community/controllers/common.py
--- a/lib/galaxy/webapps/community/controllers/common.py
+++ b/lib/galaxy/webapps/community/controllers/common.py
@@ -156,11 +156,9 @@
def changeset_revision_reviewed_by_user( trans, user, repository, changeset_revision ):
"""Determine if the current changeset revision has been reviewed by the current user."""
changeset_revision_reviewed_by_user = False
- for reviewed_revision in repository.reviewed_revisions:
- if reviewed_revision.changeset_revision == changeset_revision:
- for review in repository.reviews:
- if review.changeset_revision == changeset_revision and review.user == user:
- return True
+ for review in repository.reviews:
+ if review.changeset_revision == changeset_revision and review.user == user:
+ return True
return False
def check_file_contents( trans ):
# See if any admin users have chosen to receive email alerts when a repository is updated.
@@ -505,13 +503,15 @@
def get_previous_repository_reviews( trans, repository, changeset_revision ):
"""Return an ordered dictionary of repository reviews up to and including the received changeset revision."""
repo = hg.repository( get_configured_ui(), repository.repo_path )
- reviewed_revision_hashes = [ reviewed_revisions.changeset_revision for reviewed_revisions in repository.reviewed_revisions ]
+ reviewed_revision_hashes = [ review.changeset_revision for review in repository.reviews ]
previous_reviews_dict = odict()
for changeset in reversed_upper_bounded_changelog( repo, changeset_revision ):
previous_changeset_revision = str( repo.changectx( changeset ) )
if previous_changeset_revision in reviewed_revision_hashes:
previous_rev, previous_changeset_revision_label = get_rev_label_from_changeset_revision( repo, previous_changeset_revision )
- revision_reviews = get_reviews_by_repository_id_changeset_revision( trans, trans.security.encode_id( repository.id ), previous_changeset_revision )
+ revision_reviews = get_reviews_by_repository_id_changeset_revision( trans,
+ trans.security.encode_id( repository.id ),
+ previous_changeset_revision )
previous_reviews_dict[ previous_changeset_revision ] = dict( changeset_revision_label=previous_changeset_revision_label,
reviews=revision_reviews )
return previous_reviews_dict
@@ -701,7 +701,7 @@
def has_previous_repository_reviews( trans, repository, changeset_revision ):
"""Determine if a repository has a changeset revision review prior to the received changeset revision."""
repo = hg.repository( get_configured_ui(), repository.repo_path )
- reviewed_revision_hashes = [ reviewed_revisions.changeset_revision for reviewed_revisions in repository.reviewed_revisions ]
+ reviewed_revision_hashes = [ review.changeset_revision for review in repository.reviews ]
for changeset in reversed_upper_bounded_changelog( repo, changeset_revision ):
previous_changeset_revision = str( repo.changectx( changeset ) )
if previous_changeset_revision in reviewed_revision_hashes:
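The simplification above drops the secondary `reviewed_revisions` mapper and scans `repository.reviews` directly, since each review already carries its changeset revision. The same membership logic, sketched with plain objects standing in for the mapped classes (names are illustrative):

```python
from collections import namedtuple

Review = namedtuple('Review', ['changeset_revision', 'user'])

def changeset_revision_reviewed_by_user(user, reviews, changeset_revision):
    # Direct scan over the reviews relation; the removed mapper had
    # derived the same information through an extra secondary join.
    for review in reviews:
        if review.changeset_revision == changeset_revision and review.user == user:
            return True
    return False

reviews = [Review('abc123', 'greg'), Review('def456', 'jgoecks')]
print(changeset_revision_reviewed_by_user('greg', reviews, 'abc123'))  # True
print(changeset_revision_reviewed_by_user('greg', reviews, 'def456'))  # False
```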
diff -r d6ce5e2a1cdb8645f3490043b151f2b271f4b2b4 -r 70ae646e6892e7bb5a05500c99b4cfcc22495a37 lib/galaxy/webapps/community/controllers/repository.py
--- a/lib/galaxy/webapps/community/controllers/repository.py
+++ b/lib/galaxy/webapps/community/controllers/repository.py
@@ -1530,7 +1530,7 @@
if current_user:
# See if the current user owns any repositories that have been reviewed.
for repository in current_user.active_repositories:
- if repository.reviewed_revisions:
+ if repository.reviews:
has_reviewed_repositories = True
break
# See if the current user has any repositories that have been marked as deprecated.
diff -r d6ce5e2a1cdb8645f3490043b151f2b271f4b2b4 -r 70ae646e6892e7bb5a05500c99b4cfcc22495a37 lib/galaxy/webapps/community/model/mapping.py
--- a/lib/galaxy/webapps/community/model/mapping.py
+++ b/lib/galaxy/webapps/community/model/mapping.py
@@ -239,12 +239,7 @@
reviewers=relation( User,
secondary=RepositoryReview.table,
primaryjoin=( Repository.table.c.id == RepositoryReview.table.c.repository_id ),
- secondaryjoin=( RepositoryReview.table.c.user_id == User.table.c.id ) ),
- reviewed_revisions=relation( RepositoryMetadata,
- secondary=RepositoryReview.table,
- foreign_keys=[ RepositoryMetadata.table.c.repository_id, RepositoryMetadata.table.c.changeset_revision ],
- primaryjoin=( Repository.table.c.id == RepositoryMetadata.table.c.repository_id ),
- secondaryjoin=( ( RepositoryMetadata.table.c.repository_id == RepositoryReview.table.c.repository_id ) & ( RepositoryMetadata.table.c.changeset_revision == RepositoryReview.table.c.changeset_revision ) ) ) ) )
+ secondaryjoin=( RepositoryReview.table.c.user_id == User.table.c.id ) ) ) )
assign_mapper( context, RepositoryMetadata, RepositoryMetadata.table,
properties=dict( repository=relation( Repository ),
Repository URL: https://bitbucket.org/galaxy/galaxy-central/

23 Oct '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/d6ce5e2a1cdb/
changeset: d6ce5e2a1cdb
user: jgoecks
date: 2012-10-23 21:41:48
summary: Circster, enhancements for chords: (a) adjust chords when datasets added/removed from visualization; (b) use config to color; and (c) enable dynamic addition of chord/interaction datasets.
affected #: 3 files
diff -r f23c2b0ef005f1e44299d11fd8fbe5077fe850f0 -r d6ce5e2a1cdb8645f3490043b151f2b271f4b2b4 lib/galaxy/web/base/controller.py
--- a/lib/galaxy/web/base/controller.py
+++ b/lib/galaxy/web/base/controller.py
@@ -19,6 +19,7 @@
from paste.httpexceptions import *
from galaxy.exceptions import *
from galaxy.model import NoConverterException, ConverterDependencyException
+from galaxy.datatypes.interval import ChromatinInteractions
from Cheetah.Template import Template
@@ -602,7 +603,7 @@
return visualization
- def _get_genome_data( self, trans, dataset, dbkey=None, source='index' ):
+ def _get_genome_data( self, trans, dataset, dbkey=None ):
"""
Returns genome-wide data for dataset if available; if not, message is returned.
"""
@@ -621,6 +622,11 @@
if message:
rval = message
else:
+ # HACK: chromatin interactions tracks use data as source.
+ source = 'index'
+ if isinstance( dataset.datatype, ChromatinInteractions ):
+ source = 'data'
+
data_provider = trans.app.data_provider_registry.get_data_provider( trans,
original_dataset=dataset,
source=source )
diff -r f23c2b0ef005f1e44299d11fd8fbe5077fe850f0 -r d6ce5e2a1cdb8645f3490043b151f2b271f4b2b4 lib/galaxy/webapps/galaxy/controllers/visualization.py
--- a/lib/galaxy/webapps/galaxy/controllers/visualization.py
+++ b/lib/galaxy/webapps/galaxy/controllers/visualization.py
@@ -8,7 +8,6 @@
from galaxy.visualization.genomes import decode_dbkey
from galaxy.visualization.genome.visual_analytics import get_dataset_job
from galaxy.visualization.data_providers.phyloviz import PhylovizDataProvider
-from galaxy.datatypes.interval import ChromatinInteractions
from .library import LibraryListGrid
@@ -753,12 +752,8 @@
tracks = viz_config.get( 'tracks', [] )
for track in tracks:
dataset = self.get_hda_or_ldda( trans, track[ 'hda_ldda'], track[ 'dataset_id' ] )
- # HACK: chromatin interactions tracks use data as source.
- source = 'index'
- if isinstance( dataset.datatype, ChromatinInteractions ):
- source = 'data'
- genome_data = self._get_genome_data( trans, dataset, dbkey, source=source )
+ genome_data = self._get_genome_data( trans, dataset, dbkey )
if not isinstance( genome_data, str ):
track[ 'preloaded_data' ] = genome_data
diff -r f23c2b0ef005f1e44299d11fd8fbe5077fe850f0 -r d6ce5e2a1cdb8645f3490043b151f2b271f4b2b4 static/scripts/viz/circster.js
--- a/static/scripts/viz/circster.js
+++ b/static/scripts/viz/circster.js
@@ -200,31 +200,53 @@
* Render a single track on the outside of the current visualization.
*/
add_track: function(new_track) {
- // Recompute and update track bounds.
- var new_track_bounds = this.get_tracks_bounds();
- _.each(this.circular_views, function(track_view, i) {
- track_view.update_radius_bounds(new_track_bounds[i]);
- });
+ if (new_track.get('track_type') === 'DiagonalHeatmapTrack') {
+ // Added chords track.
+ var innermost_radius_bounds = this.circular_views[0].radius_bounds,
+ new_view = new CircsterChromInteractionsTrackView({
+ el: d3.select('g.tracks').append('g')[0],
+ track: new_track,
+ radius_bounds: innermost_radius_bounds,
+ genome: this.genome,
+ total_gap: this.total_gap
+ });
+ new_view.render();
+ this.chords_views.push(new_view);
+ }
+ else {
+ // Added circular track.
- // Render new track.
- var track_index = this.circular_views.length,
- track_view_class = (new_track.get('track_type') === 'LineTrack' ?
- CircsterBigWigTrackView :
- CircsterSummaryTreeTrackView ),
- track_view = new track_view_class({
- el: d3.select('g.tracks').append('g')[0],
- track: new_track,
- radius_bounds: new_track_bounds[track_index],
- genome: this.genome,
- total_gap: this.total_gap
+ // Recompute and update track bounds.
+ var new_track_bounds = this.get_tracks_bounds();
+ _.each(this.circular_views, function(track_view, i) {
+ track_view.update_radius_bounds(new_track_bounds[i]);
});
- track_view.render();
- this.circular_views.push(track_view);
- // Update label track.
- var track_bounds = new_track_bounds[ new_track_bounds.length-1 ];
- track_bounds[1] = track_bounds[0];
- this.label_track_view.update_radius_bounds(track_bounds);
+ // Update chords tracks.
+ _.each(this.chords_views, function(track_view) {
+ track_view.update_radius_bounds(new_track_bounds[0]);
+ });
+
+ // Render new track.
+ var track_index = this.circular_views.length,
+ track_view_class = (new_track.get('track_type') === 'LineTrack' ?
+ CircsterBigWigTrackView :
+ CircsterSummaryTreeTrackView ),
+ track_view = new track_view_class({
+ el: d3.select('g.tracks').append('g')[0],
+ track: new_track,
+ radius_bounds: new_track_bounds[track_index],
+ genome: this.genome,
+ total_gap: this.total_gap
+ });
+ track_view.render();
+ this.circular_views.push(track_view);
+
+ // Update label track.
+ var track_bounds = new_track_bounds[ new_track_bounds.length-1 ];
+ track_bounds[1] = track_bounds[0];
+ this.label_track_view.update_radius_bounds(track_bounds);
+ }
},
/**
@@ -269,6 +291,15 @@
},
/**
+ * Get fill color from config.
+ */
+ get_fill_color: function() {
+ var color = this.track.get('config').get_value('block_color');
+ if (!color) { color = this.track.get('config').get_value('color'); }
+ return color;
+ },
+
+ /**
* Render track's data by adding SVG elements to parent.
*/
render: function() {
@@ -382,10 +413,8 @@
});
// Add new data path and apply preferences.
- var prefs = self.track.get('prefs'),
- block_color = prefs.block_color;
- if (!block_color) { block_color = prefs.color; }
- self._render_chrom_data(self.parent_elt, chrom_arc, data).style('stroke', block_color).style('fill', block_color);
+ var color = self.get_fill_color();
+ self._render_chrom_data(self.parent_elt, chrom_arc, data).style('stroke', color).style('fill', color);
});
});
@@ -463,10 +492,8 @@
});
// Apply prefs to all track data.
- var config = track.get('config'),
- block_color = config.get_value('block_color');
- if (!block_color) { block_color = config.get_value('color'); }
- self.parent_elt.selectAll('path.chrom-data').style('stroke', block_color).style('fill', block_color);
+ var color = self.get_fill_color();
+ self.parent_elt.selectAll('path.chrom-data').style('stroke', color).style('fill', color);
rendered_deferred.resolve(svg);
});
@@ -653,7 +680,7 @@
// When data is ready, render track.
$.when(self.track.get('data_manager').data_is_ready()).then(function() {
- // Convert genome-wide data in chord data.
+ // When data has been fetched, render track.
$.when(self.track.get('data_manager').get_genome_wide_data(self.genome)).then(function(genome_wide_data) {
var chord_data = [],
chroms_info = self.genome.get_chroms_info();
@@ -685,13 +712,18 @@
.selectAll("path")
.data(chord_data)
.enter().append("path")
- .style("fill", '000')
+ .style("fill", self.get_fill_color())
.attr("d", d3.svg.chord().radius(self.radius_bounds[0]))
.style("opacity", 1);
});
});
},
+ update_radius_bounds: function(radius_bounds) {
+ this.radius_bounds = radius_bounds;
+ this.parent_elt.selectAll("path").transition().attr("d", d3.svg.chord().radius(this.radius_bounds[0]));
+ },
+
/**
* Returns radians for a genomic position.
*/
Repository URL: https://bitbucket.org/galaxy/galaxy-central/

commit/galaxy-central: greg: Clarify the various url protocols that are supported by the tool shed repository upload component.
by Bitbucket 23 Oct '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/f23c2b0ef005/
changeset: f23c2b0ef005
user: greg
date: 2012-10-23 21:23:51
summary: Clarify the various url protocols that are supported by the tool shed repository upload component.
affected #: 1 file
diff -r bad5602cdf998dff9c2aecf478dc4ed1a797a023 -r f23c2b0ef005f1e44299d11fd8fbe5077fe850f0 templates/webapps/community/repository/upload.mako
--- a/templates/webapps/community/repository/upload.mako
+++ b/templates/webapps/community/repository/upload.mako
@@ -85,10 +85,10 @@
<input name="url" type="textfield" value="${url | h}" size="40"/></div><div class="toolParamHelp" style="clear: both;">
- Enter a URL to upload your files via http. URLs that point to mercurial repositories (URLs that start with hg:// or hgs://)
- are allowed. This mechanism results in the tip revision of an external mercurial repository being added to the tool shed
- repository as a single new changeset. The revision history of the originating external mercurial repository is not uploaded
- to the tool shed repository.
+ Enter a url to upload your files. In addition to http and ftp urls, urls that point to mercurial repositories (urls that start
+ with hg:// or hgs://) are allowed. This mechanism results in the tip revision of an external mercurial repository being added
+ to the tool shed repository as a single new changeset. The revision history of the originating external mercurial repository is
+ not uploaded to the tool shed repository.
</div><div style="clear: both"></div></div>
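The revised help text above enumerates http, ftp, and the shed-specific `hg://` / `hgs://` schemes. Dispatching an upload URL on its scheme can be sketched with the standard library; only the scheme names come from the template, and the function name is a hypothetical illustration, not the tool shed's actual upload code:

```python
from urllib.parse import urlparse

MERCURIAL_SCHEMES = ('hg', 'hgs')  # from the upload help text above

def is_mercurial_upload(url):
    """True when the URL points at an external mercurial repository,
    whose tip revision would be added as a single new changeset."""
    return urlparse(url).scheme in MERCURIAL_SCHEMES

print(is_mercurial_upload('hg://example.org/repo'))      # True
print(is_mercurial_upload('http://example.org/f.tar'))   # False
```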
Repository URL: https://bitbucket.org/galaxy/galaxy-central/