galaxy-commits
October 2012
- 1 participant
- 194 discussions
commit/galaxy-central: greg: Make sure repository component reviews that have been marked private are restricted to those that are authorized to access them.
by Bitbucket 24 Oct '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/a5db4601ddbe/
changeset: a5db4601ddbe
user: greg
date: 2012-10-24 17:28:27
summary: Make sure repository component reviews that have been marked private are restricted to those that are authorized to access them.
affected #: 1 file
diff -r ae5b674b4342a87863a17cea8ad3a46adef2b51f -r a5db4601ddbe117b5eb393f5d92982e50a866c7d templates/webapps/community/repository_review/browse_review.mako
--- a/templates/webapps/community/repository_review/browse_review.mako
+++ b/templates/webapps/community/repository_review/browse_review.mako
@@ -61,15 +61,15 @@
<table class="grid">
%for component_review in review.component_reviews:
<%
+ can_browse = trans.app.security_agent.user_can_browse_component_review( component_review, trans.user )
component = component_review.component
-
- # Initialize Private check box.
- private_check_box_name = '%s%sprivate' % ( component.name, STRSEP )
- private_check_box = CheckboxField( name=private_check_box_name, checked=component_review.private )
-
- # Initialize star rating.
- rating_name = '%s%srating' % ( component.name, STRSEP )
-
+ if can_browse:
+ # Initialize Private check box.
+ private_check_box_name = '%s%sprivate' % ( component.name, STRSEP )
+ private_check_box = CheckboxField( name=private_check_box_name, checked=component_review.private )
+
+ # Initialize star rating.
+ rating_name = '%s%srating' % ( component.name, STRSEP )
%><tr><td bgcolor="#D8D8D8"><b>${component.name | h}</b></td>
@@ -77,41 +77,45 @@
</tr><tr><td colspan="2">
- <table class="grid">
- <tr>
- <td>
- <label>Private:</label>
- ${private_check_box.get_html( disabled=True )}
- <div class="toolParamHelp" style="clear: both;">
- A private review can be accessed only by the owner of the repository and authorized repository reviewers.
- </div>
- <div style="clear: both"></div>
- </td>
- </tr>
- %if component_review.comment:
+ %if can_browse:
+ <table class="grid"><tr><td>
- <div overflow-wrap:normal;overflow:hidden;word-break:keep-all;word-wrap:break-word;line-break:strict;>
- ${ escape_html_add_breaks( component_review.comment ) }
+ <label>Private:</label>
+ ${private_check_box.get_html( disabled=True )}
+ <div class="toolParamHelp" style="clear: both;">
+ A private review can be accessed only by the owner of the repository and authorized repository reviewers.
</div>
+ <div style="clear: both"></div></td></tr>
- %endif
- <tr>
- <td>
- <label>Approved:</label>
- ${component_review.approved | h}
- <div style="clear: both"></div>
- </td>
- </tr>
- <tr>
- <td>
- <label>Rating:</label>
- ${render_star_rating( rating_name, component_review.rating, disabled=True )}
- <div style="clear: both"></div>
- </td>
- </tr>
- </table>
+ %if component_review.comment:
+ <tr>
+ <td>
+ <div overflow-wrap:normal;overflow:hidden;word-break:keep-all;word-wrap:break-word;line-break:strict;>
+ ${ escape_html_add_breaks( component_review.comment ) }
+ </div>
+ </td>
+ </tr>
+ %endif
+ <tr>
+ <td>
+ <label>Approved:</label>
+ ${component_review.approved | h}
+ <div style="clear: both"></div>
+ </td>
+ </tr>
+ <tr>
+ <td>
+ <label>Rating:</label>
+ ${render_star_rating( rating_name, component_review.rating, disabled=True )}
+ <div style="clear: both"></div>
+ </td>
+ </tr>
+ </table>
+ %else:
+ You are not authorized to access the review of this component since it has been marked private.
+ %endif
</td></tr>
%endfor
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
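
The heart of this template change is a single authorization check performed before any review widgets are built. A minimal Python sketch of that branch, with render_review_table standing in for the grid the Mako template assembles for authorized viewers (names here are illustrative, not the template's actual helpers):

    def render_review_table(component_review):
        # Stand-in for the table of Private flag, comment, approval and rating
        # that browse_review.mako renders for authorized viewers.
        return "private=%s approved=%s rating=%s" % (
            component_review.private, component_review.approved, component_review.rating)

    def render_component_review(security_agent, component_review, user):
        # Mirrors the can_browse branch added above: ask the security agent
        # first, and only expose the review details when it says yes.
        if security_agent.user_can_browse_component_review(component_review, user):
            return render_review_table(component_review)
        return ("You are not authorized to access the review of this component "
                "since it has been marked private.")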
commit/galaxy-central: greg: Fix for determing a Checkbox setting when reviewing repositories.
by Bitbucket 24 Oct '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/ae5b674b4342/
changeset: ae5b674b4342
user: greg
date: 2012-10-24 17:27:16
summary: Fix for determing a Checkbox setting when reviewing repositories.
affected #: 1 file
diff -r a239f666d5ba8ce3f24198e7054fdd03ab60ada0 -r ae5b674b4342a87863a17cea8ad3a46adef2b51f lib/galaxy/webapps/community/controllers/repository_review.py
--- a/lib/galaxy/webapps/community/controllers/repository_review.py
+++ b/lib/galaxy/webapps/community/controllers/repository_review.py
@@ -509,7 +509,7 @@
elif component_review_attr == 'comment':
comment = str( v )
elif component_review_attr == 'private':
- private = CheckboxField.is_checked( str( v ) )
+ private = CheckboxField.is_checked( v )
elif component_review_attr == 'approved':
approved = str( v )
elif component_review_attr == 'rating':
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
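
The fix above stops coercing the posted value to a string before asking whether the checkbox is checked. A plausible reason, sketched below with a self-contained helper (an illustration of the general pitfall, not Galaxy's actual CheckboxField.is_checked), is that the posted value may already be a list or boolean, and str() of such a value never matches the expected markers:

    def is_checked(value):
        # A checked HTML checkbox typically posts a marker such as "true";
        # when the checkbox is paired with a hidden companion field, the
        # handler may instead receive a list containing several values.
        if value is True:
            return True
        if isinstance(value, list):
            return any(is_checked(v) for v in value)
        return str(value).lower() in ("true", "yes", "on")

    print(is_checked("true"))            # True  - plain posted string
    print(is_checked(["true", "true"]))  # True  - checkbox plus hidden field
    print(is_checked(str(["true"])))     # False - str() of a list never matches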
commit/galaxy-central: greg: Enhance the tool shed security agent to enable restriction of component reviews that have been marked private.
by Bitbucket 24 Oct '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/a239f666d5ba/
changeset: a239f666d5ba
user: greg
date: 2012-10-24 17:26:00
summary: Enhance the tool shed security agent to enable restriction of component reviews that have been marked private.
affected #: 1 file
diff -r 409fec0598f944d30f69e9faaac193b71d6541b8 -r a239f666d5ba8ce3f24198e7054fdd03ab60ada0 lib/galaxy/webapps/community/security/__init__.py
--- a/lib/galaxy/webapps/community/security/__init__.py
+++ b/lib/galaxy/webapps/community/security/__init__.py
@@ -170,6 +170,21 @@
repository_reviewer_role = self.get_repository_reviewer_role()
return repository_reviewer_role and repository_reviewer_role in roles
return False
+ def user_can_browse_component_review( self, component_review, user ):
+ if component_review and user:
+ if component_review.private:
+ if self.user_can_review_repositories( user ):
+ # Reviewers can access private component reviews.
+ return True
+ repository_review = component_review.repository_review
+ repository = repository_review.repository
+ if repository.user == user:
+ # The repository owner can access private component reviews.
+ return True
+ return False
+ # The component_review is not marked private.
+ return True
+ return False
def get_permitted_actions( filter=None ):
'''Utility method to return a subset of RBACAgent's permitted actions'''
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
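
The access rule this method encodes can be exercised on its own with stand-in models (the namedtuples below are simplified placeholders for the tool shed's User, Repository, RepositoryReview and ComponentReview models, and the is_reviewer flag stands in for user_can_review_repositories):

    from collections import namedtuple

    User = namedtuple("User", "username is_reviewer")
    Repository = namedtuple("Repository", "user")
    RepositoryReview = namedtuple("RepositoryReview", "repository")
    ComponentReview = namedtuple("ComponentReview", "repository_review private")

    def user_can_browse_component_review(component_review, user):
        # Same decision tree as the security agent method above: private
        # component reviews are visible only to reviewers and the owner of
        # the reviewed repository; public ones are visible to any user.
        if component_review and user:
            if component_review.private:
                if user.is_reviewer:
                    return True
                if component_review.repository_review.repository.user == user:
                    return True
                return False
            return True
        return False

    owner = User("alice", False)
    reviewer = User("carol", True)
    outsider = User("bob", False)
    private_review = ComponentReview(RepositoryReview(Repository(owner)), True)
    print(user_can_browse_component_review(private_review, owner))     # True
    print(user_can_browse_component_review(private_review, reviewer))  # True
    print(user_can_browse_component_review(private_review, outsider))  # False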
24 Oct '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/409fec0598f9/
changeset: 409fec0598f9
user: inithello
date: 2012-10-24 17:09:20
summary: Migrate Picard tools to the tool shed.
affected #: 82 files
Diff too large to display.
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
3 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/489e80f86187/
changeset: 489e80f86187
user: jmchilton
date: 2012-09-26 04:49:22
summary: Ease tool development by reducing clicking/keystorkes required by developers to repeatedly reload the same tool via admin console.
affected #: 2 files
diff -r 20144ac9ffa7d15a9738cee2a5c8cb2b4cf1c401 -r 489e80f8618766c5ea3d655adaaea5e6b13d3bc2 lib/galaxy/web/base/controller.py
--- a/lib/galaxy/web/base/controller.py
+++ b/lib/galaxy/web/base/controller.py
@@ -1578,10 +1578,12 @@
message = util.restore_text( params.get( 'message', '' ) )
status = params.get( 'status', 'done' )
toolbox = self.app.toolbox
+ tool_id = None
if params.get( 'reload_tool_button', False ):
tool_id = params.tool_id
message, status = toolbox.reload_tool_by_id( tool_id )
return trans.fill_template( '/admin/reload_tool.mako',
+ tool_id=tool_id,
toolbox=toolbox,
message=message,
status=status )
diff -r 20144ac9ffa7d15a9738cee2a5c8cb2b4cf1c401 -r 489e80f8618766c5ea3d655adaaea5e6b13d3bc2 templates/admin/reload_tool.mako
--- a/templates/admin/reload_tool.mako
+++ b/templates/admin/reload_tool.mako
@@ -1,6 +1,17 @@
<%inherit file="/base.mako"/><%namespace file="/message.mako" import="render_msg" />
+<script type="text/javascript">
+$().ready(function() {
+%if tool_id:
+ var focus_el = $("input[name=reload_tool_button]");
+%else:
+ var focus_el = $("select[name=tool_id]");
+%endif
+ focus_el.focus();
+});
+</script>
+
%if message:
${render_msg( message, status )}
%endif
@@ -22,7 +33,11 @@
<% section = val %>
%for section_key, section_val in section.elems.items():
%if section_key.startswith( 'tool' ):
- <option value="${section_val.id}">${section_val.name}</option>
+ <% selected_str = "" %>
+ %if section_val.id == tool_id:
+ <% selected_str = " selected=\"selected\"" %>
+ %endif
+ <option value="${section_val.id}"${selected_str}>${section_val.name}</option>
%endif
%endfor
%endif
https://bitbucket.org/galaxy/galaxy-central/changeset/a0b1af3a160f/
changeset: a0b1af3a160f
user: jmchilton
date: 2012-09-29 03:15:56
summary: Merge with galaxy-central changes.
affected #: 232 files
Diff too large to display.
https://bitbucket.org/galaxy/galaxy-central/changeset/869bcca8756a/
changeset: 869bcca8756a
user: dannon
date: 2012-10-24 16:54:37
summary: Merged in jmchilton/galaxy-central-reload-tool-improvements (pull request #72)
affected #: 3 files
diff -r 10342b1b167a6f7290b8f3dc1d00c2d2b77e046f -r 869bcca8756a32b0b2c482f1ea67623cce110051 lib/galaxy/web/base/controllers/admin.py
--- a/lib/galaxy/web/base/controllers/admin.py
+++ b/lib/galaxy/web/base/controllers/admin.py
@@ -56,10 +56,12 @@
message = util.restore_text( params.get( 'message', '' ) )
status = params.get( 'status', 'done' )
toolbox = self.app.toolbox
+ tool_id = None
if params.get( 'reload_tool_button', False ):
tool_id = params.tool_id
message, status = toolbox.reload_tool_by_id( tool_id )
return trans.fill_template( '/admin/reload_tool.mako',
+ tool_id=tool_id,
toolbox=toolbox,
message=message,
status=status )
@@ -1110,4 +1112,4 @@
# Load user from database
id = trans.security.decode_id( id )
quota = trans.sa_session.query( trans.model.Quota ).get( id )
- return quota
\ No newline at end of file
+ return quota
diff -r 10342b1b167a6f7290b8f3dc1d00c2d2b77e046f -r 869bcca8756a32b0b2c482f1ea67623cce110051 templates/admin/reload_tool.mako
--- a/templates/admin/reload_tool.mako
+++ b/templates/admin/reload_tool.mako
@@ -1,6 +1,17 @@
<%inherit file="/base.mako"/><%namespace file="/message.mako" import="render_msg" />
+<script type="text/javascript">
+$().ready(function() {
+%if tool_id:
+ var focus_el = $("input[name=reload_tool_button]");
+%else:
+ var focus_el = $("select[name=tool_id]");
+%endif
+ focus_el.focus();
+});
+</script>
+
%if message:
${render_msg( message, status )}
%endif
@@ -22,7 +33,11 @@
<% section = val %>
%for section_key, section_val in section.elems.items():
%if section_key.startswith( 'tool' ):
- <option value="${section_val.id}">${section_val.name}</option>
+ <% selected_str = "" %>
+ %if section_val.id == tool_id:
+ <% selected_str = " selected=\"selected\"" %>
+ %endif
+ <option value="${section_val.id}"${selected_str}>${section_val.name}</option>
%endif
%endfor
%endif
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
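
The net effect of the template change is that, after a reload, the tool select list is re-rendered with the just-reloaded tool preselected and keyboard focus lands on the reload button. A small sketch of the preselection half, using illustrative names rather than the real Mako markup:

    def render_tool_options(tools, reloaded_tool_id=None):
        # Re-render the option list with the tool that was just reloaded
        # preselected, so an admin can reload the same tool repeatedly
        # without picking it from the menu again.
        options = []
        for tool_id, tool_name in tools:
            selected = ' selected="selected"' if tool_id == reloaded_tool_id else ""
            options.append('<option value="%s"%s>%s</option>' % (tool_id, selected, tool_name))
        return "\n".join(options)

    print(render_tool_options([("cat1", "Concatenate"), ("sort1", "Sort")],
                              reloaded_tool_id="sort1"))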
11 new commits in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/8220718b1eeb/
changeset: 8220718b1eeb
user: jmchilton
date: 2012-10-08 01:22:49
summary: Fix running tools with multi-data parameters. (Extracting and running workflows still broken.)
affected #: 2 files
diff -r 05fc04a70a3bcbfaeedfcf6f2ec16a7e38fc7c94 -r 8220718b1eebdcb1257a9520f9aa9d23f7b2d3e4 lib/galaxy/tools/__init__.py
--- a/lib/galaxy/tools/__init__.py
+++ b/lib/galaxy/tools/__init__.py
@@ -2224,6 +2224,13 @@
values = input_values[ input.name ]
current = values["__current_case__"]
wrap_values( input.cases[current].inputs, values )
+ elif isinstance( input, DataToolParameter ) and input.multiple:
+ values = input_values[ input.name ]
+ input_values[ input.name ] = \
+ [DatasetFilenameWrapper( value,
+ datatypes_registry = self.app.datatypes_registry,
+ tool = self,
+ name = input.name ) for value in values]
elif isinstance( input, DataToolParameter ):
## FIXME: We're populating param_dict with conversions when
## wrapping values, this should happen as a separate
diff -r 05fc04a70a3bcbfaeedfcf6f2ec16a7e38fc7c94 -r 8220718b1eebdcb1257a9520f9aa9d23f7b2d3e4 lib/galaxy/tools/parameters/basic.py
--- a/lib/galaxy/tools/parameters/basic.py
+++ b/lib/galaxy/tools/parameters/basic.py
@@ -1545,9 +1545,30 @@
else:
return trans.sa_session.query( trans.app.model.HistoryDatasetAssociation ).get( value )
+
+ # TODO: Determine if this needs to be overridden. -John
+ def value_from_basic( self, value, app, ignore_errors=False ):
+ # HACK: Some things don't deal with unicode well, psycopg problem?
+ if type( value ) == unicode:
+ value = str( value )
+ # Handle Runtime values (valid for any parameter?)
+ if isinstance( value, dict ) and '__class__' in value and value['__class__'] == "RuntimeValue":
+ return RuntimeValue()
+ # Delegate to the 'to_python' method
+ if ignore_errors:
+ try:
+ return self.to_python( value, app )
+ except:
+ return value
+ else:
+ return self.to_python( value, app )
+
+
def to_string( self, value, app ):
if value is None or isinstance( value, str ):
return value
+ elif isinstance( value, list ):
+ return ",".join( [ val if isinstance( val, str ) else str(val.id) for val in value] )
elif isinstance( value, DummyDataset ):
return None
return value.id
@@ -1557,6 +1578,10 @@
# indicates that the dataset is optional, while '' indicates that it is not.
if value is None or value == '' or value == 'None':
return value
+ if isinstance(value, str) and value.find(",") > -1:
+ values = value.split(",")
+ # TODO: Optimize. -John
+ return [app.model.context.query( app.model.HistoryDatasetAssociation ).get( int( val ) ) for val in values]
return app.model.context.query( app.model.HistoryDatasetAssociation ).get( int( value ) )
def to_param_dict_string( self, value, other_values={} ):
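
The to_string/to_python changes above give multi-input data parameters a simple persistence format: a single dataset is stored as its id, a list of datasets as comma-separated ids. A self-contained sketch of that round trip (FakeHDA and the lookup callable stand in for HistoryDatasetAssociation and the SQLAlchemy query):

    def datasets_to_string(value):
        # Single dataset -> "7"; list of datasets -> "7,8,9".
        if isinstance(value, list):
            return ",".join(str(v.id) for v in value)
        return str(value.id)

    def datasets_from_string(value, lookup):
        # Reverse of datasets_to_string: split on commas and look each id up.
        if "," in value:
            return [lookup(int(v)) for v in value.split(",")]
        return lookup(int(value))

    class FakeHDA(object):
        def __init__(self, id):
            self.id = id

    hdas = {1: FakeHDA(1), 2: FakeHDA(2)}
    serialized = datasets_to_string([hdas[1], hdas[2]])
    print(serialized)                                                  # 1,2
    print([d.id for d in datasets_from_string(serialized, hdas.get)])  # [1, 2]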
https://bitbucket.org/galaxy/galaxy-central/changeset/4a3d4de46132/
changeset: 4a3d4de46132
user: jmchilton
date: 2012-10-08 01:29:54
summary: Fix extracting workflows containing tools with multi-input parameters.
affected #: 1 file
diff -r 8220718b1eebdcb1257a9520f9aa9d23f7b2d3e4 -r 4a3d4de46132de8e353b48640512fdefab7ea7ca lib/galaxy/webapps/galaxy/controllers/workflow.py
--- a/lib/galaxy/webapps/galaxy/controllers/workflow.py
+++ b/lib/galaxy/webapps/galaxy/controllers/workflow.py
@@ -2054,7 +2054,11 @@
# still need to clean them up so we can serialize
# if not( prefix ):
if tmp: #this is false for a non-set optional dataset
- associations.append( ( tmp.hid, prefix + key ) )
+ if not isinstance(tmp, list):
+ associations.append( ( tmp.hid, prefix + key ) )
+ else:
+ associations.extend( [ (t.hid, prefix + key) for t in tmp] )
+
# Cleanup the other deprecated crap associated with datasets
# as well. Worse, for nested datasets all the metadata is
# being pushed into the root. FIXME: MUST REMOVE SOON
https://bitbucket.org/galaxy/galaxy-central/changeset/65024c60d545/
changeset: 65024c60d545
user: jmchilton
date: 2012-10-08 02:36:46
summary: Progress toward multi-inputs: initial run workflow input page renders, workflows still do not execute, standard workflows seem to continue to work.
affected #: 3 files
diff -r 4a3d4de46132de8e353b48640512fdefab7ea7ca -r 65024c60d5452cad3d410b41046a472ba3109e41 lib/galaxy/tools/parameters/basic.py
--- a/lib/galaxy/tools/parameters/basic.py
+++ b/lib/galaxy/tools/parameters/basic.py
@@ -1567,10 +1567,12 @@
def to_string( self, value, app ):
if value is None or isinstance( value, str ):
return value
+ elif isinstance( value, DummyDataset ):
+ return None
+ elif isinstance( value, list) and len(value) > 0 and isinstance( value[0], DummyDataset):
+ return None
elif isinstance( value, list ):
return ",".join( [ val if isinstance( val, str ) else str(val.id) for val in value] )
- elif isinstance( value, DummyDataset ):
- return None
return value.id
def to_python( self, value, app ):
diff -r 4a3d4de46132de8e353b48640512fdefab7ea7ca -r 65024c60d5452cad3d410b41046a472ba3109e41 lib/galaxy/webapps/galaxy/controllers/workflow.py
--- a/lib/galaxy/webapps/galaxy/controllers/workflow.py
+++ b/lib/galaxy/webapps/galaxy/controllers/workflow.py
@@ -1362,6 +1362,13 @@
# Connections by input name
step.input_connections_by_name = \
dict( ( conn.input_name, conn ) for conn in step.input_connections )
+ input_connections_by_name = {}
+ for conn in step.input_connections:
+ input_name = conn.input_name
+ if not input_name in input_connections_by_name:
+ input_connections_by_name[input_name] = []
+ input_connections_by_name[input_name].append(conn)
+ step.input_connections_by_name = input_connections_by_name
# Extract just the arguments for this step by prefix
p = "%s|" % step.id
l = len(p)
@@ -1419,10 +1426,15 @@
tool = trans.app.toolbox.get_tool( step.tool_id )
# Connect up
def callback( input, value, prefixed_name, prefixed_label ):
+ replacement = None
if isinstance( input, DataToolParameter ):
if prefixed_name in step.input_connections_by_name:
conn = step.input_connections_by_name[ prefixed_name ]
- return outputs[ conn.output_step.id ][ conn.output_name ]
+ if input.multiple:
+ replacements = [outputs[ c.output_step.id ][ c.output_name ] for c in conn]
+ else:
+ replacement = outputs[ conn[0].output_step.id ][ conn[0].output_name ]
+ return replacement
try:
visit_input_values( tool.inputs, step.state.inputs, callback )
except KeyError, k:
diff -r 4a3d4de46132de8e353b48640512fdefab7ea7ca -r 65024c60d5452cad3d410b41046a472ba3109e41 lib/galaxy/workflow/modules.py
--- a/lib/galaxy/workflow/modules.py
+++ b/lib/galaxy/workflow/modules.py
@@ -336,9 +336,14 @@
# Any connected input needs to have value DummyDataset (these
# are not persisted so we need to do it every time)
def callback( input, value, prefixed_name, prefixed_label ):
+ replacement = None
if isinstance( input, DataToolParameter ):
if connections is None or prefixed_name in input_connections_by_name:
- return DummyDataset()
+ if input.multiple:
+ replacement = [DummyDataset() for conn in connections]
+ else:
+ replacement = DummyDataset()
+ return replacement
visit_input_values( self.tool.inputs, self.state.inputs, callback )
class WorkflowModuleFactory( object ):
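
The input_connections_by_name change above switches from one connection per input name to a list of connections per name, which is what lets a parameter with multiple=True receive several upstream datasets. The grouping loop is equivalent to a defaultdict-based version (Conn below is a stand-in for WorkflowStepConnection):

    from collections import defaultdict, namedtuple

    Conn = namedtuple("Conn", "input_name output_name")

    def connections_by_name(input_connections):
        # Group every connection under its input name so that multi-input
        # parameters can carry more than one upstream dataset.
        grouped = defaultdict(list)
        for conn in input_connections:
            grouped[conn.input_name].append(conn)
        return dict(grouped)

    conns = [Conn("input1", "out_file1"), Conn("input1", "out_file2"), Conn("query", "out_file1")]
    print(connections_by_name(conns))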
https://bitbucket.org/galaxy/galaxy-central/changeset/565510047933/
changeset: 565510047933
user: jmchilton
date: 2012-10-08 02:49:02
summary: Cleanup some artifacts from last few multi-input commits.
affected #: 2 files
diff -r 65024c60d5452cad3d410b41046a472ba3109e41 -r 56551004793378fb618fcfc0214669f307dea594 lib/galaxy/tools/parameters/basic.py
--- a/lib/galaxy/tools/parameters/basic.py
+++ b/lib/galaxy/tools/parameters/basic.py
@@ -1545,25 +1545,6 @@
else:
return trans.sa_session.query( trans.app.model.HistoryDatasetAssociation ).get( value )
-
- # TODO: Determine if this needs to be overridden. -John
- def value_from_basic( self, value, app, ignore_errors=False ):
- # HACK: Some things don't deal with unicode well, psycopg problem?
- if type( value ) == unicode:
- value = str( value )
- # Handle Runtime values (valid for any parameter?)
- if isinstance( value, dict ) and '__class__' in value and value['__class__'] == "RuntimeValue":
- return RuntimeValue()
- # Delegate to the 'to_python' method
- if ignore_errors:
- try:
- return self.to_python( value, app )
- except:
- return value
- else:
- return self.to_python( value, app )
-
-
def to_string( self, value, app ):
if value is None or isinstance( value, str ):
return value
diff -r 65024c60d5452cad3d410b41046a472ba3109e41 -r 56551004793378fb618fcfc0214669f307dea594 lib/galaxy/webapps/galaxy/controllers/workflow.py
--- a/lib/galaxy/webapps/galaxy/controllers/workflow.py
+++ b/lib/galaxy/webapps/galaxy/controllers/workflow.py
@@ -1360,8 +1360,6 @@
for step in workflow.steps:
step.upgrade_messages = {}
# Connections by input name
- step.input_connections_by_name = \
- dict( ( conn.input_name, conn ) for conn in step.input_connections )
input_connections_by_name = {}
for conn in step.input_connections:
input_name = conn.input_name
https://bitbucket.org/galaxy/galaxy-central/changeset/709ce65ca6b2/
changeset: 709ce65ca6b2
user: jmchilton
date: 2012-10-08 03:19:40
summary: Fix a typo, now workflows contain multi-dataset inputs run.
affected #: 1 file
diff -r 56551004793378fb618fcfc0214669f307dea594 -r 709ce65ca6b29b15e89a0913546ffd67e1d3c19d lib/galaxy/webapps/galaxy/controllers/workflow.py
--- a/lib/galaxy/webapps/galaxy/controllers/workflow.py
+++ b/lib/galaxy/webapps/galaxy/controllers/workflow.py
@@ -1429,11 +1429,12 @@
if prefixed_name in step.input_connections_by_name:
conn = step.input_connections_by_name[ prefixed_name ]
if input.multiple:
- replacements = [outputs[ c.output_step.id ][ c.output_name ] for c in conn]
+ replacement = [outputs[ c.output_step.id ][ c.output_name ] for c in conn]
else:
replacement = outputs[ conn[0].output_step.id ][ conn[0].output_name ]
return replacement
try:
+ # Replace DummyDatasets with historydatasetassociations
visit_input_values( tool.inputs, step.state.inputs, callback )
except KeyError, k:
error( "Error due to input mapping of '%s' in '%s'. A common cause of this is conditional outputs that cannot be determined until runtime, please review your workflow." % (tool.name, k.message))
https://bitbucket.org/galaxy/galaxy-central/changeset/bbeb3f333beb/
changeset: bbeb3f333beb
user: jmchilton
date: 2012-10-12 19:01:06
summary: Fix from Jim Johnson for rerunning jobs with multi-input data parameters.
affected #: 1 file
diff -r 709ce65ca6b29b15e89a0913546ffd67e1d3c19d -r bbeb3f333beb6f09c7a32ef9299c362f999ff79b lib/galaxy/webapps/galaxy/controllers/tool_runner.py
--- a/lib/galaxy/webapps/galaxy/controllers/tool_runner.py
+++ b/lib/galaxy/webapps/galaxy/controllers/tool_runner.py
@@ -199,6 +199,12 @@
if isinstance( value, UnvalidatedValue ):
return str( value )
if isinstance( input, DataToolParameter ):
+ if isinstance(value,list):
+ values = []
+ for val in value:
+ if val not in history.datasets and val in hda_source_dict:
+ values.append( hda_source_dict[ val ])
+ return values
if value not in history.datasets and value in hda_source_dict:
return hda_source_dict[ value ]
visit_input_values( tool.inputs, params_objects, rerun_callback )
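
The intent of the rerun fix above is that datasets no longer in the current history get swapped for their counterparts via hda_source_dict, and that this now also works when the parameter value is a list. A rough sketch of that remapping with plain dictionaries as stand-ins (not a line-for-line copy of the patch):

    def remap_for_rerun(value, history_datasets, hda_source_dict):
        # history_datasets: datasets present in the current history.
        # hda_source_dict: maps stale datasets to their current counterparts.
        if isinstance(value, list):
            return [hda_source_dict.get(v, v) if v not in history_datasets else v
                    for v in value]
        if value not in history_datasets and value in hda_source_dict:
            return hda_source_dict[value]
        return value

    history = {"hda_2"}
    sources = {"hda_1": "hda_9"}
    print(remap_for_rerun(["hda_1", "hda_2"], history, sources))  # ['hda_9', 'hda_2']
    print(remap_for_rerun("hda_1", history, sources))             # hda_9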
https://bitbucket.org/galaxy/galaxy-central/changeset/db6cabd6760c/
changeset: db6cabd6760c
user: jmchilton
date: 2012-10-16 16:13:32
summary: Initial work from JJ on enhancing the workflow editor to allow multiple input data parameters. Workflows containing such connections will now display properly, though they cannot be modified.
affected #: 3 files
diff -r bbeb3f333beb6f09c7a32ef9299c362f999ff79b -r db6cabd6760c09f9a5b49aed7e38e4ef991fee97 lib/galaxy/webapps/galaxy/controllers/workflow.py
--- a/lib/galaxy/webapps/galaxy/controllers/workflow.py
+++ b/lib/galaxy/webapps/galaxy/controllers/workflow.py
@@ -741,12 +741,14 @@
}
# Connections
input_connections = step.input_connections
+ multiple_input = {} # Boolean value indicating if this can be mutliple
if step.type is None or step.type == 'tool':
# Determine full (prefixed) names of valid input datasets
data_input_names = {}
def callback( input, value, prefixed_name, prefixed_label ):
if isinstance( input, DataToolParameter ):
data_input_names[ prefixed_name ] = True
+ multiple_input[input.name] = input.multiple
visit_input_values( module.tool.inputs, module.state.inputs, callback )
# Filter
# FIXME: this removes connection without displaying a message currently!
@@ -766,8 +768,14 @@
# Encode input connections as dictionary
input_conn_dict = {}
for conn in input_connections:
- input_conn_dict[ conn.input_name ] = \
- dict( id=conn.output_step.order_index, output_name=conn.output_name )
+ conn_dict = dict( id=conn.output_step.order_index, output_name=conn.output_name )
+ if conn.input_name in multiple_input:
+ if conn.input_name in input_conn_dict:
+ input_conn_dict[ conn.input_name ].append( conn_dict )
+ else:
+ input_conn_dict[ conn.input_name ] = [ conn_dict ]
+ else:
+ input_conn_dict[ conn.input_name ] = conn_dict
step_dict['input_connections'] = input_conn_dict
# Position
step_dict['position'] = step.position
@@ -832,13 +840,15 @@
# Second pass to deal with connections between steps
for step in steps:
# Input connections
- for input_name, conn_dict in step.temp_input_connections.iteritems():
- if conn_dict:
- conn = model.WorkflowStepConnection()
- conn.input_step = step
- conn.input_name = input_name
- conn.output_name = conn_dict['output_name']
- conn.output_step = steps_by_external_id[ conn_dict['id'] ]
+ for input_name, conns in step.temp_input_connections.iteritems():
+ if conns:
+ conn_dicts = conns if isinstance(conns,list) else [conns]
+ for conn_dict in conn_dicts:
+ conn = model.WorkflowStepConnection()
+ conn.input_step = step
+ conn.input_name = input_name
+ conn.output_name = conn_dict['output_name']
+ conn.output_step = steps_by_external_id[ conn_dict['id'] ]
del step.temp_input_connections
# Order the steps if possible
attach_ordered_steps( workflow, steps )
diff -r bbeb3f333beb6f09c7a32ef9299c362f999ff79b -r db6cabd6760c09f9a5b49aed7e38e4ef991fee97 lib/galaxy/workflow/modules.py
--- a/lib/galaxy/workflow/modules.py
+++ b/lib/galaxy/workflow/modules.py
@@ -263,6 +263,7 @@
data_inputs.append( dict(
name=prefixed_name,
label=prefixed_label,
+ multiple=input.multiple,
extensions=input.extensions ) )
visit_input_values( self.tool.inputs, self.state.inputs, callback )
return data_inputs
@@ -340,7 +341,7 @@
if isinstance( input, DataToolParameter ):
if connections is None or prefixed_name in input_connections_by_name:
if input.multiple:
- replacement = [DummyDataset() for conn in connections]
+ replacement = [] if not connections else [DummyDataset() for conn in connections]
else:
replacement = DummyDataset()
return replacement
diff -r bbeb3f333beb6f09c7a32ef9299c362f999ff79b -r db6cabd6760c09f9a5b49aed7e38e4ef991fee97 static/scripts/galaxy.workflow_editor.canvas.js
--- a/static/scripts/galaxy.workflow_editor.canvas.js
+++ b/static/scripts/galaxy.workflow_editor.canvas.js
@@ -34,16 +34,17 @@
OutputTerminal.prototype = new Terminal();
-function InputTerminal( element, datatypes ) {
+function InputTerminal( element, datatypes, multiple ) {
Terminal.call( this, element );
this.datatypes = datatypes;
+ this.multiple = multiple
}
InputTerminal.prototype = new Terminal();
$.extend( InputTerminal.prototype, {
can_accept: function ( other ) {
- if ( this.connectors.length < 1 ) {
+ if ( this.connectors.length < 1 || this.multiple) {
for ( var t in this.datatypes ) {
var cat_outputs = new Array();
cat_outputs = cat_outputs.concat(other.datatypes);
@@ -163,10 +164,10 @@
this.tool_errors = {};
}
$.extend( Node.prototype, {
- enable_input_terminal : function( elements, name, types ) {
+ enable_input_terminal : function( elements, name, types, multiple ) {
var node = this;
$(elements).each( function() {
- var terminal = this.terminal = new InputTerminal( this, types );
+ var terminal = this.terminal = new InputTerminal( this, types, multiple );
terminal.node = node;
terminal.name = name;
$(this).bind( "dropinit", function( e, d ) {
@@ -304,7 +305,7 @@
var ibox = $("<div class='inputs'></div>").appendTo( b );
$.each( data.data_inputs, function( i, input ) {
var t = $("<div class='terminal input-terminal'></div>");
- node.enable_input_terminal( t, input.name, input.extensions );
+ node.enable_input_terminal( t, input.name, input.extensions, input.multiple );
var ib = $("<div class='form-row dataRow input-data-row' name='" + input.name + "'>" + input.label + "</div>" );
ib.css({ position:'absolute',
left: -1000,
@@ -407,7 +408,7 @@
var old = old_body.find( "div.input-data-row");
$.each( data.data_inputs, function( i, input ) {
var t = $("<div class='terminal input-terminal'></div>");
- node.enable_input_terminal( t, input.name, input.extensions );
+ node.enable_input_terminal( t, input.name, input.extensions, input.multiple );
// If already connected save old connection
old_body.find( "div[name='" + input.name + "']" ).each( function() {
$(this).find( ".input-terminal" ).each( function() {
@@ -545,8 +546,10 @@
input_connections[ t.name ] = null;
// There should only be 0 or 1 connectors, so this is
// really a sneaky if statement
+ var cons = []
$.each( t.connectors, function ( i, c ) {
- input_connections[ t.name ] = { id: c.handle1.node.id, output_name: c.handle1.name };
+ cons[i] = { id: c.handle1.node.id, output_name: c.handle1.name };
+ input_connections[ t.name ] = cons;
});
});
var post_job_actions = {};
@@ -617,11 +620,21 @@
var node = wf.nodes[id];
$.each( step.input_connections, function( k, v ) {
if ( v ) {
- var other_node = wf.nodes[ v.id ];
- var c = new Connector();
- c.connect( other_node.output_terminals[ v.output_name ],
- node.input_terminals[ k ] );
- c.redraw();
+ if ($.isArray(v)) {
+ $.each( v, function (l,x ) {
+ var other_node = wf.nodes[ x.id ];
+ var c = new Connector();
+ c.connect( other_node.output_terminals[ x.output_name ],
+ node.input_terminals[ k ] );
+ c.redraw();
+ });
+ } else {
+ var other_node = wf.nodes[ v.id ];
+ var c = new Connector();
+ c.connect( other_node.output_terminals[ v.output_name ],
+ node.input_terminals[ k ] );
+ c.redraw();
+ }
}
});
if(using_workflow_outputs && node.type === 'tool'){
https://bitbucket.org/galaxy/galaxy-central/changeset/b9477d8f7050/
changeset: b9477d8f7050
user: jjohnson
date: 2012-10-16 18:47:57
summary: Fix workflow_editor exceptions for multi-input when dragging
affected #: 1 file
diff -r db6cabd6760c09f9a5b49aed7e38e4ef991fee97 -r b9477d8f7050cf90786e749ce096667a7cfc4725 static/scripts/galaxy.workflow_editor.canvas.js
--- a/static/scripts/galaxy.workflow_editor.canvas.js
+++ b/static/scripts/galaxy.workflow_editor.canvas.js
@@ -112,6 +112,9 @@
var relativeTop = function( e ) {
return $(e).offset().top - canvas_container.offset().top;
};
+ if (!this.handle1 || !this.handle2) {
+ return;
+ }
// Find the position of each handle
var start_x = relativeLeft( this.handle1.element ) + 5;
var start_y = relativeTop( this.handle1.element ) + 5;
@@ -175,9 +178,13 @@
// compatible type
return $(d.drag).hasClass( "output-terminal" ) && terminal.can_accept( d.drag.terminal );
}).bind( "dropstart", function( e, d ) {
- d.proxy.terminal.connectors[0].inner_color = "#BBFFBB";
+ if (d.proxy.terminal) {
+ d.proxy.terminal.connectors[0].inner_color = "#BBFFBB";
+ }
}).bind( "dropend", function ( e, d ) {
- d.proxy.terminal.connectors[0].inner_color = "#FFFFFF";
+ if (d.proxy.terminal) {
+ d.proxy.terminal.connectors[0].inner_color = "#FFFFFF";
+ }
}).bind( "drop", function( e, d ) {
( new Connector( d.drag.terminal, terminal ) ).redraw();
}).bind( "hover", function() {
@@ -191,7 +198,9 @@
$("<div class='buttons'></div>").append(
$("<img/>").attr("src", galaxy_paths.attributes.image_path + '/delete_icon.png').click( function() {
$.each( terminal.connectors, function( _, x ) {
- x.destroy();
+ if (x) {
+ x.destroy();
+ }
});
t.remove();
})))
https://bitbucket.org/galaxy/galaxy-central/changeset/eec42e84733b/
changeset: eec42e84733b
user: jmchilton
date: 2012-10-17 16:53:15
summary: Fix workflow imports/exports for multiple input data parameters.
affected #: 1 file
diff -r b9477d8f7050cf90786e749ce096667a7cfc4725 -r eec42e84733b60c98c388ebf3291636fe0cfb15a lib/galaxy/webapps/galaxy/controllers/workflow.py
--- a/lib/galaxy/webapps/galaxy/controllers/workflow.py
+++ b/lib/galaxy/webapps/galaxy/controllers/workflow.py
@@ -1761,9 +1761,22 @@
input_connections = [ conn for conn in input_connections if conn.input_name in data_input_names ]
# Encode input connections as dictionary
input_conn_dict = {}
- for conn in input_connections:
- input_conn_dict[ conn.input_name ] = \
- dict( id=conn.output_step.order_index, output_name=conn.output_name )
+ unique_input_names = set( [conn.input_name for conn in input_connections] )
+ for input_name in unique_input_names:
+ input_conn_dict[ input_name ] = \
+ [ dict( id=conn.output_step.order_index, output_name=conn.output_name ) for conn in input_connections if conn.input_name == input_name ]
+ # Preserve backward compatability. Previously Galaxy
+ # assumed input connections would be dictionaries not
+ # lists of dictionaries, so replace any singleton list
+ # with just the dictionary so that workflows exported from
+ # newer Galaxy instances can be used with older Galaxy
+ # instances if they do no include multiple input
+ # tools. This should be removed at some point. Mirrored
+ # hack in _workflow_from_dict should never be removed so
+ # existing workflow exports continue to function.
+ for input_name, input_conn in dict(input_conn_dict).iteritems():
+ if len(input_conn) == 1:
+ input_conn_dict[input_name] = input_conn[0]
step_dict['input_connections'] = input_conn_dict
# Position
step_dict['position'] = step.position
@@ -1828,8 +1841,12 @@
# Second pass to deal with connections between steps
for step in steps:
# Input connections
- for input_name, conn_dict in step.temp_input_connections.iteritems():
- if conn_dict:
+ for input_name, conn_list in step.temp_input_connections.iteritems():
+ if not conn_list:
+ continue
+ if not isinstance(conn_list, list): # Older style singleton connection
+ conn_list = [conn_list]
+ for conn_dict in conn_list:
conn = model.WorkflowStepConnection()
conn.input_step = step
conn.input_name = input_name
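
The backward-compatibility comment above boils down to two small transformations: on export, collapse a singleton connection list back to a bare dictionary so older Galaxy releases can still read the workflow; on import, accept either shape and normalize to a list. A self-contained sketch with plain dicts standing in for the exported step data:

    def export_input_connections(conns_by_name):
        # conns_by_name: {input_name: [ {id, output_name}, ... ]}
        # Collapse singleton lists to a bare dict for older Galaxy readers.
        return {name: (conns[0] if len(conns) == 1 else list(conns))
                for name, conns in conns_by_name.items()}

    def import_input_connections(exported):
        # Accept both the old dict-per-input form and the new list form,
        # and always hand back a list of connection dicts.
        return {name: (conns if isinstance(conns, list) else [conns])
                for name, conns in exported.items()}

    single = {"input1": [{"id": 0, "output_name": "out_file1"}]}
    multi = {"queries": [{"id": 0, "output_name": "out_file1"},
                         {"id": 1, "output_name": "out_file1"}]}
    print(export_input_connections(single))
    print(import_input_connections(export_input_connections(multi)))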
https://bitbucket.org/galaxy/galaxy-central/changeset/ba74363c65be/
changeset: ba74363c65be
user: dannon
date: 2012-10-24 15:00:30
summary: Merge
affected #: 6 files
diff -r 33db1925e701664785735160f3738dfde97fc2e6 -r ba74363c65bee1e90a4ae6346cbd7f2d3ff9a6b3 lib/galaxy/tools/__init__.py
--- a/lib/galaxy/tools/__init__.py
+++ b/lib/galaxy/tools/__init__.py
@@ -2245,6 +2245,13 @@
values = input_values[ input.name ]
current = values["__current_case__"]
wrap_values( input.cases[current].inputs, values )
+ elif isinstance( input, DataToolParameter ) and input.multiple:
+ values = input_values[ input.name ]
+ input_values[ input.name ] = \
+ [DatasetFilenameWrapper( value,
+ datatypes_registry = self.app.datatypes_registry,
+ tool = self,
+ name = input.name ) for value in values]
elif isinstance( input, DataToolParameter ):
## FIXME: We're populating param_dict with conversions when
## wrapping values, this should happen as a separate
diff -r 33db1925e701664785735160f3738dfde97fc2e6 -r ba74363c65bee1e90a4ae6346cbd7f2d3ff9a6b3 lib/galaxy/tools/parameters/basic.py
--- a/lib/galaxy/tools/parameters/basic.py
+++ b/lib/galaxy/tools/parameters/basic.py
@@ -1550,6 +1550,10 @@
return value
elif isinstance( value, DummyDataset ):
return None
+ elif isinstance( value, list) and len(value) > 0 and isinstance( value[0], DummyDataset):
+ return None
+ elif isinstance( value, list ):
+ return ",".join( [ val if isinstance( val, str ) else str(val.id) for val in value] )
return value.id
def to_python( self, value, app ):
@@ -1557,6 +1561,10 @@
# indicates that the dataset is optional, while '' indicates that it is not.
if value is None or value == '' or value == 'None':
return value
+ if isinstance(value, str) and value.find(",") > -1:
+ values = value.split(",")
+ # TODO: Optimize. -John
+ return [app.model.context.query( app.model.HistoryDatasetAssociation ).get( int( val ) ) for val in values]
return app.model.context.query( app.model.HistoryDatasetAssociation ).get( int( value ) )
def to_param_dict_string( self, value, other_values={} ):
diff -r 33db1925e701664785735160f3738dfde97fc2e6 -r ba74363c65bee1e90a4ae6346cbd7f2d3ff9a6b3 lib/galaxy/webapps/galaxy/controllers/tool_runner.py
--- a/lib/galaxy/webapps/galaxy/controllers/tool_runner.py
+++ b/lib/galaxy/webapps/galaxy/controllers/tool_runner.py
@@ -189,6 +189,12 @@
if isinstance( value, UnvalidatedValue ):
return str( value )
if isinstance( input, DataToolParameter ):
+ if isinstance(value,list):
+ values = []
+ for val in value:
+ if val not in history.datasets and val in hda_source_dict:
+ values.append( hda_source_dict[ val ])
+ return values
if value not in history.datasets and value in hda_source_dict:
return hda_source_dict[ value ]
visit_input_values( tool.inputs, params_objects, rerun_callback )
diff -r 33db1925e701664785735160f3738dfde97fc2e6 -r ba74363c65bee1e90a4ae6346cbd7f2d3ff9a6b3 lib/galaxy/webapps/galaxy/controllers/workflow.py
--- a/lib/galaxy/webapps/galaxy/controllers/workflow.py
+++ b/lib/galaxy/webapps/galaxy/controllers/workflow.py
@@ -741,12 +741,14 @@
}
# Connections
input_connections = step.input_connections
+ multiple_input = {} # Boolean value indicating if this can be mutliple
if step.type is None or step.type == 'tool':
# Determine full (prefixed) names of valid input datasets
data_input_names = {}
def callback( input, value, prefixed_name, prefixed_label ):
if isinstance( input, DataToolParameter ):
data_input_names[ prefixed_name ] = True
+ multiple_input[input.name] = input.multiple
visit_input_values( module.tool.inputs, module.state.inputs, callback )
# Filter
# FIXME: this removes connection without displaying a message currently!
@@ -766,8 +768,14 @@
# Encode input connections as dictionary
input_conn_dict = {}
for conn in input_connections:
- input_conn_dict[ conn.input_name ] = \
- dict( id=conn.output_step.order_index, output_name=conn.output_name )
+ conn_dict = dict( id=conn.output_step.order_index, output_name=conn.output_name )
+ if conn.input_name in multiple_input:
+ if conn.input_name in input_conn_dict:
+ input_conn_dict[ conn.input_name ].append( conn_dict )
+ else:
+ input_conn_dict[ conn.input_name ] = [ conn_dict ]
+ else:
+ input_conn_dict[ conn.input_name ] = conn_dict
step_dict['input_connections'] = input_conn_dict
# Position
step_dict['position'] = step.position
@@ -832,13 +840,15 @@
# Second pass to deal with connections between steps
for step in steps:
# Input connections
- for input_name, conn_dict in step.temp_input_connections.iteritems():
- if conn_dict:
- conn = model.WorkflowStepConnection()
- conn.input_step = step
- conn.input_name = input_name
- conn.output_name = conn_dict['output_name']
- conn.output_step = steps_by_external_id[ conn_dict['id'] ]
+ for input_name, conns in step.temp_input_connections.iteritems():
+ if conns:
+ conn_dicts = conns if isinstance(conns,list) else [conns]
+ for conn_dict in conn_dicts:
+ conn = model.WorkflowStepConnection()
+ conn.input_step = step
+ conn.input_name = input_name
+ conn.output_name = conn_dict['output_name']
+ conn.output_step = steps_by_external_id[ conn_dict['id'] ]
del step.temp_input_connections
# Order the steps if possible
attach_ordered_steps( workflow, steps )
@@ -1346,8 +1356,13 @@
for step in workflow.steps:
step.upgrade_messages = {}
# Connections by input name
- step.input_connections_by_name = \
- dict( ( conn.input_name, conn ) for conn in step.input_connections )
+ input_connections_by_name = {}
+ for conn in step.input_connections:
+ input_name = conn.input_name
+ if not input_name in input_connections_by_name:
+ input_connections_by_name[input_name] = []
+ input_connections_by_name[input_name].append(conn)
+ step.input_connections_by_name = input_connections_by_name
# Extract just the arguments for this step by prefix
p = "%s|" % step.id
l = len(p)
@@ -1405,11 +1420,17 @@
tool = trans.app.toolbox.get_tool( step.tool_id )
# Connect up
def callback( input, value, prefixed_name, prefixed_label ):
+ replacement = None
if isinstance( input, DataToolParameter ):
if prefixed_name in step.input_connections_by_name:
conn = step.input_connections_by_name[ prefixed_name ]
- return outputs[ conn.output_step.id ][ conn.output_name ]
+ if input.multiple:
+ replacement = [outputs[ c.output_step.id ][ c.output_name ] for c in conn]
+ else:
+ replacement = outputs[ conn[0].output_step.id ][ conn[0].output_name ]
+ return replacement
try:
+ # Replace DummyDatasets with historydatasetassociations
visit_input_values( tool.inputs, step.state.inputs, callback )
except KeyError, k:
error( "Error due to input mapping of '%s' in '%s'. A common cause of this is conditional outputs that cannot be determined until runtime, please review your workflow." % (tool.name, k.message))
@@ -1726,9 +1747,22 @@
input_connections = [ conn for conn in input_connections if conn.input_name in data_input_names ]
# Encode input connections as dictionary
input_conn_dict = {}
- for conn in input_connections:
- input_conn_dict[ conn.input_name ] = \
- dict( id=conn.output_step.order_index, output_name=conn.output_name )
+ unique_input_names = set( [conn.input_name for conn in input_connections] )
+ for input_name in unique_input_names:
+ input_conn_dict[ input_name ] = \
+ [ dict( id=conn.output_step.order_index, output_name=conn.output_name ) for conn in input_connections if conn.input_name == input_name ]
+ # Preserve backward compatability. Previously Galaxy
+ # assumed input connections would be dictionaries not
+ # lists of dictionaries, so replace any singleton list
+ # with just the dictionary so that workflows exported from
+ # newer Galaxy instances can be used with older Galaxy
+ # instances if they do no include multiple input
+ # tools. This should be removed at some point. Mirrored
+ # hack in _workflow_from_dict should never be removed so
+ # existing workflow exports continue to function.
+ for input_name, input_conn in dict(input_conn_dict).iteritems():
+ if len(input_conn) == 1:
+ input_conn_dict[input_name] = input_conn[0]
step_dict['input_connections'] = input_conn_dict
# Position
step_dict['position'] = step.position
@@ -1793,8 +1827,12 @@
# Second pass to deal with connections between steps
for step in steps:
# Input connections
- for input_name, conn_dict in step.temp_input_connections.iteritems():
- if conn_dict:
+ for input_name, conn_list in step.temp_input_connections.iteritems():
+ if not conn_list:
+ continue
+ if not isinstance(conn_list, list): # Older style singleton connection
+ conn_list = [conn_list]
+ for conn_dict in conn_list:
conn = model.WorkflowStepConnection()
conn.input_step = step
conn.input_name = input_name
@@ -2040,7 +2078,11 @@
# still need to clean them up so we can serialize
# if not( prefix ):
if tmp: #this is false for a non-set optional dataset
- associations.append( ( tmp.hid, prefix + key ) )
+ if not isinstance(tmp, list):
+ associations.append( ( tmp.hid, prefix + key ) )
+ else:
+ associations.extend( [ (t.hid, prefix + key) for t in tmp] )
+
# Cleanup the other deprecated crap associated with datasets
# as well. Worse, for nested datasets all the metadata is
# being pushed into the root. FIXME: MUST REMOVE SOON
diff -r 33db1925e701664785735160f3738dfde97fc2e6 -r ba74363c65bee1e90a4ae6346cbd7f2d3ff9a6b3 lib/galaxy/workflow/modules.py
--- a/lib/galaxy/workflow/modules.py
+++ b/lib/galaxy/workflow/modules.py
@@ -263,6 +263,7 @@
data_inputs.append( dict(
name=prefixed_name,
label=prefixed_label,
+ multiple=input.multiple,
extensions=input.extensions ) )
visit_input_values( self.tool.inputs, self.state.inputs, callback )
return data_inputs
@@ -336,9 +337,14 @@
# Any connected input needs to have value DummyDataset (these
# are not persisted so we need to do it every time)
def callback( input, value, prefixed_name, prefixed_label ):
+ replacement = None
if isinstance( input, DataToolParameter ):
if connections is None or prefixed_name in input_connections_by_name:
- return DummyDataset()
+ if input.multiple:
+ replacement = [] if not connections else [DummyDataset() for conn in connections]
+ else:
+ replacement = DummyDataset()
+ return replacement
visit_input_values( self.tool.inputs, self.state.inputs, callback )
class WorkflowModuleFactory( object ):
diff -r 33db1925e701664785735160f3738dfde97fc2e6 -r ba74363c65bee1e90a4ae6346cbd7f2d3ff9a6b3 static/scripts/galaxy.workflow_editor.canvas.js
--- a/static/scripts/galaxy.workflow_editor.canvas.js
+++ b/static/scripts/galaxy.workflow_editor.canvas.js
@@ -34,16 +34,17 @@
OutputTerminal.prototype = new Terminal();
-function InputTerminal( element, datatypes ) {
+function InputTerminal( element, datatypes, multiple ) {
Terminal.call( this, element );
this.datatypes = datatypes;
+ this.multiple = multiple
}
InputTerminal.prototype = new Terminal();
$.extend( InputTerminal.prototype, {
can_accept: function ( other ) {
- if ( this.connectors.length < 1 ) {
+ if ( this.connectors.length < 1 || this.multiple) {
for ( var t in this.datatypes ) {
var cat_outputs = new Array();
cat_outputs = cat_outputs.concat(other.datatypes);
@@ -111,6 +112,9 @@
var relativeTop = function( e ) {
return $(e).offset().top - canvas_container.offset().top;
};
+ if (!this.handle1 || !this.handle2) {
+ return;
+ }
// Find the position of each handle
var start_x = relativeLeft( this.handle1.element ) + 5;
var start_y = relativeTop( this.handle1.element ) + 5;
@@ -163,10 +167,10 @@
this.tool_errors = {};
}
$.extend( Node.prototype, {
- enable_input_terminal : function( elements, name, types ) {
+ enable_input_terminal : function( elements, name, types, multiple ) {
var node = this;
$(elements).each( function() {
- var terminal = this.terminal = new InputTerminal( this, types );
+ var terminal = this.terminal = new InputTerminal( this, types, multiple );
terminal.node = node;
terminal.name = name;
$(this).bind( "dropinit", function( e, d ) {
@@ -174,9 +178,13 @@
// compatible type
return $(d.drag).hasClass( "output-terminal" ) && terminal.can_accept( d.drag.terminal );
}).bind( "dropstart", function( e, d ) {
- d.proxy.terminal.connectors[0].inner_color = "#BBFFBB";
+ if (d.proxy.terminal) {
+ d.proxy.terminal.connectors[0].inner_color = "#BBFFBB";
+ }
}).bind( "dropend", function ( e, d ) {
- d.proxy.terminal.connectors[0].inner_color = "#FFFFFF";
+ if (d.proxy.terminal) {
+ d.proxy.terminal.connectors[0].inner_color = "#FFFFFF";
+ }
}).bind( "drop", function( e, d ) {
( new Connector( d.drag.terminal, terminal ) ).redraw();
}).bind( "hover", function() {
@@ -190,7 +198,9 @@
$("<div class='buttons'></div>").append(
$("<img/>").attr("src", galaxy_paths.attributes.image_path + '/delete_icon.png').click( function() {
$.each( terminal.connectors, function( _, x ) {
- x.destroy();
+ if (x) {
+ x.destroy();
+ }
});
t.remove();
})))
@@ -304,7 +314,7 @@
var ibox = $("<div class='inputs'></div>").appendTo( b );
$.each( data.data_inputs, function( i, input ) {
var t = $("<div class='terminal input-terminal'></div>");
- node.enable_input_terminal( t, input.name, input.extensions );
+ node.enable_input_terminal( t, input.name, input.extensions, input.multiple );
var ib = $("<div class='form-row dataRow input-data-row' name='" + input.name + "'>" + input.label + "</div>" );
ib.css({ position:'absolute',
left: -1000,
@@ -407,7 +417,7 @@
var old = old_body.find( "div.input-data-row");
$.each( data.data_inputs, function( i, input ) {
var t = $("<div class='terminal input-terminal'></div>");
- node.enable_input_terminal( t, input.name, input.extensions );
+ node.enable_input_terminal( t, input.name, input.extensions, input.multiple );
// If already connected save old connection
old_body.find( "div[name='" + input.name + "']" ).each( function() {
$(this).find( ".input-terminal" ).each( function() {
@@ -545,8 +555,10 @@
input_connections[ t.name ] = null;
// There should only be 0 or 1 connectors, so this is
// really a sneaky if statement
+ var cons = []
$.each( t.connectors, function ( i, c ) {
- input_connections[ t.name ] = { id: c.handle1.node.id, output_name: c.handle1.name };
+ cons[i] = { id: c.handle1.node.id, output_name: c.handle1.name };
+ input_connections[ t.name ] = cons;
});
});
var post_job_actions = {};
@@ -617,11 +629,21 @@
var node = wf.nodes[id];
$.each( step.input_connections, function( k, v ) {
if ( v ) {
- var other_node = wf.nodes[ v.id ];
- var c = new Connector();
- c.connect( other_node.output_terminals[ v.output_name ],
- node.input_terminals[ k ] );
- c.redraw();
+ if ($.isArray(v)) {
+ $.each( v, function (l,x ) {
+ var other_node = wf.nodes[ x.id ];
+ var c = new Connector();
+ c.connect( other_node.output_terminals[ x.output_name ],
+ node.input_terminals[ k ] );
+ c.redraw();
+ });
+ } else {
+ var other_node = wf.nodes[ v.id ];
+ var c = new Connector();
+ c.connect( other_node.output_terminals[ v.output_name ],
+ node.input_terminals[ k ] );
+ c.redraw();
+ }
}
});
if(using_workflow_outputs && node.type === 'tool'){
https://bitbucket.org/galaxy/galaxy-central/changeset/10342b1b167a/
changeset: 10342b1b167a
user: dannon
date: 2012-10-24 16:31:43
summary: Merge
affected #: 4 files
diff -r ba74363c65bee1e90a4ae6346cbd7f2d3ff9a6b3 -r 10342b1b167a6f7290b8f3dc1d00c2d2b77e046f lib/galaxy/web/framework/helpers/grids.py
--- a/lib/galaxy/web/framework/helpers/grids.py
+++ b/lib/galaxy/web/framework/helpers/grids.py
@@ -418,6 +418,13 @@
def sort( self, trans, query, ascending, column_name=None ):
"""Sort query using this column."""
return GridColumn.sort( self, trans, query, ascending, column_name=column_name )
+ def get_single_filter( self, user, a_filter ):
+ if self.key.find( '.' ) > -1:
+ a_key = self.key.split( '.' )[1]
+ else:
+ a_key = self.key
+ model_class_key_field = getattr( self.model_class, a_key )
+ return model_class_key_field == a_filter
class IntegerColumn( TextColumn ):
"""
diff -r ba74363c65bee1e90a4ae6346cbd7f2d3ff9a6b3 -r 10342b1b167a6f7290b8f3dc1d00c2d2b77e046f lib/galaxy/webapps/community/controllers/admin.py
--- a/lib/galaxy/webapps/community/controllers/admin.py
+++ b/lib/galaxy/webapps/community/controllers/admin.py
@@ -292,28 +292,25 @@
]
class AdminRepositoryGrid( RepositoryGrid ):
+ class DeletedColumn( grids.BooleanColumn ):
+ def get_value( self, trans, grid, repository ):
+ if repository.deleted:
+ return 'yes'
+ return ''
columns = [ RepositoryGrid.NameColumn( "Name",
key="name",
link=( lambda item: dict( operation="view_or_manage_repository", id=item.id ) ),
attach_popup=True ),
- RepositoryGrid.DescriptionColumn( "Synopsis",
- key="description",
- attach_popup=False ),
- RepositoryGrid.MetadataRevisionColumn( "Metadata Revisions" ),
RepositoryGrid.UserColumn( "Owner",
model_class=model.User,
link=( lambda item: dict( operation="repositories_by_user", id=item.id ) ),
attach_popup=False,
key="User.username" ),
- RepositoryGrid.EmailAlertsColumn( "Alert", attach_popup=False ),
- RepositoryGrid.DeprecatedColumn( "Deprecated", attach_popup=False ),
+ RepositoryGrid.DeprecatedColumn( "Deprecated", key="deprecated", attach_popup=False ),
# Columns that are valid for filtering but are not visible.
- grids.DeletedColumn( "Deleted",
- key="deleted",
- visible=False,
- filterable="advanced" ) ]
- columns.append( grids.MulticolFilterColumn( "Search repository name, description",
- cols_to_filter=[ columns[0], columns[1] ],
+ DeletedColumn( "Deleted", key="deleted", attach_popup=False ) ]
+ columns.append( grids.MulticolFilterColumn( "Search repository name",
+ cols_to_filter=[ columns[0] ],
key="free-text-search",
visible=False,
filterable="standard" ) )
@@ -327,6 +324,10 @@
condition=( lambda item: item.deleted ),
async_compatible=False ) )
standard_filters = []
+ default_filter = {}
+ def build_initial_query( self, trans, **kwd ):
+ return trans.sa_session.query( model.Repository ) \
+ .join( model.User.table )
class RepositoryMetadataGrid( grids.Grid ):
class IdColumn( grids.IntegerColumn ):
@@ -335,6 +336,9 @@
class NameColumn( grids.TextColumn ):
def get_value( self, trans, grid, repository_metadata ):
return repository_metadata.repository.name
+ class OwnerColumn( grids.TextColumn ):
+ def get_value( self, trans, grid, repository_metadata ):
+ return repository_metadata.repository.user.username
class RevisionColumn( grids.TextColumn ):
def get_value( self, trans, grid, repository_metadata ):
repository = repository_metadata.repository
@@ -343,42 +347,60 @@
return "%s:%s" % ( str( ctx.rev() ), repository_metadata.changeset_revision )
class ToolsColumn( grids.TextColumn ):
def get_value( self, trans, grid, repository_metadata ):
- tools_str = ''
+ tools_str = '0'
if repository_metadata:
metadata = repository_metadata.metadata
if metadata:
if 'tools' in metadata:
- for tool_metadata_dict in metadata[ 'tools' ]:
- tools_str += '%s <b>%s</b><br/>' % ( tool_metadata_dict[ 'id' ], tool_metadata_dict[ 'version' ] )
+ # We used to display the following, but grid was too cluttered.
+ #for tool_metadata_dict in metadata[ 'tools' ]:
+ # tools_str += '%s <b>%s</b><br/>' % ( tool_metadata_dict[ 'id' ], tool_metadata_dict[ 'version' ] )
+ return '%d' % len( metadata[ 'tools' ] )
return tools_str
class DatatypesColumn( grids.TextColumn ):
def get_value( self, trans, grid, repository_metadata ):
- datatypes_str = ''
+ datatypes_str = '0'
if repository_metadata:
metadata = repository_metadata.metadata
if metadata:
if 'datatypes' in metadata:
- for datatype_metadata_dict in metadata[ 'datatypes' ]:
- datatypes_str += '%s<br/>' % datatype_metadata_dict[ 'extension' ]
+ # We used to display the following, but grid was too cluttered.
+ #for datatype_metadata_dict in metadata[ 'datatypes' ]:
+ # datatypes_str += '%s<br/>' % datatype_metadata_dict[ 'extension' ]
+ return '%d' % len( metadata[ 'datatypes' ] )
return datatypes_str
class WorkflowsColumn( grids.TextColumn ):
def get_value( self, trans, grid, repository_metadata ):
- workflows_str = ''
+ workflows_str = '0'
if repository_metadata:
metadata = repository_metadata.metadata
if metadata:
if 'workflows' in metadata:
- workflows_str += '<b>Workflows:</b><br/>'
+ # We used to display the following, but grid was too cluttered.
+ #workflows_str += '<b>Workflows:</b><br/>'
# metadata[ 'workflows' ] is a list of tuples where each contained tuple is
# [ <relative path to the .ga file in the repository>, <exported workflow dict> ]
- workflow_tups = metadata[ 'workflows' ]
- workflow_metadata_dicts = [ workflow_tup[1] for workflow_tup in workflow_tups ]
- for workflow_metadata_dict in workflow_metadata_dicts:
- workflows_str += '%s<br/>' % workflow_metadata_dict[ 'name' ]
+ #workflow_tups = metadata[ 'workflows' ]
+ #workflow_metadata_dicts = [ workflow_tup[1] for workflow_tup in workflow_tups ]
+ #for workflow_metadata_dict in workflow_metadata_dicts:
+ # workflows_str += '%s<br/>' % workflow_metadata_dict[ 'name' ]
+ return '%d' % len( metadata[ 'workflows' ] )
return workflows_str
+ class DeletedColumn( grids.BooleanColumn ):
+ def get_value( self, trans, grid, repository_metadata ):
+ if repository_metadata.repository.deleted:
+ return 'yes'
+ return ''
+ class DeprecatedColumn( grids.BooleanColumn ):
+ def get_value( self, trans, grid, repository_metadata ):
+ if repository_metadata.repository.deprecated:
+ return 'yes'
+ return ''
class MaliciousColumn( grids.BooleanColumn ):
def get_value( self, trans, grid, repository_metadata ):
- return repository_metadata.malicious
+ if repository_metadata.malicious:
+ return 'yes'
+ return ''
# Grid definition
title = "Repository Metadata"
model_class = model.RepositoryMetadata
@@ -393,16 +415,14 @@
model_class=model.Repository,
link=( lambda item: dict( operation="view_or_manage_repository_revision", id=item.id ) ),
attach_popup=True ),
- RevisionColumn( "Revision",
- attach_popup=False ),
- ToolsColumn( "Tools",
- attach_popup=False ),
- DatatypesColumn( "Datatypes",
- attach_popup=False ),
- WorkflowsColumn( "Workflows",
- attach_popup=False ),
- MaliciousColumn( "Malicious",
- attach_popup=False )
+ OwnerColumn( "Owner", attach_popup=False ),
+ RevisionColumn( "Revision", attach_popup=False ),
+ ToolsColumn( "Tools", attach_popup=False ),
+ DatatypesColumn( "Datatypes", attach_popup=False ),
+ WorkflowsColumn( "Workflows", attach_popup=False ),
+ DeletedColumn( "Deleted", attach_popup=False ),
+ DeprecatedColumn( "Deprecated", attach_popup=False ),
+ MaliciousColumn( "Malicious", attach_popup=False )
]
operations = [ grids.GridOperation( "Delete",
allow_multiple=False,
@@ -416,8 +436,7 @@
use_paging = True
def build_initial_query( self, trans, **kwd ):
return trans.sa_session.query( model.RepositoryMetadata ) \
- .join( model.Repository.table ) \
- .filter( model.Repository.table.c.deprecated == False )
+ .join( model.Repository.table )
class AdminController( BaseUIController, Admin ):
diff -r ba74363c65bee1e90a4ae6346cbd7f2d3ff9a6b3 -r 10342b1b167a6f7290b8f3dc1d00c2d2b77e046f lib/galaxy/webapps/community/controllers/repository.py
--- a/lib/galaxy/webapps/community/controllers/repository.py
+++ b/lib/galaxy/webapps/community/controllers/repository.py
@@ -39,7 +39,8 @@
if category.repositories:
viewable_repositories = 0
for rca in category.repositories:
- viewable_repositories += 1
+ if not rca.repository.deleted and not rca.repository.deprecated:
+ viewable_repositories += 1
return viewable_repositories
return 0
title = "Categories"
@@ -191,8 +192,6 @@
link=( lambda item: dict( operation="repositories_by_user", id=item.id ) ),
attach_popup=False,
key="User.username" ),
- grids.CommunityRatingColumn( "Average Rating", key="rating" ),
- EmailAlertsColumn( "Alert", attach_popup=False ),
# Columns that are valid for filtering but are not visible.
EmailColumn( "Email",
model_class=model.User,
@@ -201,11 +200,7 @@
RepositoryCategoryColumn( "Category",
model_class=model.Category,
key="Category.name",
- visible=False ),
- grids.DeletedColumn( "Deleted",
- key="deleted",
- visible=False,
- filterable="advanced" )
+ visible=False )
]
columns.append( grids.MulticolFilterColumn( "Search repository name, description",
cols_to_filter=[ columns[0], columns[1] ],
@@ -222,7 +217,9 @@
preserve_state = False
use_paging = True
def build_initial_query( self, trans, **kwd ):
- return trans.sa_session.query( self.model_class ) \
+ return trans.sa_session.query( model.Repository ) \
+ .filter( and_( model.Repository.table.c.deleted == False,
+ model.Repository.table.c.deprecated == False ) ) \
.join( model.User.table ) \
.outerjoin( model.RepositoryCategoryAssociation.table ) \
.outerjoin( model.Category.table )
@@ -257,7 +254,8 @@
async_compatible=False ) ]
def build_initial_query( self, trans, **kwd ):
return trans.sa_session.query( model.Repository ) \
- .filter( model.Repository.table.c.user_id == trans.user.id ) \
+ .filter( and_( model.Repository.table.c.deleted == False,
+ model.Repository.table.c.user_id == trans.user.id ) ) \
.join( model.User.table ) \
.outerjoin( model.RepositoryCategoryAssociation.table ) \
.outerjoin( model.Category.table )
@@ -283,7 +281,8 @@
filterable="standard" ) )
def build_initial_query( self, trans, **kwd ):
return trans.sa_session.query( model.Repository ) \
- .filter( and_( model.Repository.table.c.user_id == trans.user.id,
+ .filter( and_( model.Repository.table.c.deleted == False,
+ model.Repository.table.c.user_id == trans.user.id,
model.Repository.table.c.deprecated == True ) ) \
.join( model.User.table ) \
.outerjoin( model.RepositoryCategoryAssociation.table ) \
@@ -341,11 +340,7 @@
RepositoryGrid.RepositoryCategoryColumn( "Category",
model_class=model.Category,
key="Category.name",
- visible=False ),
- grids.DeletedColumn( "Deleted",
- key="deleted",
- visible=False,
- filterable="advanced" )
+ visible=False )
]
columns.append( grids.MulticolFilterColumn( "Search repository name",
cols_to_filter=[ columns[0] ],
@@ -375,7 +370,8 @@
.outerjoin( model.RepositoryCategoryAssociation.table ) \
.outerjoin( model.Category.table )
# Return an empty query.
- return []
+ return trans.sa_session.query( model.Repository ) \
+ .filter( model.Repository.table.c.id < 0 )
class ValidRepositoryGrid( RepositoryGrid ):
# This grid filters out repositories that have been marked as deprecated.
@@ -435,7 +431,8 @@
if 'id' in kwd:
# The user is browsing categories of valid repositories, so filter the request by the received id, which is a category id.
return trans.sa_session.query( model.Repository ) \
- .filter( model.Repository.table.c.deprecated == False ) \
+ .filter( and_( model.Repository.table.c.deleted == False,
+ model.Repository.table.c.deprecated == False ) ) \
.join( model.RepositoryMetadata.table ) \
.join( model.User.table ) \
.join( model.RepositoryCategoryAssociation.table ) \
@@ -444,7 +441,8 @@
model.RepositoryMetadata.table.c.downloadable == True ) )
# The user performed a free text search on the ValidCategoryGrid.
return trans.sa_session.query( model.Repository ) \
- .filter( model.Repository.table.c.deprecated == False ) \
+ .filter( and_( model.Repository.table.c.deleted == False,
+ model.Repository.table.c.deprecated == False ) ) \
.join( model.RepositoryMetadata.table ) \
.join( model.User.table ) \
.outerjoin( model.RepositoryCategoryAssociation.table ) \
@@ -501,7 +499,8 @@
changeset_revision ) )
return trans.sa_session.query( model.RepositoryMetadata ) \
.join( model.Repository ) \
- .filter( model.Repository.table.c.deprecated == False ) \
+ .filter( and_( model.Repository.table.c.deleted == False,
+ model.Repository.table.c.deprecated == False ) ) \
.join( model.User.table ) \
.filter( or_( *clause_list ) ) \
.order_by( model.Repository.name )
@@ -556,7 +555,8 @@
# RepositoryGrid on the grid.mako template for the CategoryGrid. See ~/templates/webapps/community/category/grid.mako. Since we
# are searching repositories and not categories, redirect to browse_repositories().
if 'id' in kwd and 'f-free-text-search' in kwd and kwd[ 'id' ] == kwd[ 'f-free-text-search' ]:
- # The value of 'id' has been set to the search string, which is a repository name. We'll try to get the desired encoded repository id to pass on.
+ # The value of 'id' has been set to the search string, which is a repository name. We'll try to get the desired encoded repository
+ # id to pass on.
try:
repository = get_repository_by_name( trans, kwd[ 'id' ] )
kwd[ 'id' ] = trans.security.encode_id( repository.id )
diff -r ba74363c65bee1e90a4ae6346cbd7f2d3ff9a6b3 -r 10342b1b167a6f7290b8f3dc1d00c2d2b77e046f lib/galaxy/webapps/community/model/mapping.py
--- a/lib/galaxy/webapps/community/model/mapping.py
+++ b/lib/galaxy/webapps/community/model/mapping.py
@@ -222,7 +222,7 @@
properties=dict( repositories=relation( RepositoryCategoryAssociation,
secondary=Repository.table,
primaryjoin=( Category.table.c.id == RepositoryCategoryAssociation.table.c.category_id ),
- secondaryjoin=( ( RepositoryCategoryAssociation.table.c.repository_id == Repository.table.c.id ) & ( Repository.table.c.deprecated == False ) ) ) ) )
+ secondaryjoin=( RepositoryCategoryAssociation.table.c.repository_id == Repository.table.c.id ) ) ) )
assign_mapper( context, Repository, Repository.table,
properties = dict(
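
Note: taken together, the hunks above move the deleted/deprecated visibility handling out of the ORM mapping and into the grid code, so the category grid now counts viewable repositories itself. A minimal standalone sketch of that counting rule follows; the Fake* classes are stand-ins for Galaxy's mapped Repository and RepositoryCategoryAssociation objects, not Galaxy API.

# Standalone sketch of the CategoryGrid counting rule shown in the
# repository.py hunk above; the classes below are stand-ins, not Galaxy models.
class FakeRepository( object ):
    def __init__( self, deleted=False, deprecated=False ):
        self.deleted = deleted
        self.deprecated = deprecated

class FakeRCA( object ):
    # Mimics a RepositoryCategoryAssociation: it just carries a .repository.
    def __init__( self, repository ):
        self.repository = repository

def count_viewable_repositories( associations ):
    # Deleted and deprecated repositories are no longer filtered by the mapper,
    # so they have to be skipped here.
    viewable = 0
    for rca in associations:
        if not rca.repository.deleted and not rca.repository.deprecated:
            viewable += 1
    return viewable

rcas = [ FakeRCA( FakeRepository() ),
         FakeRCA( FakeRepository( deleted=True ) ),
         FakeRCA( FakeRepository( deprecated=True ) ) ]
print( count_viewable_repositories( rcas ) )  # -> 1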
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: greg: Fix for the Category.repositories mapper.
by Bitbucket 24 Oct '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/f56ec1622b27/
changeset: f56ec1622b27
user: greg
date: 2012-10-24 16:24:11
summary: Fix for the Category.repositories mapper.
affected #: 1 file
diff -r 25ff502e7faa7e6030545a3035b5e98ee584ab5a -r f56ec1622b278da8d759805e70d3a7ac61596328 lib/galaxy/webapps/community/model/mapping.py
--- a/lib/galaxy/webapps/community/model/mapping.py
+++ b/lib/galaxy/webapps/community/model/mapping.py
@@ -222,7 +222,7 @@
properties=dict( repositories=relation( RepositoryCategoryAssociation,
secondary=Repository.table,
primaryjoin=( Category.table.c.id == RepositoryCategoryAssociation.table.c.category_id ),
- secondaryjoin=( ( RepositoryCategoryAssociation.table.c.repository_id == Repository.table.c.id ) & ( Repository.table.c.deprecated == False ) ) ) ) )
+ secondaryjoin=( RepositoryCategoryAssociation.table.c.repository_id == Repository.table.c.id ) ) ) )
assign_mapper( context, Repository, Repository.table,
properties = dict(
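
Note: for readers less familiar with the classical assign_mapper() style used here, the corrected relation expressed in declarative SQLAlchemy would look roughly like the sketch below. The point is only that the category-to-repository relation now returns every association, and any deprecation or deletion filtering happens in the callers. Class and column names are simplified stand-ins, not Galaxy's actual models, and the imports assume a SQLAlchemy of roughly this era.

# Standalone declarative sketch (SQLAlchemy assumed installed); simplified
# stand-ins for Repository, RepositoryCategoryAssociation and Category.
from sqlalchemy import Boolean, Column, ForeignKey, Integer, String
from sqlalchemy.orm import relationship
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class Repository( Base ):
    __tablename__ = 'repository'
    id = Column( Integer, primary_key=True )
    name = Column( String( 255 ) )
    deleted = Column( Boolean, default=False )
    deprecated = Column( Boolean, default=False )

class RepositoryCategoryAssociation( Base ):
    __tablename__ = 'repository_category_association'
    id = Column( Integer, primary_key=True )
    repository_id = Column( Integer, ForeignKey( 'repository.id' ) )
    category_id = Column( Integer, ForeignKey( 'category.id' ) )
    repository = relationship( Repository )

class Category( Base ):
    __tablename__ = 'category'
    id = Column( Integer, primary_key=True )
    name = Column( String( 255 ) )
    # Like the corrected mapper above: no deprecated condition is baked into the
    # join, so the relation yields every association and callers filter as needed.
    repositories = relationship( RepositoryCategoryAssociation, backref='category' )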
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: greg: Grid cleanup in the tool shed repository controller.
by Bitbucket 24 Oct '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/25ff502e7faa/
changeset: 25ff502e7faa
user: greg
date: 2012-10-24 16:23:08
summary: Grid cleanup in the tool shed repository controller.
affected #: 1 file
diff -r 2350e6e0ca9c6ad885de9fac9d196dc07c45d074 -r 25ff502e7faa7e6030545a3035b5e98ee584ab5a lib/galaxy/webapps/community/controllers/repository.py
--- a/lib/galaxy/webapps/community/controllers/repository.py
+++ b/lib/galaxy/webapps/community/controllers/repository.py
@@ -39,7 +39,8 @@
if category.repositories:
viewable_repositories = 0
for rca in category.repositories:
- viewable_repositories += 1
+ if not rca.repository.deleted and not rca.repository.deprecated:
+ viewable_repositories += 1
return viewable_repositories
return 0
title = "Categories"
@@ -191,8 +192,6 @@
link=( lambda item: dict( operation="repositories_by_user", id=item.id ) ),
attach_popup=False,
key="User.username" ),
- grids.CommunityRatingColumn( "Average Rating", key="rating" ),
- EmailAlertsColumn( "Alert", attach_popup=False ),
# Columns that are valid for filtering but are not visible.
EmailColumn( "Email",
model_class=model.User,
@@ -201,11 +200,7 @@
RepositoryCategoryColumn( "Category",
model_class=model.Category,
key="Category.name",
- visible=False ),
- grids.DeletedColumn( "Deleted",
- key="deleted",
- visible=False,
- filterable="advanced" )
+ visible=False )
]
columns.append( grids.MulticolFilterColumn( "Search repository name, description",
cols_to_filter=[ columns[0], columns[1] ],
@@ -222,7 +217,9 @@
preserve_state = False
use_paging = True
def build_initial_query( self, trans, **kwd ):
- return trans.sa_session.query( self.model_class ) \
+ return trans.sa_session.query( model.Repository ) \
+ .filter( and_( model.Repository.table.c.deleted == False,
+ model.Repository.table.c.deprecated == False ) ) \
.join( model.User.table ) \
.outerjoin( model.RepositoryCategoryAssociation.table ) \
.outerjoin( model.Category.table )
@@ -257,7 +254,8 @@
async_compatible=False ) ]
def build_initial_query( self, trans, **kwd ):
return trans.sa_session.query( model.Repository ) \
- .filter( model.Repository.table.c.user_id == trans.user.id ) \
+ .filter( and_( model.Repository.table.c.deleted == False,
+ model.Repository.table.c.user_id == trans.user.id ) ) \
.join( model.User.table ) \
.outerjoin( model.RepositoryCategoryAssociation.table ) \
.outerjoin( model.Category.table )
@@ -283,7 +281,8 @@
filterable="standard" ) )
def build_initial_query( self, trans, **kwd ):
return trans.sa_session.query( model.Repository ) \
- .filter( and_( model.Repository.table.c.user_id == trans.user.id,
+ .filter( and_( model.Repository.table.c.deleted == False,
+ model.Repository.table.c.user_id == trans.user.id,
model.Repository.table.c.deprecated == True ) ) \
.join( model.User.table ) \
.outerjoin( model.RepositoryCategoryAssociation.table ) \
@@ -341,11 +340,7 @@
RepositoryGrid.RepositoryCategoryColumn( "Category",
model_class=model.Category,
key="Category.name",
- visible=False ),
- grids.DeletedColumn( "Deleted",
- key="deleted",
- visible=False,
- filterable="advanced" )
+ visible=False )
]
columns.append( grids.MulticolFilterColumn( "Search repository name",
cols_to_filter=[ columns[0] ],
@@ -375,7 +370,8 @@
.outerjoin( model.RepositoryCategoryAssociation.table ) \
.outerjoin( model.Category.table )
# Return an empty query.
- return []
+ return trans.sa_session.query( model.Repository ) \
+ .filter( model.Repository.table.c.id < 0 )
class ValidRepositoryGrid( RepositoryGrid ):
# This grid filters out repositories that have been marked as deprecated.
@@ -435,7 +431,8 @@
if 'id' in kwd:
# The user is browsing categories of valid repositories, so filter the request by the received id, which is a category id.
return trans.sa_session.query( model.Repository ) \
- .filter( model.Repository.table.c.deprecated == False ) \
+ .filter( and_( model.Repository.table.c.deleted == False,
+ model.Repository.table.c.deprecated == False ) ) \
.join( model.RepositoryMetadata.table ) \
.join( model.User.table ) \
.join( model.RepositoryCategoryAssociation.table ) \
@@ -444,7 +441,8 @@
model.RepositoryMetadata.table.c.downloadable == True ) )
# The user performed a free text search on the ValidCategoryGrid.
return trans.sa_session.query( model.Repository ) \
- .filter( model.Repository.table.c.deprecated == False ) \
+ .filter( and_( model.Repository.table.c.deleted == False,
+ model.Repository.table.c.deprecated == False ) ) \
.join( model.RepositoryMetadata.table ) \
.join( model.User.table ) \
.outerjoin( model.RepositoryCategoryAssociation.table ) \
@@ -501,7 +499,8 @@
changeset_revision ) )
return trans.sa_session.query( model.RepositoryMetadata ) \
.join( model.Repository ) \
- .filter( model.Repository.table.c.deprecated == False ) \
+ .filter( and_( model.Repository.table.c.deleted == False,
+ model.Repository.table.c.deprecated == False ) ) \
.join( model.User.table ) \
.filter( or_( *clause_list ) ) \
.order_by( model.Repository.name )
@@ -556,7 +555,8 @@
# RepositoryGrid on the grid.mako template for the CategoryGrid. See ~/templates/webapps/community/category/grid.mako. Since we
# are searching repositories and not categories, redirect to browse_repositories().
if 'id' in kwd and 'f-free-text-search' in kwd and kwd[ 'id' ] == kwd[ 'f-free-text-search' ]:
- # The value of 'id' has been set to the search string, which is a repository name. We'll try to get the desired encoded repository id to pass on.
+ # The value of 'id' has been set to the search string, which is a repository name. We'll try to get the desired encoded repository
+ # id to pass on.
try:
repository = get_repository_by_name( trans, kwd[ 'id' ] )
kwd[ 'id' ] = trans.security.encode_id( repository.id )
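
Note: two small patterns in this hunk are worth calling out. The deleted and deprecated conditions are combined with and_() in build_initial_query(), and the former `return []` is replaced with a query that can never match, presumably so the grid framework can still chain .order_by(), .limit() and .count() on the result. A standalone sketch under those assumptions follows; the toy Repository model and the SQLite session stand in for Galaxy's model and trans.sa_session.

# Standalone sketch (SQLAlchemy assumed installed); a toy Repository model
# stands in for Galaxy's, just to show the two query patterns used above.
from sqlalchemy import Boolean, Column, Integer, String, and_, create_engine
from sqlalchemy.orm import sessionmaker
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class Repository( Base ):
    __tablename__ = 'repository'
    id = Column( Integer, primary_key=True )
    name = Column( String( 255 ) )
    deleted = Column( Boolean, default=False )
    deprecated = Column( Boolean, default=False )

engine = create_engine( 'sqlite://' )
Base.metadata.create_all( engine )
sa_session = sessionmaker( bind=engine )()
sa_session.add_all( [ Repository( name='ok' ),
                      Repository( name='gone', deleted=True ),
                      Repository( name='old', deprecated=True ) ] )
sa_session.commit()

def build_initial_query( sa_session, show_nothing=False ):
    if show_nothing:
        # Never matches, but unlike a bare [] it still behaves like a query.
        return sa_session.query( Repository ).filter( Repository.id < 0 )
    return sa_session.query( Repository ) \
                     .filter( and_( Repository.deleted == False,
                                    Repository.deprecated == False ) )

print( build_initial_query( sa_session ).count() )                     # -> 1
print( build_initial_query( sa_session, show_nothing=True ).count() )  # -> 0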
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: greg: Grid cleanup in the tool shed admin controller.
by Bitbucket 24 Oct '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/2350e6e0ca9c/
changeset: 2350e6e0ca9c
user: greg
date: 2012-10-24 16:21:45
summary: Grid cleanup in the tool shed admin controller.
affected #: 1 file
diff -r 25a93da96e03431557e1418fd4cf70ab7b6ec427 -r 2350e6e0ca9c6ad885de9fac9d196dc07c45d074 lib/galaxy/webapps/community/controllers/admin.py
--- a/lib/galaxy/webapps/community/controllers/admin.py
+++ b/lib/galaxy/webapps/community/controllers/admin.py
@@ -292,28 +292,25 @@
]
class AdminRepositoryGrid( RepositoryGrid ):
+ class DeletedColumn( grids.BooleanColumn ):
+ def get_value( self, trans, grid, repository ):
+ if repository.deleted:
+ return 'yes'
+ return ''
columns = [ RepositoryGrid.NameColumn( "Name",
key="name",
link=( lambda item: dict( operation="view_or_manage_repository", id=item.id ) ),
attach_popup=True ),
- RepositoryGrid.DescriptionColumn( "Synopsis",
- key="description",
- attach_popup=False ),
- RepositoryGrid.MetadataRevisionColumn( "Metadata Revisions" ),
RepositoryGrid.UserColumn( "Owner",
model_class=model.User,
link=( lambda item: dict( operation="repositories_by_user", id=item.id ) ),
attach_popup=False,
key="User.username" ),
- RepositoryGrid.EmailAlertsColumn( "Alert", attach_popup=False ),
- RepositoryGrid.DeprecatedColumn( "Deprecated", attach_popup=False ),
+ RepositoryGrid.DeprecatedColumn( "Deprecated", key="deprecated", attach_popup=False ),
# Columns that are valid for filtering but are not visible.
- grids.DeletedColumn( "Deleted",
- key="deleted",
- visible=False,
- filterable="advanced" ) ]
- columns.append( grids.MulticolFilterColumn( "Search repository name, description",
- cols_to_filter=[ columns[0], columns[1] ],
+ DeletedColumn( "Deleted", key="deleted", attach_popup=False ) ]
+ columns.append( grids.MulticolFilterColumn( "Search repository name",
+ cols_to_filter=[ columns[0] ],
key="free-text-search",
visible=False,
filterable="standard" ) )
@@ -327,6 +324,10 @@
condition=( lambda item: item.deleted ),
async_compatible=False ) )
standard_filters = []
+ default_filter = {}
+ def build_initial_query( self, trans, **kwd ):
+ return trans.sa_session.query( model.Repository ) \
+ .join( model.User.table )
class RepositoryMetadataGrid( grids.Grid ):
class IdColumn( grids.IntegerColumn ):
@@ -335,6 +336,9 @@
class NameColumn( grids.TextColumn ):
def get_value( self, trans, grid, repository_metadata ):
return repository_metadata.repository.name
+ class OwnerColumn( grids.TextColumn ):
+ def get_value( self, trans, grid, repository_metadata ):
+ return repository_metadata.repository.user.username
class RevisionColumn( grids.TextColumn ):
def get_value( self, trans, grid, repository_metadata ):
repository = repository_metadata.repository
@@ -343,42 +347,60 @@
return "%s:%s" % ( str( ctx.rev() ), repository_metadata.changeset_revision )
class ToolsColumn( grids.TextColumn ):
def get_value( self, trans, grid, repository_metadata ):
- tools_str = ''
+ tools_str = '0'
if repository_metadata:
metadata = repository_metadata.metadata
if metadata:
if 'tools' in metadata:
- for tool_metadata_dict in metadata[ 'tools' ]:
- tools_str += '%s <b>%s</b><br/>' % ( tool_metadata_dict[ 'id' ], tool_metadata_dict[ 'version' ] )
+ # We used to display the following, but grid was too cluttered.
+ #for tool_metadata_dict in metadata[ 'tools' ]:
+ # tools_str += '%s <b>%s</b><br/>' % ( tool_metadata_dict[ 'id' ], tool_metadata_dict[ 'version' ] )
+ return '%d' % len( metadata[ 'tools' ] )
return tools_str
class DatatypesColumn( grids.TextColumn ):
def get_value( self, trans, grid, repository_metadata ):
- datatypes_str = ''
+ datatypes_str = '0'
if repository_metadata:
metadata = repository_metadata.metadata
if metadata:
if 'datatypes' in metadata:
- for datatype_metadata_dict in metadata[ 'datatypes' ]:
- datatypes_str += '%s<br/>' % datatype_metadata_dict[ 'extension' ]
+ # We used to display the following, but grid was too cluttered.
+ #for datatype_metadata_dict in metadata[ 'datatypes' ]:
+ # datatypes_str += '%s<br/>' % datatype_metadata_dict[ 'extension' ]
+ return '%d' % len( metadata[ 'datatypes' ] )
return datatypes_str
class WorkflowsColumn( grids.TextColumn ):
def get_value( self, trans, grid, repository_metadata ):
- workflows_str = ''
+ workflows_str = '0'
if repository_metadata:
metadata = repository_metadata.metadata
if metadata:
if 'workflows' in metadata:
- workflows_str += '<b>Workflows:</b><br/>'
+ # We used to display the following, but grid was too cluttered.
+ #workflows_str += '<b>Workflows:</b><br/>'
# metadata[ 'workflows' ] is a list of tuples where each contained tuple is
# [ <relative path to the .ga file in the repository>, <exported workflow dict> ]
- workflow_tups = metadata[ 'workflows' ]
- workflow_metadata_dicts = [ workflow_tup[1] for workflow_tup in workflow_tups ]
- for workflow_metadata_dict in workflow_metadata_dicts:
- workflows_str += '%s<br/>' % workflow_metadata_dict[ 'name' ]
+ #workflow_tups = metadata[ 'workflows' ]
+ #workflow_metadata_dicts = [ workflow_tup[1] for workflow_tup in workflow_tups ]
+ #for workflow_metadata_dict in workflow_metadata_dicts:
+ # workflows_str += '%s<br/>' % workflow_metadata_dict[ 'name' ]
+ return '%d' % len( metadata[ 'workflows' ] )
return workflows_str
+ class DeletedColumn( grids.BooleanColumn ):
+ def get_value( self, trans, grid, repository_metadata ):
+ if repository_metadata.repository.deleted:
+ return 'yes'
+ return ''
+ class DeprecatedColumn( grids.BooleanColumn ):
+ def get_value( self, trans, grid, repository_metadata ):
+ if repository_metadata.repository.deprecated:
+ return 'yes'
+ return ''
class MaliciousColumn( grids.BooleanColumn ):
def get_value( self, trans, grid, repository_metadata ):
- return repository_metadata.malicious
+ if repository_metadata.malicious:
+ return 'yes'
+ return ''
# Grid definition
title = "Repository Metadata"
model_class = model.RepositoryMetadata
@@ -393,16 +415,14 @@
model_class=model.Repository,
link=( lambda item: dict( operation="view_or_manage_repository_revision", id=item.id ) ),
attach_popup=True ),
- RevisionColumn( "Revision",
- attach_popup=False ),
- ToolsColumn( "Tools",
- attach_popup=False ),
- DatatypesColumn( "Datatypes",
- attach_popup=False ),
- WorkflowsColumn( "Workflows",
- attach_popup=False ),
- MaliciousColumn( "Malicious",
- attach_popup=False )
+ OwnerColumn( "Owner", attach_popup=False ),
+ RevisionColumn( "Revision", attach_popup=False ),
+ ToolsColumn( "Tools", attach_popup=False ),
+ DatatypesColumn( "Datatypes", attach_popup=False ),
+ WorkflowsColumn( "Workflows", attach_popup=False ),
+ DeletedColumn( "Deleted", attach_popup=False ),
+ DeprecatedColumn( "Deprecated", attach_popup=False ),
+ MaliciousColumn( "Malicious", attach_popup=False )
]
operations = [ grids.GridOperation( "Delete",
allow_multiple=False,
@@ -416,8 +436,7 @@
use_paging = True
def build_initial_query( self, trans, **kwd ):
return trans.sa_session.query( model.RepositoryMetadata ) \
- .join( model.Repository.table ) \
- .filter( model.Repository.table.c.deprecated == False )
+ .join( model.Repository.table )
class AdminController( BaseUIController, Admin ):
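
Note: the new columns above follow two simple rendering conventions: boolean flags render as 'yes' or an empty string, and the tools/datatypes/workflows columns now render a count of the corresponding metadata entries instead of listing them. A minimal standalone sketch of those get_value() bodies, run against a plain metadata dict (no grid framework involved; the 'fastqc' entry is purely illustrative):

# Standalone sketch of the column rendering conventions used above; a plain
# dict replaces RepositoryMetadata.metadata.
def render_count( metadata, key ):
    # ToolsColumn / DatatypesColumn / WorkflowsColumn pattern: show a count.
    if metadata and key in metadata:
        return '%d' % len( metadata[ key ] )
    return '0'

def render_flag( flag ):
    # DeletedColumn / DeprecatedColumn / MaliciousColumn pattern: 'yes' or blank.
    if flag:
        return 'yes'
    return ''

metadata = { 'tools': [ { 'id': 'fastqc', 'version': '0.1' } ],
             'workflows': [ ( 'wf.ga', { 'name': 'example workflow' } ) ] }
print( render_count( metadata, 'tools' ) )      # -> 1
print( render_count( metadata, 'datatypes' ) )  # -> 0
print( render_flag( True ) )                    # -> yes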
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.
commit/galaxy-central: greg: Override the grid's TextColumn.get_single_filter() method for the BooleanColumn so the BooleanColumn can be used in grids.
by Bitbucket 24 Oct '12
1 new commit in galaxy-central:
https://bitbucket.org/galaxy/galaxy-central/changeset/25a93da96e03/
changeset: 25a93da96e03
user: greg
date: 2012-10-24 15:51:17
summary: Override the grid's TextColumn.get_single_filter() method for the BooleanColumn so the BooleanColumn can be used in grids.
affected #: 1 file
diff -r 33db1925e701664785735160f3738dfde97fc2e6 -r 25a93da96e03431557e1418fd4cf70ab7b6ec427 lib/galaxy/web/framework/helpers/grids.py
--- a/lib/galaxy/web/framework/helpers/grids.py
+++ b/lib/galaxy/web/framework/helpers/grids.py
@@ -418,6 +418,13 @@
def sort( self, trans, query, ascending, column_name=None ):
"""Sort query using this column."""
return GridColumn.sort( self, trans, query, ascending, column_name=column_name )
+ def get_single_filter( self, user, a_filter ):
+ if self.key.find( '.' ) > -1:
+ a_key = self.key.split( '.' )[1]
+ else:
+ a_key = self.key
+ model_class_key_field = getattr( self.model_class, a_key )
+ return model_class_key_field == a_filter
class IntegerColumn( TextColumn ):
"""
Repository URL: https://bitbucket.org/galaxy/galaxy-central/
--
This is a commit notification from bitbucket.org. You are receiving
this because you have the service enabled, addressing the recipient of
this email.