details:
http://www.bx.psu.edu/hg//afs/bx.psu.edu/misc/projects/universe/clones/pu...
changeset: 1460:837f8b7698c4
user: James Taylor <james(a)jamestaylor.org>
date: Thu Jul 31 17:55:41 2008 -0400
description:
Fix for preserving values of parameters after a repeat/conditional in
Tool.update_state. This should fix the problem reported by Dan with the
subject "Re: [galaxy-bugs] Galaxy tool error report from cathy(a)bx.psu.edu".
Subject: [hg: /afs/bx.psu.edu/misc/projects/universe/clones/public] 1461: Compensatory bug in conditional parameters that was revealed by previous commit.
details:
http://www.bx.psu.edu/hg//afs/bx.psu.edu/misc/projects/universe/clones/pu...
changeset: 1461:6bbf70be8b03
user: James Taylor <james(a)jamestaylor.org>
date: Thu Jul 31 20:49:06 2008 -0400
description:
Compensatory bug in conditional parameters that was revealed by previous
commit.
Subject: [hg: /afs/bx.psu.edu/misc/projects/universe/clones/public] 1462: Fix for JobWrapper.finish, dataset state no longer flushed prior to other attributes ( e.g., metadata) being flushed.
details:
http://www.bx.psu.edu/hg//afs/bx.psu.edu/misc/projects/universe/clones/pu...
changeset: 1462:777d9bdfd7d3
user: greg
date: Fri Aug 01 14:25:42 2008 -0400
description:
Fix for JobWrapper.finish, dataset state no longer flushed prior to other attributes (
e.g., metadata) being flushed.
Subject: [hg: /afs/bx.psu.edu/misc/projects/universe/clones/public] 1463: Fix for gops metadata param parser when strand column not included in input dataset, added new test for gops intersect. Fix for datatype converters - now use new approach to metadata params.
details:
http://www.bx.psu.edu/hg//afs/bx.psu.edu/misc/projects/universe/clones/pu...
changeset: 1463:66be239241bc
user: greg
date: Fri Aug 01 14:34:16 2008 -0400
description:
Fix for gops metadata param parser when strand column not included in input dataset, added
new test for gops intersect. Fix for datatype converters - now use new approach to
metadata params.
Subject: [hg: /afs/bx.psu.edu/misc/projects/universe/clones/public] 1464: Fix for the creation of workflows from shared histories.
details:
http://www.bx.psu.edu/hg//afs/bx.psu.edu/misc/projects/universe/clones/pu...
changeset: 1464:c7acaa1bb88f
user: dan(a)scofield.bx.psu.edu
date: Fri Aug 01 15:05:54 2008 -0400
description:
Fix for the creation of workflows from shared histories.
Added a column 'copied_from_history_dataset_association' to
history_dataset_association. This requires a db change.
Subject: [hg: /afs/bx.psu.edu/misc/projects/universe/clones/public] 1465: Remove a couple templates that I missed in the rollback.
details:
http://www.bx.psu.edu/hg//afs/bx.psu.edu/misc/projects/universe/clones/pu...
changeset: 1465:3b35cdf1689a
user: Nate Coraor <nate(a)bx.psu.edu>
date: Fri Aug 01 15:26:02 2008 -0400
description:
Remove a couple templates that I missed in the rollback.
Subject: [hg: /afs/bx.psu.edu/misc/projects/universe/clones/public] 1466: Better fix for gops command line parser, now properly handles no strand column.
details:
http://www.bx.psu.edu/hg//afs/bx.psu.edu/misc/projects/universe/clones/pu...
changeset: 1466:91c63a82359a
user: greg
date: Mon Aug 04 10:20:23 2008 -0400
description:
Better fix for gops command line parser, now properly handles no strand column.
diffs (371 lines):
diff -r b301cae30997 -r 91c63a82359a lib/galaxy/datatypes/converters/fastqsolexa_to_qual_converter.xml
--- a/lib/galaxy/datatypes/converters/fastqsolexa_to_qual_converter.xml Wed Jul 30 17:15:40 2008 -0400
+++ b/lib/galaxy/datatypes/converters/fastqsolexa_to_qual_converter.xml Mon Aug 04 10:20:23 2008 -0400
@@ -1,5 +1,5 @@
 <tool id="CONVERTER_fastqsolexa_to_qual_0" name="Convert Fastqsolexa to Qual">
-    <command interpreter="python">fastqsolexa_to_qual_converter.py $input1 $output1 $input1.extension</command>
+    <command interpreter="python">fastqsolexa_to_qual_converter.py $input1 $output1 ${input1.extension}</command>
     <inputs>
         <param format="fastqsolexa" name="input1" type="data" label="Choose Fastqsolexa file"/>
     </inputs>
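The `$input1.extension` to `${input1.extension}` changes in these converter hunks make the variable's extent explicit to the template engine. By analogy (and only as an analogy — Galaxy tool commands use Cheetah, not the standard library), Python's `string.Template` shows the ambiguity that braces remove: without them, substitution stops at the first non-identifier character, so a trailing attribute path is silently treated as literal text.

```python
from string import Template

# Without braces, substitution stops at the ".": only $input1 is
# replaced, and ".extension" survives as literal text.
unbraced = Template("converter.py $input1.extension")
assert unbraced.substitute(input1="reads.fastq") == "converter.py reads.fastq.extension"

# Braces delimit exactly which name is meant.
braced = Template("converter.py ${input1}")
assert braced.substitute(input1="reads.fastq") == "converter.py reads.fastq"
```

In Cheetah the braced form additionally evaluates the full dotted expression, which is why the committed change spells out `${input1.extension}` rather than relying on the parser's guess.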
diff -r b301cae30997 -r 91c63a82359a lib/galaxy/datatypes/converters/interval_to_bed_converter.xml
--- a/lib/galaxy/datatypes/converters/interval_to_bed_converter.xml Wed Jul 30 17:15:40 2008 -0400
+++ b/lib/galaxy/datatypes/converters/interval_to_bed_converter.xml Mon Aug 04 10:20:23 2008 -0400
@@ -1,7 +1,7 @@
 <tool id="CONVERTER_interval_to_bed_0" name="Convert Genomic Intervals To BED">
 <!-- <description>__NOT_USED_CURRENTLY_FOR_CONVERTERS__</description> -->
 <!-- Used on the metadata edit page. -->
-    <command interpreter="python">interval_to_bed_converter.py $output1 $input1 $input1_chromCol $input1_startCol $input1_endCol $input1_strandCol</command>
+    <command interpreter="python">interval_to_bed_converter.py $output1 $input1 ${input1.metadata.chromCol} ${input1.metadata.startCol} ${input1.metadata.endCol} ${input1.metadata.strandCol}</command>
     <inputs>
         <page>
             <param format="interval" name="input1" type="data" label="Choose intervals"/>
diff -r b301cae30997 -r 91c63a82359a lib/galaxy/datatypes/converters/maf_to_interval_converter.xml
--- a/lib/galaxy/datatypes/converters/maf_to_interval_converter.xml Wed Jul 30 17:15:40 2008 -0400
+++ b/lib/galaxy/datatypes/converters/maf_to_interval_converter.xml Mon Aug 04 10:20:23 2008 -0400
@@ -1,6 +1,6 @@
 <tool id="CONVERTER_maf_to_interval_0" name="Convert MAF to Genomic Intervals">
 <!-- <description>__NOT_USED_CURRENTLY_FOR_CONVERTERS__</description> -->
-    <command interpreter="python">maf_to_interval_converter.py $output1 $input1 $input1_dbkey</command>
+    <command interpreter="python">maf_to_interval_converter.py $output1 $input1 ${input1.metadata.dbkey}</command>
     <inputs>
         <page>
             <param format="maf" name="input1" type="data" label="Choose MAF file"/>
diff -r b301cae30997 -r 91c63a82359a lib/galaxy/jobs/__init__.py
--- a/lib/galaxy/jobs/__init__.py Wed Jul 30 17:15:40 2008 -0400
+++ b/lib/galaxy/jobs/__init__.py Mon Aug 04 10:20:23 2008 -0400
@@ -400,11 +400,6 @@
         else:
             job.state = 'ok'
         for dataset_assoc in job.output_datasets:
-            if stderr:
-                dataset_assoc.dataset.dataset.state = model.Dataset.states.ERROR
-            else:
-                dataset_assoc.dataset.dataset.state = model.Dataset.states.OK
-            dataset_assoc.dataset.dataset.flush()
             for dataset in dataset_assoc.dataset.dataset.history_associations: #need to update all associated output hdas, i.e. history was shared with job running
                 dataset.blurb = 'done'
                 dataset.peek = 'no peek'
@@ -424,6 +419,11 @@
                 else:
                     dataset.blurb = "empty"
                 dataset.flush()
+            if stderr:
+                dataset_assoc.dataset.dataset.state = model.Dataset.states.ERROR
+            else:
+                dataset_assoc.dataset.dataset.state = model.Dataset.states.OK
+            dataset_assoc.dataset.dataset.flush()
             # Save stdout and stderr
             if len( stdout ) > 32768:
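The relocation above is the whole of the changeset 1462 fix: the dataset's terminal state must be flushed after the per-HDA attributes, otherwise a client polling on state can observe a finished dataset whose blurb, peek, or metadata are still stale. A minimal sketch of the corrected ordering — the `Dataset` and `HDA` classes here are hypothetical mocks invented for illustration, not Galaxy's models:

```python
# Records the order in which objects are flushed, so the ordering
# property can be checked.
flush_order = []

class Dataset:
    def __init__(self):
        self.state = 'running'
    def flush(self):
        flush_order.append(('dataset', self.state))

class HDA:
    def __init__(self, dataset):
        self.dataset = dataset
        self.blurb = None
    def flush(self):
        flush_order.append(('hda', self.blurb))

def finish(hda, stderr):
    # Per-HDA attributes (blurb, peek, metadata, ...) are flushed first...
    hda.blurb = 'done'
    hda.flush()
    # ...and only then is the terminal state set and flushed, so anyone
    # who sees state == 'ok' also sees the finished attributes.
    hda.dataset.state = 'error' if stderr else 'ok'
    hda.dataset.flush()

finish(HDA(Dataset()), stderr='')
```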
diff -r b301cae30997 -r 91c63a82359a lib/galaxy/model/__init__.py
--- a/lib/galaxy/model/__init__.py Wed Jul 30 17:15:40 2008 -0400
+++ b/lib/galaxy/model/__init__.py Mon Aug 04 10:20:23 2008 -0400
@@ -104,7 +104,7 @@
 class HistoryDatasetAssociation( object ):
     def __init__( self, id=None, hid=None, name=None, info=None, blurb=None, peek=None, extension=None,
                   dbkey=None, metadata=None, history=None, dataset=None, deleted=False, designation=None,
-                  parent_id=None, validation_errors=None, visible=True, create_dataset = False ):
+                  parent_id=None, copied_from_history_dataset_association = None, validation_errors=None, visible=True, create_dataset = False ):
         self.name = name or "Unnamed dataset"
         self.id = id
         self.hid = hid
@@ -125,6 +125,7 @@
         self.dataset = dataset
         self.parent_id = parent_id
         self.validation_errors = validation_errors
+        self.copied_from_history_dataset_association = copied_from_history_dataset_association
 
     @property
     def ext( self ):
@@ -252,7 +253,7 @@
         return self.datatype.get_converter_types( self, datatypes_registry)
 
     def copy( self, copy_children = False, parent_id = None ):
-        des = HistoryDatasetAssociation( hid=self.hid, name=self.name, info=self.info, blurb=self.blurb, peek=self.peek, extension=self.extension, dbkey=self.dbkey, metadata=self._metadata, dataset = self.dataset, visible=self.visible, deleted=self.deleted, parent_id=parent_id )
+        des = HistoryDatasetAssociation( hid=self.hid, name=self.name, info=self.info, blurb=self.blurb, peek=self.peek, extension=self.extension, dbkey=self.dbkey, metadata=self._metadata, dataset = self.dataset, visible=self.visible, deleted=self.deleted, parent_id=parent_id, copied_from_history_dataset_association = self )
        des.flush()
        if copy_children:
            for child in self.children:
diff -r b301cae30997 -r 91c63a82359a lib/galaxy/model/mapping.py
--- a/lib/galaxy/model/mapping.py Wed Jul 30 17:15:40 2008 -0400
+++ b/lib/galaxy/model/mapping.py Mon Aug 04 10:20:23 2008 -0400
@@ -72,6 +72,7 @@
     Column( "dataset_id", Integer, ForeignKey( "dataset.id" ), index=True ),
     Column( "create_time", DateTime, default=now ),
     Column( "update_time", DateTime, default=now, onupdate=now ),
+    Column( "copied_from_history_dataset_association_id", Integer, ForeignKey( "history_dataset_association.id" ), nullable=True ),
     Column( "hid", Integer ),
     Column( "name", TrimmedString( 255 ) ),
     Column( "info", TrimmedString( 255 ) ),
@@ -250,13 +251,17 @@
         history=relation(
             History,
             primaryjoin=( History.table.c.id == HistoryDatasetAssociation.table.c.history_id ) ),
+        copied_to_history_dataset_associations=relation(
+            HistoryDatasetAssociation,
+            primaryjoin=( HistoryDatasetAssociation.table.c.copied_from_history_dataset_association_id == HistoryDatasetAssociation.table.c.id ),
+            backref=backref( "copied_from_history_dataset_association", primaryjoin=( HistoryDatasetAssociation.table.c.copied_from_history_dataset_association_id == HistoryDatasetAssociation.table.c.id ), remote_side=[HistoryDatasetAssociation.table.c.id] ) ),
         implicitly_converted_datasets=relation(
             ImplicitlyConvertedDatasetAssociation,
             primaryjoin=( ImplicitlyConvertedDatasetAssociation.table.c.hda_parent_id == HistoryDatasetAssociation.table.c.id ) ),
         children=relation(
             HistoryDatasetAssociation,
             primaryjoin=( HistoryDatasetAssociation.table.c.parent_id == HistoryDatasetAssociation.table.c.id ),
-            backref=backref( "parent", remote_side=[HistoryDatasetAssociation.table.c.id] ) )
+            backref=backref( "parent", primaryjoin=( HistoryDatasetAssociation.table.c.parent_id == HistoryDatasetAssociation.table.c.id ), remote_side=[HistoryDatasetAssociation.table.c.id] ) )
         ) )
 assign_mapper( context, Dataset, Dataset.table,
diff -r b301cae30997 -r 91c63a82359a lib/galaxy/tools/__init__.py
--- a/lib/galaxy/tools/__init__.py Wed Jul 30 17:15:40 2008 -0400
+++ b/lib/galaxy/tools/__init__.py Mon Aug 04 10:20:23 2008 -0400
@@ -731,7 +731,7 @@
                 for i, rep_state in enumerate( group_state ):
                     rep_index = rep_state['__index__']
                     max_index = max( max_index, rep_index )
-                    prefix = "%s_%d|" % ( key, rep_index )
+                    rep_prefix = "%s_%d|" % ( key, rep_index )
                     if group_old_errors:
                         rep_old_errors = group_old_errors[i]
                     else:
@@ -740,7 +740,7 @@
                         input.inputs,
                         rep_state,
                         incoming,
-                        prefix=prefix,
+                        prefix=rep_prefix,
                         context=context,
                         update_only=update_only,
                         old_errors=rep_old_errors,
@@ -765,9 +765,9 @@
                 group_state = state[input.name]
                 group_old_errors = old_errors.get( input.name, {} )
                 old_current_case = group_state['__current_case__']
-                prefix = "%s|" % ( key )
+                group_prefix = "%s|" % ( key )
                 # Deal with the 'test' element and see if it's value changed
-                test_param_key = prefix + input.test_param.name
+                test_param_key = group_prefix + input.test_param.name
                 test_param_error = None
                 test_incoming = incoming.get( test_param_key, None )
                 if test_param_key not in incoming \
@@ -797,7 +797,7 @@
                     input.cases[current_case].inputs,
                     group_state,
                     incoming,
-                    prefix=prefix,
+                    prefix=group_prefix,
                     context=context,
                     update_only=update_only,
                     old_errors=group_old_errors,
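The renames in this hunk (`prefix` to `rep_prefix` and `group_prefix`) address the problem behind changesets 1460 and 1461: rebinding the shared `prefix` variable inside a repeat or conditional clobbers the prefix used to build keys for parameters that follow the group. A simplified, invented illustration of the failure mode — the `build_keys_*` functions are not Galaxy code, just a sketch of the variable-shadowing pattern:

```python
def build_keys_buggy(names, repeats, prefix=""):
    """Rebinds the enclosing `prefix` inside the loop (the bug)."""
    keys = []
    for name in names:
        if name in repeats:
            for i in range(repeats[name]):
                # BUG: this clobbers `prefix` for every later parameter.
                prefix = "%s%s_%d|" % (prefix, name, i)
                keys.append(prefix + "inner")
        else:
            keys.append(prefix + name)
    return keys

def build_keys_fixed(names, repeats, prefix=""):
    """Uses a distinct local name, as the hunk above does."""
    keys = []
    for name in names:
        if name in repeats:
            for i in range(repeats[name]):
                rep_prefix = "%s%s_%d|" % (prefix, name, i)
                keys.append(rep_prefix + "inner")
        else:
            keys.append(prefix + name)
    return keys
```

With a repeat of two followed by a plain parameter, the buggy version compounds prefixes (`rep_0|rep_1|after`) while the fixed version leaves the trailing parameter's key (`after`) untouched.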
diff -r b301cae30997 -r 91c63a82359a lib/galaxy/tools/util/galaxyops/__init__.py
--- a/lib/galaxy/tools/util/galaxyops/__init__.py Wed Jul 30 17:15:40 2008 -0400
+++ b/lib/galaxy/tools/util/galaxyops/__init__.py Mon Aug 04 10:20:23 2008 -0400
@@ -13,13 +13,18 @@
     print >> sys.stderr, msg
     sys.exit( 1 )
 
-# Default chrom, start, end, stran cols for a bed file
+# Default chrom, start, end, strand cols for a bed file
 BED_DEFAULT_COLS = 0, 1, 2, 5
 
 def parse_cols_arg( cols ):
     """Parse a columns command line argument into a
     four-tuple"""
     if cols:
-        return map( lambda x: int( x ) - 1, cols.split(",") )
+        # Handle case where no strand column included - in this case, cols
+        # looks something like 1,2,3,
+        if cols.endswith( ',' ):
+            cols += '0'
+        col_list = map( lambda x: int( x ) - 1, cols.split(",") )
+        return col_list
     else:
         return BED_DEFAULT_COLS
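The fixed parser can be sketched in modern Python as follows (behavior inferred from the hunk above; the original is Python 2 and returns a `map` list, whereas this sketch returns a tuple). The key idea: when the input dataset has no strand column, the tool passes a trailing comma ("1,2,3,"), and padding it with `'0'` makes the absent strand parse to `-1` after the 1-based-to-0-based shift, giving downstream code a sentinel to test for.

```python
# Default chrom, start, end, strand columns for a BED file (0-based).
BED_DEFAULT_COLS = (0, 1, 2, 5)

def parse_cols_arg(cols):
    """Parse a 1-based "chrom,start,end,strand" argument into 0-based indices."""
    if cols:
        # No strand column: the argument looks like "1,2,3,". Padding
        # with '0' turns the missing field into -1 after the shift.
        if cols.endswith(','):
            cols += '0'
        return tuple(int(x) - 1 for x in cols.split(','))
    return BED_DEFAULT_COLS

# parse_cols_arg("1,2,3,6") -> (0, 1, 2, 5)
# parse_cols_arg("1,2,3,")  -> (0, 1, 2, -1)
# parse_cols_arg("")        -> (0, 1, 2, 5)
```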
diff -r b301cae30997 -r 91c63a82359a lib/galaxy/web/controllers/workflow.py
--- a/lib/galaxy/web/controllers/workflow.py Wed Jul 30 17:15:40 2008 -0400
+++ b/lib/galaxy/web/controllers/workflow.py Mon Aug 04 10:20:23 2008 -0400
@@ -676,9 +676,16 @@
             if dataset.state in ( 'new', 'running', 'queued' ):
                 warnings.add( "Some datasets still queued or running were ignored" )
                 continue
-            if not dataset.creating_job_associations:
-                jobs[ FakeJob( dataset ) ] = [ ( None, dataset ) ]
-            for assoc in dataset.creating_job_associations:
+
+            #if this hda was copied from another, we need to find the job that created the origial hda
+            job_hda = dataset
+            while job_hda.copied_from_history_dataset_association:
+                job_hda = job_hda.copied_from_history_dataset_association
+
+            if not job_hda.creating_job_associations:
+                jobs[ FakeJob( dataset ) ] = [ ( None, dataset ) ]
+
+            for assoc in job_hda.creating_job_associations:
                 job = assoc.job
                 if job in jobs:
                     jobs[ job ].append( ( assoc.name, dataset ) )
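The `while` loop in this hunk is the heart of changeset 1464: a dataset copied from another history has no creating job of its own, so workflow extraction must walk the new `copied_from_history_dataset_association` links back to the HDA a job actually produced. A self-contained sketch — the `HDA` class below is a hypothetical stand-in for Galaxy's model, not the real class:

```python
class HDA:
    """Hypothetical stand-in for Galaxy's HistoryDatasetAssociation."""
    def __init__(self, name, copied_from=None):
        self.name = name
        self.copied_from_history_dataset_association = copied_from

def find_source_hda(dataset):
    # Walk the copy links back to the HDA that a job actually created;
    # an HDA that was never copied is its own source.
    job_hda = dataset
    while job_hda.copied_from_history_dataset_association:
        job_hda = job_hda.copied_from_history_dataset_association
    return job_hda

# A chain of two copies still resolves to the original dataset.
original = HDA("original")
shared = HDA("shared copy", copied_from=original)
mine = HDA("my copy", copied_from=shared)
```

Note that the extracted workflow step still records the copy the user sees (`dataset`), while the job lookup uses the resolved source (`job_hda`).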
diff -r b301cae30997 -r 91c63a82359a templates/history/permissions.mako
--- a/templates/history/permissions.mako Wed Jul 30 17:15:40 2008 -0400
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,42 +0,0 @@
-<%inherit file="/base.mako"/>
-<%def name="title()">Change Default History Permissions</%def>
-
-%if trans.user:
-    <div class="toolForm">
-        <div class="toolFormTitle">Change Default History Permissions</div>
-        <div class="toolFormBody">
-            <form name="set_permissions" method="post">
-                <div class="form-row">
-                    <% user_groups = [ assoc.group for assoc in trans.user.groups ] %>
-                    <% cur_groups = [ assoc.group for assoc in trans.get_history().default_groups ] %>
-                    <div style="float: left; width: 250px; margin-right: 10px;">
-                        <table>
-
-                            <tr><th>Group</th><th>In</th><th>Out</th></tr>
-                            %for group in user_groups:
-                                <tr><td>${group.name}</td><td><input type="radio" name="group_${group.id}" value="in"
-                                %if group in cur_groups:
-                                    checked
-                                %endif
-                                ></td><td><input type="radio" name="group_${group.id}" value="out"
-                                %if group not in cur_groups:
-                                    checked
-                                %endif
-                                ></td></tr>
-                            %endfor
-                        </table>
-                    </div>
-
-                    <div style="clear: both"></div>
-
-                    <div class="toolParamHelp" style="clear: both;">
-                        This will change the default permissions assigned to new datasets for your current history.
-                    </div>
-                    <div style="clear: both"></div>
-                </div>
-                <div class="form-row">
-                    <input type="submit" name="set_permissions" value="Save">
-                </div>
-            </form>
-        </div>
-    </div>
-%endif
\ No newline at end of file
diff -r b301cae30997 -r 91c63a82359a templates/tool_form.tmpl
--- a/templates/tool_form.tmpl Wed Jul 30 17:15:40 2008 -0400
+++ b/templates/tool_form.tmpl Mon Aug 04 10:20:23 2008 -0400
@@ -57,9 +57,9 @@
     #set group_state = $tool_state[$input.name]
     #set group_errors = $errors.get( $input.name, {} )
    #set current_case = $group_state['__current_case__']
-    #set prefix = $prefix + $input.name + "|"
-    $row_for_param( $prefix, $input.test_param, $group_state, $group_errors, $context )
-    $do_inputs( $input.cases[$current_case].inputs, $group_state, $group_errors, $prefix, $context )
+    #set group_prefix = $prefix + $input.name + "|"
+    $row_for_param( $group_prefix, $input.test_param, $group_state, $group_errors, $context )
+    $do_inputs( $input.cases[$current_case].inputs, $group_state, $group_errors, $group_prefix, $context )
     #else
     $row_for_param( $prefix, $input, $tool_state, $errors, $context )
     #end if
diff -r b301cae30997 -r 91c63a82359a templates/user/permissions.mako
--- a/templates/user/permissions.mako Wed Jul 30 17:15:40 2008 -0400
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,42 +0,0 @@
-<%inherit file="/base.mako"/>
-<%def name="title()">Change Default History Permissions</%def>
-
-%if trans.user:
-    <div class="toolForm">
-        <div class="toolFormTitle">Change Default Permissions for new Histories </div>
-        <div class="toolFormBody">
-            <form name="set_permissions" method="post">
-                <div class="form-row">
-                    <% user_groups = [ assoc.group for assoc in trans.user.groups ] %>
-                    <% cur_groups = [ assoc.group for assoc in trans.user.default_groups ] %>
-                    <div style="float: left; width: 250px; margin-right: 10px;">
-                        <table>
-
-                            <tr><th>Group</th><th>In</th><th>Out</th></tr>
-                            %for group in user_groups:
-                                <tr><td>${group.name}</td><td><input type="radio" name="group_${group.id}" value="in"
-                                %if group in cur_groups:
-                                    checked
-                                %endif
-                                ></td><td><input type="radio" name="group_${group.id}" value="out"
-                                %if group not in cur_groups:
-                                    checked
-                                %endif
-                                ></td></tr>
-                            %endfor
-                        </table>
-                    </div>
-
-                    <div style="clear: both"></div>
-
-                    <div class="toolParamHelp" style="clear: both;">
-                        This will change the default permissions assigned to new datasets for new histories.
-                    </div>
-                    <div style="clear: both"></div>
-                </div>
-                <div class="form-row">
-                    <input type="submit" name="set_permissions" value="Save">
-                </div>
-            </form>
-        </div>
-    </div>
-%endif
\ No newline at end of file
diff -r b301cae30997 -r 91c63a82359a templates/workflow/editor_tool_form.mako
--- a/templates/workflow/editor_tool_form.mako Wed Jul 30 17:15:40 2008 -0400
+++ b/templates/workflow/editor_tool_form.mako Mon Aug 04 10:20:23 2008 -0400
@@ -23,10 +23,10 @@
     %elif input.type == "conditional":
         <% group_values = values[input.name] %>
         <% current_case = group_values['__current_case__'] %>
-        <% prefix = prefix + input.name + "|" %>
+        <% group_prefix = prefix + input.name + "|" %>
         <% group_errors = errors.get( input.name, {} ) %>
-        ${row_for_param( input.test_param, group_values[ input.test_param.name ], group_errors, prefix )}
-        ${do_inputs( input.cases[ current_case ].inputs, group_values, group_errors, prefix )}
+        ${row_for_param( input.test_param, group_values[ input.test_param.name ], group_errors, group_prefix )}
+        ${do_inputs( input.cases[ current_case ].inputs, group_values, group_errors, group_prefix )}
     %else:
         ${row_for_param( input, values[ input.name ], errors, prefix )}
     %endif
diff -r b301cae30997 -r 91c63a82359a test-data/12.bed
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/test-data/12.bed Mon Aug 04 10:20:23 2008 -0400
@@ -0,0 +1,10 @@
+chr1 147962192 147962580
+chr1 115468538 115468624
+chr1 115483024 115483277
+chr1 115484165 115484501
+chr1 115485764 115485980
+chr1 115486322 115486481
+chr1 115491298 115491487
+chr1 115468538 115468624
+chr1 115483024 115483277
+chr1 115484165 115484501
diff -r b301cae30997 -r 91c63a82359a test-data/gops_intersect_no_strand_out.bed
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/test-data/gops_intersect_no_strand_out.bed Mon Aug 04 10:20:23 2008 -0400
@@ -0,0 +1,1 @@
+chr1 147962192 147962580
diff -r b301cae30997 -r 91c63a82359a tools/new_operations/intersect.xml
--- a/tools/new_operations/intersect.xml Wed Jul 30 17:15:40 2008 -0400
+++ b/tools/new_operations/intersect.xml Mon Aug 04 10:20:23 2008 -0400
@@ -50,6 +50,13 @@
       <param name="returntype" value="" />
       <output name="output" file="gops_intersect_bigint_out.interval" />
     </test>
+    <test>
+      <param name="input1" value="12.bed" ftype="bed" />
+      <param name="input2" value="1.bed" ftype="bed" />
+      <param name="min" value="1" />
+      <param name="returntype" value="" />
+      <output name="output" file="gops_intersect_no_strand_out.bed" />
+    </test>
   </tests>
   <help>