galaxy-dev
October 2009: 18 participants, 172 discussions
Hello,
After pulling the latest version, I've encountered a problem with importing sqlalchemy.
(I haven't upgraded in a long time, so I can't be sure which changeset introduced this issue).
The symptom is that running the upload.py Python script throws the following exception:
-----------------------
Traceback (most recent call last):
File "/home/gordon/projects/galaxy_servers/galaxy_fresh_copy/tools/data_source/upload.py", line 10, in
import galaxy.model
File "/home/gordon/projects/galaxy_servers/galaxy_fresh_copy/lib/galaxy/model/__init__.py", line 13, in
import galaxy.datatypes.registry
File "/home/gordon/projects/galaxy_servers/galaxy_fresh_copy/lib/galaxy/datatypes/registry.py", line 6, in
import data, tabular, interval, images, sequence, qualityscore, genetics, xml, coverage, tracks, chrominfo
File "/home/gordon/projects/galaxy_servers/galaxy_fresh_copy/lib/galaxy/datatypes/data.py", line 6, in
import metadata
File "/home/gordon/projects/galaxy_servers/galaxy_fresh_copy/lib/galaxy/datatypes/metadata.py", line 5, in
from galaxy.web import form_builder
File "/home/gordon/projects/galaxy_servers/galaxy_fresh_copy/lib/galaxy/web/__init__.py", line 5, in
from framework import expose, json, require_login, require_admin, url_for, error, form, FormBuilder
File "/home/gordon/projects/galaxy_servers/galaxy_fresh_copy/lib/galaxy/web/framework/__init__.py", line 31, in
from sqlalchemy import and_
ImportError: No module named sqlalchemy
------------------------
It happens in the following scenario:
1. Clone a fresh copy of Galaxy.
2. Run "sh setup.sh".
3. Run "sh run.sh".
4. Upload a data file to Galaxy.
5. The file uploads completely (the purple rectangle step).
6. Galaxy executes "upload.py" - this is where the problem happens.
I can also reproduce it quickly by doing:
1. Clone a fresh copy of Galaxy.
2. Run "sh setup.sh".
3. Remove the "-ES" parameters from "run.sh".
4. Run "sh run.sh".
In which case the following exception is thrown:
gordon@tango:~/temp/galaxy$ sh run.sh
Traceback (most recent call last):
File "./scripts/paster.py", line 27, in <module>
command.run()
File "build/bdist.solaris-2.11-i86pc/egg/paste/script/command.py", line 78, in run
File "build/bdist.solaris-2.11-i86pc/egg/paste/script/command.py", line 117, in invoke
File "build/bdist.solaris-2.11-i86pc/egg/paste/script/command.py", line 212, in run
File "build/bdist.solaris-2.11-i86pc/egg/paste/script/serve.py", line 227, in command
File "build/bdist.solaris-2.11-i86pc/egg/paste/script/serve.py", line 250, in loadapp
File "build/bdist.solaris-2.11-i86pc/egg/paste/deploy/loadwsgi.py", line 193, in loadapp
File "build/bdist.solaris-2.11-i86pc/egg/paste/deploy/loadwsgi.py", line 213, in loadobj
File "build/bdist.solaris-2.11-i86pc/egg/paste/deploy/loadwsgi.py", line 237, in loadcontext
File "build/bdist.solaris-2.11-i86pc/egg/paste/deploy/loadwsgi.py", line 267, in _loadconfig
File "build/bdist.solaris-2.11-i86pc/egg/paste/deploy/loadwsgi.py", line 397, in get_context
File "build/bdist.solaris-2.11-i86pc/egg/paste/deploy/loadwsgi.py", line 439, in _context_from_explicit
File "build/bdist.solaris-2.11-i86pc/egg/paste/deploy/loadwsgi.py", line 18, in import_string
File "/home/gordon/temp/galaxy/lib/pkg_resources.py", line 1912, in load
entry = __import__(self.module_name, globals(),globals(), ['__name__'])
File "/home/gordon/temp/galaxy/lib/galaxy/web/__init__.py", line 5, in <module>
from framework import expose, json, require_login, require_admin, url_for, error, form, FormBuilder
File "/home/gordon/temp/galaxy/lib/galaxy/web/framework/__init__.py", line 31, in <module>
from sqlalchemy import and_
ImportError: No module named sqlalchemy
My guess is that this is a mix-up between my site-wide Python modules (used without "-ES") and Galaxy's eggs (used with "-ES").
But I don't know how to solve it.
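A minimal diagnostic sketch (plain Python, nothing Galaxy-specific assumed; just something to run with the same interpreter run.sh uses) to see which sqlalchemy, if any, gets picked up and from where:
-----------------------
import sys

# Report the interpreter in use and where sqlalchemy would be imported from.
print("interpreter: %s" % sys.executable)
try:
    import sqlalchemy
    print("sqlalchemy %s loaded from %s"
          % (sqlalchemy.__version__, sqlalchemy.__file__))
except ImportError:
    print("sqlalchemy is not importable; sys.path is:")
    for entry in sys.path:
        print("  %s" % entry)
-----------------------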
Also, there's the issue of which python executable Galaxy runs:
On my development station, python2.5 is the default and all is well.
My production server is CentOS 5.3, where the default python is 2.4 (I had to change 'run.sh' to run "python2.5").
On an Ubuntu machine, the problem is reversed: Ubuntu switched to python2.6 (since 9.04), so running just "python" runs 2.6.
So far I (wrongly) assumed I only needed to change the shell scripts (run.sh, setup.sh, manage_db.sh) to use a fixed Python version.
But if Galaxy runs python internally (other than tools written in Python), then a better solution may be needed.
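For scripts I control, a minimal guard like this (a sketch only; pinning to 2.5 is my own requirement, not Galaxy's) fails fast when the wrong interpreter is used:
-----------------------
import sys

# Abort early if this is not the Python the eggs were fetched for.
# (2, 5) reflects my setup and is an assumption; adjust as needed.
REQUIRED = (2, 5)
if sys.version_info[:2] != REQUIRED:
    raise SystemExit("need Python %d.%d, running %s"
                     % (REQUIRED[0], REQUIRED[1], sys.version.split()[0]))
-----------------------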
Any help from a Python guru will be greatly appreciated - so far I can't run the upgraded version (and can't really go back, since I've upgraded the DB to version 21).
Thanks!
gordon.
P.S.
I'm running Debian 5.0, with the following Python (and a site-wide sqlalchemy):
$ python
Python 2.5.2 (r252:60911, Jan 4 2009, 21:59:32)
[GCC 4.3.2] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import sqlalchemy
>>> sqlalchemy.__version__
'0.4.7p1'
>>>
Hi.
In our local install of Galaxy there is a small change to the code
that allows upload of novoindex binary files (from the novoalign
program) in a similar fashion to ab1 and scf files. This works fine
with the data upload tool.
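Roughly, the kind of change involved is a new binary datatype class along these lines (an illustrative sketch only, modeled on how ab1/scf are treated; not our actual code, and the method signature is an assumption for this release):
-----------------------
from galaxy.datatypes.data import Data

class Novoindex(Data):
    """A novoalign index file: opaque binary data, never shown as text."""
    file_ext = "novoindex"

    def set_peek(self, dataset):
        # Binary content, so show a fixed blurb instead of the file's bytes.
        dataset.peek = "Binary novoindex file"
        dataset.blurb = "binary data"
-----------------------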
There have been multiple changes to the library dataset upload methods
in recent releases and I have lost track of which files are
responsible for library dataset upload and checking these binary file
types.
Could someone please outline how this differs from the normal upload
process and which scripts are used?
Many thanks
Shaun Webb
--
The University of Edinburgh is a charitable body, registered in
Scotland, with registration number SC005336.
details: http://www.bx.psu.edu/hg/galaxy/rev/a60c55fb0d76
changeset: 2825:a60c55fb0d76
user: Greg Von Kuster <greg(a)bx.psu.edu>
date: Sun Oct 04 10:55:43 2009 -0400
description:
Correct fix for cloning a history - resolves problem with built-in id reserved word.
1 file(s) affected in this change:
lib/galaxy/web/controllers/history.py
diffs (12 lines):
diff -r d97f4e86be45 -r a60c55fb0d76 lib/galaxy/web/controllers/history.py
--- a/lib/galaxy/web/controllers/history.py Fri Oct 02 11:02:14 2009 -0400
+++ b/lib/galaxy/web/controllers/history.py Sun Oct 04 10:55:43 2009 -0400
@@ -322,7 +322,7 @@
return self.shared_list_grid( trans, status='error', message=message, **kwargs )
# When cloning shared histories, only copy active datasets
new_kwargs = { 'clone_choice' : 'active' }
- return self.clone( trans, id, **new_kwargs )
+ return self.clone( trans, ids, **new_kwargs )
elif operation == 'unshare':
if not ids:
message = "Select a history to unshare"
Hello,
I've recently tutored a genomics course, and part of the exercise in that course was to annotate short-read libraries and compare the number of exonic/intronic/UTR/intergenic reads and base-pairs (the exact details are irrelevant).
Galaxy seemed like the perfect tool for the exercise, and indeed was used for deriving the answers.
However, as the students (and myself) used galaxy, some usability issues surfaced.
Here's the problem set (some details omitted for brevity):
Given three libraries of long and short RNAs, what is the number of reads (and base-pairs) that overlap annotated coding exons (same strand and opposite strand), introns, 3'UTRs and intergenic regions?
The libraries were given as BED files (already mapped to hg18).
Ostensibly, it should be easy and straightforward to do this in Galaxy:
1. Upload a library BED file.
2. Download a gene table from UCSC.
3. Use "Extract Features" to extract exons/introns/3'UTRs.
4. Use "Filter" to separate negative/positive genes and intervals.
5. Intersect the above datasets.
6. Use "Base Coverage" to count covered base-pairs (a sketch of the idea follows this list).
For the first library, the above process is fine.
It takes longer if you've never used Galaxy, but that's understandable.
One annoyance is that the labels of the datasets are not useful ("Intersect on data 12 and data 4").
If you're very organized, immediately after running each tool you rename the dataset to something meaningful.
It's annoying, because you waste more time renaming and organizing your datasets than actually running tools.
It took me no less than 30 minutes to run everything, rename and organize it (and I'm pretty good with Galaxy...).
The real problem is with the other two libraries:
I already know exactly what needs to be done - so doing it manually will be very frustrating.
But constructing a workflow from the current history is not as useful as it first seems.
First issue (minor but annoying):
When I clicked "Extract Workflow", galaxy did a perfect job of reconstructing all my steps.
The workflow takes two input datasets, and produces many many others.
The only problem:
The two datasets (one my library, the other UCSC's gene track) are indistinguishable in the workflow editor.
Which one is which?
There's a very useful "name" attribute for each input dataset, but it is not set...
The only way to tell them apart is to follow the workflow and see which tool is connected next.
A novice Galaxy user might not even notice this issue - and might mix up the two input datasets.
A possible workaround (in the "Extract Workflow" step) is to name each input dataset after the label of the dataset in the history (something like "Input data, based on UCSC KnownGenes dataset").
Second issue:
I have extracted the workflow and used it on the other two libraries.
But the generated datasets - oh boy. Making sense out of them is a real pain.
I have uploaded the library, imported the UCSC KnownGene track, and executed the workflow.
Take a look at the following history: http://main.g2.bx.psu.edu/history/imp?id=93de35e1ee121686 .
Can you tell which datasets answer the question of how many of my reads intersect introns? Intergenic regions? Same-strand exons?
(13, 15 and 22, respectively).
Using the workflow and trying to understand which dataset is which can take almost as long as just running everything from scratch without a workflow.
Third issue:
I personally created this workflow six days ago, by extracting it from a history that I've made.
Looking at it now, I have no way of adding/changing/updating it. It's almost "write once - read never".
http://main.g2.bx.psu.edu/workflow/imp?id=6e55935a2b1f3d59
I will need to carefully trace each chain of datasets and tools to understand it - that's really annoying (I realize nobody cares about my annoyance level, but in practice it means I won't use it and won't share it with others - it's not useful).
I don't have a good solution (this is a rant, not a bugfix...), but I think that as Galaxy goes into the next-gen sequencing arena, this situation will become more common and problematic.
This kind of basic analysis (taking a library and annotating it) is a very standard procedure - in our lab it is done almost automatically on every next-gen sequenced library. To do it in Galaxy, the process needs to be much simpler...
A small improvement would be to add two description labels for each tool in the workflow: one (detailed) to be shown inside the workflow editor, and the other to be used as the output label for the generated dataset.
Thanks for reading so far,
gordon.
details: http://www.bx.psu.edu/hg/galaxy/rev/d97f4e86be45
changeset: 2824:d97f4e86be45
user: Anton Nekrutenko <anton(a)bx.psu.edu>
date: Fri Oct 02 11:02:14 2009 -0400
description:
merge
0 file(s) affected in this change:
diffs (39 lines):
diff -r 2c0c81150dbd -r d97f4e86be45 tools/sr_mapping/bowtie_wrapper_code.py
--- a/tools/sr_mapping/bowtie_wrapper_code.py Fri Oct 02 11:00:30 2009 -0400
+++ b/tools/sr_mapping/bowtie_wrapper_code.py Fri Oct 02 11:02:14 2009 -0400
@@ -3,13 +3,14 @@
def exec_before_job(app, inp_data, out_data, param_dict, tool):
try:
refFile = param_dict['refGenomeSource']['indices'].value
- dbkey = os.path.split(refFile)[1].split('.')[0]
- # deal with the one odd case
- if dbkey.find('chrM') >= 0:
- dbkey = 'equCab2'
- out_data['output'].set_dbkey(dbkey)
except:
try:
refFile = param_dict['refGenomeSource']['ownFile'].dbkey
except:
- out_data['output'].set_dbkey('?')
+ out_data['output'].set_dbkey('?')
+ return
+ dbkey = os.path.split(refFile)[1].split('.')[0]
+ # deal with the one odd case
+ if dbkey.find('chrM') >= 0:
+ dbkey = 'equCab2'
+ out_data['output'].set_dbkey(dbkey)
diff -r 2c0c81150dbd -r d97f4e86be45 tools/sr_mapping/bwa_wrapper_code.py
--- a/tools/sr_mapping/bwa_wrapper_code.py Fri Oct 02 11:00:30 2009 -0400
+++ b/tools/sr_mapping/bwa_wrapper_code.py Fri Oct 02 11:02:14 2009 -0400
@@ -3,9 +3,10 @@
def exec_before_job(app, inp_data, out_data, param_dict, tool):
try:
refFile = param_dict['solidOrSolexa']['solidRefGenomeSource']['indices'].value
- out_data['output'].set_dbkey(os.path.split(refFile)[1].split('.')[0])
except:
try:
refFile = param_dict['solidOrSolexa']['solidRefGenomeSource']['ownFile'].dbkey
except:
out_data['output'].set_dbkey('?')
+ return
+ out_data['output'].set_dbkey(os.path.split(refFile)[1].split('.')[0])
details: http://www.bx.psu.edu/hg/galaxy/rev/65f28c1f1226
changeset: 2822:65f28c1f1226
user: Anton Nekrutenko <anton(a)bx.psu.edu>
date: Fri Oct 02 10:55:21 2009 -0400
description:
fix for trimmer interface
1 file(s) affected in this change:
tools/filters/trimmer.xml
diffs (13 lines):
diff -r 6eb4f3017fbe -r 65f28c1f1226 tools/filters/trimmer.xml
--- a/tools/filters/trimmer.xml Thu Oct 01 09:09:40 2009 -0400
+++ b/tools/filters/trimmer.xml Fri Oct 02 10:55:21 2009 -0400
@@ -9,8 +9,8 @@
<param name="start" type="integer" size="10" value="1" label="Trim from the beginning to this position" help="1 = do not trim the beginning"/>
<param name="end" type="integer" size="10" value="0" label="Remove everything from this position to the end" help="0 = do not trim the end"/>
<param name="fastq" type="select" label="Is input dataset in fastq format?" help="If set to YES, the tool will not trim evenly numbered lines (0, 2, 4, etc...)">
+ <option selected="true" value="">No</option>
<option value="-q">Yes</option>
- <option value="">No</option>
</param>
<param name="ignore" type="select" display="checkboxes" multiple="True" label="Ignore lines beginning with these characters" help="lines beginning with these are not trimmed">
<option value="62">></option>
details: http://www.bx.psu.edu/hg/galaxy/rev/2c0c81150dbd
changeset: 2823:2c0c81150dbd
user: Anton Nekrutenko <anton(a)bx.psu.edu>
date: Fri Oct 02 11:00:30 2009 -0400
description:
merge
0 file(s) affected in this change:
diffs (2008 lines):
diff -r 65f28c1f1226 -r 2c0c81150dbd eggs.ini
--- a/eggs.ini Fri Oct 02 10:55:21 2009 -0400
+++ b/eggs.ini Fri Oct 02 11:00:30 2009 -0400
@@ -23,6 +23,7 @@
python_lzo = 1.08
threadframe = 0.2
guppy = 0.1.8
+PSI = 0.3b1.1
[eggs:noplatform]
amqplib = 0.6.1
@@ -86,6 +87,7 @@
Paste = http://cheeseshop.python.org/packages/source/P/Paste/Paste-1.5.1.tar.gz
PasteDeploy = http://cheeseshop.python.org/packages/source/P/PasteDeploy/PasteDeploy-1.3.…
PasteScript = http://cheeseshop.python.org/packages/source/P/PasteScript/PasteScript-1.3.…
+PSI = http://pypi.python.org/packages/source/P/PSI/PSI-0.3b1.1.tar.gz
Routes = http://pypi.python.org/packages/source/R/Routes/Routes-1.6.3.tar.gz
simplejson = http://cheeseshop.python.org/packages/source/s/simplejson/simplejson-1.5.ta…
SQLAlchemy = http://pypi.python.org/packages/source/S/SQLAlchemy/SQLAlchemy-0.4.7p1.tar.…
diff -r 65f28c1f1226 -r 2c0c81150dbd lib/galaxy/config.py
--- a/lib/galaxy/config.py Fri Oct 02 10:55:21 2009 -0400
+++ b/lib/galaxy/config.py Fri Oct 02 11:00:30 2009 -0400
@@ -87,6 +87,7 @@
self.user_library_import_dir = kwargs.get( 'user_library_import_dir', None )
if self.user_library_import_dir is not None and not os.path.exists( self.user_library_import_dir ):
raise ConfigurationError( "user_library_import_dir specified in config (%s) does not exist" % self.user_library_import_dir )
+ self.allow_library_path_paste = kwargs.get( 'allow_library_path_paste', False )
# Configuration options for taking advantage of nginx features
self.nginx_x_accel_redirect_base = kwargs.get( 'nginx_x_accel_redirect_base', False )
self.nginx_upload_store = kwargs.get( 'nginx_upload_store', False )
diff -r 65f28c1f1226 -r 2c0c81150dbd lib/galaxy/tools/__init__.py
--- a/lib/galaxy/tools/__init__.py Fri Oct 02 10:55:21 2009 -0400
+++ b/lib/galaxy/tools/__init__.py Fri Oct 02 11:00:30 2009 -0400
@@ -668,7 +668,6 @@
# form when it changes
for name in param.get_dependencies():
context[ name ].refresh_on_change = True
- context[ name ].dependent_params.append( param.name )
return param
def check_workflow_compatible( self ):
@@ -804,7 +803,7 @@
else:
# Update state for all inputs on the current page taking new
# values from `incoming`.
- errors = self.update_state( trans, self.inputs_by_page[state.page], state.inputs, incoming, changed_dependencies={} )
+ errors = self.update_state( trans, self.inputs_by_page[state.page], state.inputs, incoming )
# If the tool provides a `validate_input` hook, call it.
validate_input = self.get_hook( 'validate_input' )
if validate_input:
@@ -880,8 +879,7 @@
return 'message.mako', dict( message_type='error', message='Your upload was interrupted. If this was uninentional, please retry it.', refresh_frames=[], cont=None )
def update_state( self, trans, inputs, state, incoming, prefix="", context=None,
- update_only=False, old_errors={}, changed_dependencies=None,
- item_callback=None ):
+ update_only=False, old_errors={}, item_callback=None ):
"""
Update the tool state in `state` using the user input in `incoming`.
This is designed to be called recursively: `inputs` contains the
@@ -891,18 +889,10 @@
If `update_only` is True, values that are not in `incoming` will
not be modified. In this case `old_errors` can be provided, and any
errors for parameters which were *not* updated will be preserved.
-
- Parameters in incoming that are 'dependency parameters' are those
- whose value is used by a dependent parameter to dynamically generate
- it's options list. When the value of these dependency parameters changes,
- the new value is stored in changed_dependencies.
"""
errors = dict()
# Push this level onto the context stack
context = ExpressionContext( state, context )
- # Initialize dict for changed dependencies (since we write to it)
- if changed_dependencies is None:
- changed_dependencies = {}
# Iterate inputs and update (recursively)
for input in inputs.itervalues():
key = prefix + input.name
@@ -938,7 +928,6 @@
context=context,
update_only=update_only,
old_errors=rep_old_errors,
- changed_dependencies=changed_dependencies,
item_callback=item_callback )
if rep_errors:
any_group_errors = True
@@ -999,7 +988,6 @@
context=context,
update_only=update_only,
old_errors=group_old_errors,
- changed_dependencies=changed_dependencies,
item_callback=item_callback )
if test_param_error:
group_errors[ input.test_param.name ] = test_param_error
@@ -1039,7 +1027,6 @@
context=context,
update_only=update_only,
old_errors=rep_old_errors,
- changed_dependencies=changed_dependencies,
item_callback=item_callback )
if rep_errors:
any_group_errors = True
@@ -1069,45 +1056,8 @@
if input.name in old_errors:
errors[ input.name ] = old_errors[ input.name ]
else:
- # FIXME: This is complicated and buggy.
- # SelectToolParameters and DataToolParameters whose options are dynamically
- # generated based on the current value of a dependency parameter require special
- # handling. When the dependency parameter's value is changed, the form is
- # submitted ( due to the refresh_on_change behavior ). When this occurs, the
- # "dependent" parameter's value has not been reset ( dynamically generated based
- # on the new value of its dependency ) prior to reaching this point, so we need
- # to regenerate it before it is validated in check_param().
- value_generated = False
- value = None
- if not( 'runtool_btn' in incoming or 'URL' in incoming ):
- # Form must have been refreshed, probably due to a refresh_on_change
- try:
- if input.is_dynamic:
- dependencies = input.get_dependencies()
- for dependency_name in dependencies:
- dependency_value = changed_dependencies.get( dependency_name, None )
- if dependency_value:
- # We need to dynamically generate the current input based on
- # the changed dependency parameter
- changed_params = {}
- changed_params[dependency_name] = dependency_value
- changed_params[input.name] = input
- value = input.get_initial_value( trans, changed_params )
- error = None
- value_generated = True
- # Delete the dependency_param from chagned_dependencies since its
- # dependent param has been generated based its new value.
- ## Actually, don't do this. What if there is more than one dependent?
- ## del changed_dependencies[dependency_name]
- break
- except:
- pass
- if not value_generated:
- incoming_value = get_incoming_value( incoming, key, None )
- value, error = check_param( trans, input, incoming_value, context )
- # Should we note a changed dependency?
- if input.dependent_params and state[ input.name ] != value:
- changed_dependencies[ input.name ] = value
+ incoming_value = get_incoming_value( incoming, key, None )
+ value, error = check_param( trans, input, incoming_value, context )
# If a callback was provided, allow it to process the value
if item_callback:
old_value = state.get( input.name, None )
diff -r 65f28c1f1226 -r 2c0c81150dbd lib/galaxy/tools/actions/upload_common.py
--- a/lib/galaxy/tools/actions/upload_common.py Fri Oct 02 10:55:21 2009 -0400
+++ b/lib/galaxy/tools/actions/upload_common.py Fri Oct 02 11:00:30 2009 -0400
@@ -3,6 +3,7 @@
from galaxy import datatypes, util
from galaxy.datatypes import sniff
from galaxy.util.json import to_json_string
+from galaxy.model.orm import eagerload_all
import logging
log = logging.getLogger( __name__ )
@@ -127,12 +128,29 @@
or trans.user.email in trans.app.config.get( "admin_users", "" ).split( "," ) ):
# This doesn't have to be pretty - the only time this should happen is if someone's being malicious.
raise Exception( "User is not authorized to add datasets to this library." )
+ folder = library_bunch.folder
+ if uploaded_dataset.get( 'in_folder', False ):
+ # Create subfolders if desired
+ for name in uploaded_dataset.in_folder.split( os.path.sep ):
+ folder.refresh()
+ matches = filter( lambda x: x.name == name, active_folders( trans, folder ) )
+ if matches:
+ log.debug( 'DEBUGDEBUG: In %s, found a folder name match: %s:%s' % ( folder.name, matches[0].id, matches[0].name ) )
+ folder = matches[0]
+ else:
+ new_folder = trans.app.model.LibraryFolder( name=name, description='Automatically created by upload tool' )
+ new_folder.genome_build = util.dbnames.default_value
+ folder.add_folder( new_folder )
+ new_folder.flush()
+ trans.app.security_agent.copy_library_permissions( folder, new_folder )
+ log.debug( 'DEBUGDEBUG: In %s, created a new folder: %s:%s' % ( folder.name, new_folder.id, new_folder.name ) )
+ folder = new_folder
if library_bunch.replace_dataset:
ld = library_bunch.replace_dataset
else:
- ld = trans.app.model.LibraryDataset( folder=library_bunch.folder, name=uploaded_dataset.name )
+ ld = trans.app.model.LibraryDataset( folder=folder, name=uploaded_dataset.name )
ld.flush()
- trans.app.security_agent.copy_library_permissions( library_bunch.folder, ld )
+ trans.app.security_agent.copy_library_permissions( folder, ld )
ldda = trans.app.model.LibraryDatasetDatasetAssociation( name = uploaded_dataset.name,
extension = uploaded_dataset.file_type,
dbkey = uploaded_dataset.dbkey,
@@ -153,8 +171,8 @@
else:
# Copy the current user's DefaultUserPermissions to the new LibraryDatasetDatasetAssociation.dataset
trans.app.security_agent.set_all_dataset_permissions( ldda.dataset, trans.app.security_agent.user_get_default_permissions( trans.user ) )
- library_bunch.folder.add_library_dataset( ld, genome_build=uploaded_dataset.dbkey )
- library_bunch.folder.flush()
+ folder.add_library_dataset( ld, genome_build=uploaded_dataset.dbkey )
+ folder.flush()
ld.library_dataset_dataset_association_id = ldda.id
ld.flush()
# Handle template included in the upload form, if any
@@ -230,6 +248,10 @@
is_binary = uploaded_dataset.datatype.is_binary
except:
is_binary = None
+ try:
+ link_data_only = uploaded_dataset.link_data_only
+ except:
+ link_data_only = False
json = dict( file_type = uploaded_dataset.file_type,
ext = uploaded_dataset.ext,
name = uploaded_dataset.name,
@@ -237,6 +259,7 @@
dbkey = uploaded_dataset.dbkey,
type = uploaded_dataset.type,
is_binary = is_binary,
+ link_data_only = link_data_only,
space_to_tab = uploaded_dataset.space_to_tab,
path = uploaded_dataset.path )
json_file.write( to_json_string( json ) + '\n' )
@@ -276,3 +299,13 @@
trans.app.job_queue.put( job.id, tool )
trans.log_event( "Added job to the job queue, id: %s" % str(job.id), tool_id=job.tool_id )
return dict( [ ( 'output%i' % i, v ) for i, v in enumerate( data_list ) ] )
+
+def active_folders( trans, folder ):
+ # Stolen from galaxy.web.controllers.library_common (importing from which causes a circular issues).
+ # Much faster way of retrieving all active sub-folders within a given folder than the
+ # performance of the mapper. This query also eagerloads the permissions on each folder.
+ return trans.sa_session.query( trans.app.model.LibraryFolder ) \
+ .filter_by( parent=folder, deleted=False ) \
+ .options( eagerload_all( "actions" ) ) \
+ .order_by( trans.app.model.LibraryFolder.table.c.name ) \
+ .all()
diff -r 65f28c1f1226 -r 2c0c81150dbd lib/galaxy/tools/parameters/basic.py
--- a/lib/galaxy/tools/parameters/basic.py Fri Oct 02 10:55:21 2009 -0400
+++ b/lib/galaxy/tools/parameters/basic.py Fri Oct 02 11:00:30 2009 -0400
@@ -32,7 +32,6 @@
self.html = "no html set"
self.repeat = param.get("repeat", None)
self.condition = param.get( "condition", None )
- self.dependent_params = []
self.validators = []
for elem in param.findall("validator"):
self.validators.append( validation.Validator.from_element( self, elem ) )
diff -r 65f28c1f1226 -r 2c0c81150dbd lib/galaxy/util/__init__.py
--- a/lib/galaxy/util/__init__.py Fri Oct 02 10:55:21 2009 -0400
+++ b/lib/galaxy/util/__init__.py Fri Oct 02 11:00:30 2009 -0400
@@ -178,7 +178,7 @@
# better solution I think is to more responsibility for
# sanitizing into the tool parameters themselves so that
# different parameters can be sanitized in different ways.
- NEVER_SANITIZE = ['file_data', 'url_paste', 'URL']
+ NEVER_SANITIZE = ['file_data', 'url_paste', 'URL', 'filesystem_paths']
def __init__( self, params, safe=True, sanitize=True, tool=None ):
if safe:
diff -r 65f28c1f1226 -r 2c0c81150dbd lib/galaxy/web/buildapp.py
--- a/lib/galaxy/web/buildapp.py Fri Oct 02 10:55:21 2009 -0400
+++ b/lib/galaxy/web/buildapp.py Fri Oct 02 11:00:30 2009 -0400
@@ -64,7 +64,12 @@
sys.exit( 1 )
atexit.register( app.shutdown )
# Create the universe WSGI application
- webapp = galaxy.web.framework.WebApplication( app, session_cookie='galaxysession' )
+ if app.config.log_memory_usage:
+ from galaxy.web.framework.memdebug import MemoryLoggingWebApplication
+ webapp = MemoryLoggingWebApplication( app, session_cookie='galaxysession' )
+ else:
+ webapp = galaxy.web.framework.WebApplication( app, session_cookie='galaxysession' )
+ # Find controllers
add_controllers( webapp, app )
# Force /history to go to /root/history -- needed since the tests assume this
webapp.add_route( '/history', controller='root', action='history' )
diff -r 65f28c1f1226 -r 2c0c81150dbd lib/galaxy/web/controllers/library.py
--- a/lib/galaxy/web/controllers/library.py Fri Oct 02 10:55:21 2009 -0400
+++ b/lib/galaxy/web/controllers/library.py Fri Oct 02 11:00:30 2009 -0400
@@ -568,64 +568,31 @@
msg=util.sanitize_text( msg ),
messagetype='error' ) )
lddas.append( ldda )
- if params.get( 'update_roles_button', False ):
- if trans.app.security_agent.can_manage_library_item( user, roles, ldda ) and \
- trans.app.security_agent.can_manage_dataset( roles, ldda.dataset ):
- permissions = {}
- for k, v in trans.app.model.Dataset.permitted_actions.items():
- in_roles = [ trans.app.model.Role.get( x ) for x in util.listify( params.get( k + '_in', [] ) ) ]
- permissions[ trans.app.security_agent.get_action( v.action ) ] = in_roles
- for ldda in lddas:
- # Set the DATASET permissions on the Dataset
- trans.app.security_agent.set_all_dataset_permissions( ldda.dataset, permissions )
- ldda.dataset.refresh()
- permissions = {}
- for k, v in trans.app.model.Library.permitted_actions.items():
- in_roles = [ trans.app.model.Role.get( x ) for x in util.listify( kwd.get( k + '_in', [] ) ) ]
- permissions[ trans.app.security_agent.get_action( v.action ) ] = in_roles
- for ldda in lddas:
- # Set the LIBRARY permissions on the LibraryDataset
- # NOTE: the LibraryDataset and LibraryDatasetDatasetAssociation will be set with the same permissions
- trans.app.security_agent.set_all_library_permissions( ldda.library_dataset, permissions )
- ldda.library_dataset.refresh()
- # Set the LIBRARY permissions on the LibraryDatasetDatasetAssociation
- trans.app.security_agent.set_all_library_permissions( ldda, permissions )
- ldda.refresh()
- msg = 'Permissions and roles have been updated on %d datasets' % len( lddas )
- messagetype = 'done'
- else:
- msg = "You are not authorized to change the permissions of dataset '%s'" % ldda.name
- messagetype = 'error'
- return trans.fill_template( "/library/ldda_permissions.mako",
- ldda=lddas,
- library_id=library_id,
- msg=msg,
- messagetype=messagetype )
+ if params.get( 'update_roles_button', False ):
if trans.app.security_agent.can_manage_library_item( user, roles, ldda ) and \
trans.app.security_agent.can_manage_dataset( roles, ldda.dataset ):
- # Ensure that the permissions across all library items are identical, otherwise we can't update them together.
- check_list = []
+ permissions = {}
+ for k, v in trans.app.model.Dataset.permitted_actions.items():
+ in_roles = [ trans.app.model.Role.get( x ) for x in util.listify( params.get( k + '_in', [] ) ) ]
+ permissions[ trans.app.security_agent.get_action( v.action ) ] = in_roles
for ldda in lddas:
- permissions = []
- # Check the library level permissions - the permissions on the LibraryDatasetDatasetAssociation
- # will always be the same as the permissions on the associated LibraryDataset, so we only need to
- # check one Library object
- for library_permission in trans.app.security_agent.get_library_dataset_permissions( ldda.library_dataset ):
- if library_permission.action not in permissions:
- permissions.append( library_permission.action )
- for dataset_permission in trans.app.security_agent.get_dataset_permissions( ldda.dataset ):
- if dataset_permission.action not in permissions:
- permissions.append( dataset_permission.action )
- permissions.sort()
- if not check_list:
- check_list = permissions
- if permissions != check_list:
- msg = 'The datasets you selected do not have identical permissions, so they can not be updated together'
- trans.response.send_redirect( web.url_for( controller='library',
- action='browse_library',
- obj_id=library_id,
- msg=util.sanitize_text( msg ),
- messagetype='error' ) )
+ # Set the DATASET permissions on the Dataset
+ trans.app.security_agent.set_all_dataset_permissions( ldda.dataset, permissions )
+ ldda.dataset.refresh()
+ permissions = {}
+ for k, v in trans.app.model.Library.permitted_actions.items():
+ in_roles = [ trans.app.model.Role.get( x ) for x in util.listify( kwd.get( k + '_in', [] ) ) ]
+ permissions[ trans.app.security_agent.get_action( v.action ) ] = in_roles
+ for ldda in lddas:
+ # Set the LIBRARY permissions on the LibraryDataset
+ # NOTE: the LibraryDataset and LibraryDatasetDatasetAssociation will be set with the same permissions
+ trans.app.security_agent.set_all_library_permissions( ldda.library_dataset, permissions )
+ ldda.library_dataset.refresh()
+ # Set the LIBRARY permissions on the LibraryDatasetDatasetAssociation
+ trans.app.security_agent.set_all_library_permissions( ldda, permissions )
+ ldda.refresh()
+ msg = 'Permissions and roles have been updated on %d datasets' % len( lddas )
+ messagetype = 'done'
else:
msg = "You are not authorized to change the permissions of dataset '%s'" % ldda.name
messagetype = 'error'
@@ -634,6 +601,39 @@
library_id=library_id,
msg=msg,
messagetype=messagetype )
+ if trans.app.security_agent.can_manage_library_item( user, roles, ldda ) and \
+ trans.app.security_agent.can_manage_dataset( roles, ldda.dataset ):
+ # Ensure that the permissions across all library items are identical, otherwise we can't update them together.
+ check_list = []
+ for ldda in lddas:
+ permissions = []
+ # Check the library level permissions - the permissions on the LibraryDatasetDatasetAssociation
+ # will always be the same as the permissions on the associated LibraryDataset, so we only need to
+ # check one Library object
+ for library_permission in trans.app.security_agent.get_library_dataset_permissions( ldda.library_dataset ):
+ if library_permission.action not in permissions:
+ permissions.append( library_permission.action )
+ for dataset_permission in trans.app.security_agent.get_dataset_permissions( ldda.dataset ):
+ if dataset_permission.action not in permissions:
+ permissions.append( dataset_permission.action )
+ permissions.sort()
+ if not check_list:
+ check_list = permissions
+ if permissions != check_list:
+ msg = 'The datasets you selected do not have identical permissions, so they can not be updated together'
+ trans.response.send_redirect( web.url_for( controller='library',
+ action='browse_library',
+ obj_id=library_id,
+ msg=util.sanitize_text( msg ),
+ messagetype='error' ) )
+ else:
+ msg = "You are not authorized to change the permissions of dataset '%s'" % ldda.name
+ messagetype = 'error'
+ return trans.fill_template( "/library/ldda_permissions.mako",
+ ldda=lddas,
+ library_id=library_id,
+ msg=msg,
+ messagetype=messagetype )
@web.expose
def upload_library_dataset( self, trans, library_id, folder_id, **kwd ):
params = util.Params( kwd )
@@ -652,8 +652,11 @@
replace_dataset = trans.app.model.LibraryDataset.get( params.get( 'replace_id', None ) )
if not last_used_build:
last_used_build = replace_dataset.library_dataset_dataset_association.dbkey
+ # Don't allow multiple datasets to be uploaded when replacing a dataset with a new version
+ upload_option = 'upload_file'
else:
replace_dataset = None
+ upload_option = params.get( 'upload_option', 'upload_file' )
user, roles = trans.get_user_and_roles()
if trans.app.security_agent.can_add_library_item( user, roles, folder ) or \
( replace_dataset and trans.app.security_agent.can_modify_library_item( user, roles, replace_dataset ) ):
@@ -666,15 +669,14 @@
else:
template_id = 'None'
widgets = []
- upload_option = params.get( 'upload_option', 'upload_file' )
created_outputs = trans.webapp.controllers[ 'library_common' ].upload_dataset( trans,
- controller='library',
- library_id=library_id,
- folder_id=folder_id,
- template_id=template_id,
- widgets=widgets,
- replace_dataset=replace_dataset,
- **kwd )
+ controller='library',
+ library_id=library_id,
+ folder_id=folder_id,
+ template_id=template_id,
+ widgets=widgets,
+ replace_dataset=replace_dataset,
+ **kwd )
if created_outputs:
ldda_id_list = [ str( v.id ) for v in created_outputs.values() ]
total_added = len( created_outputs.values() )
@@ -860,35 +862,6 @@
msg=msg,
messagetype=messagetype )
@web.expose
- def download_dataset_from_folder(self, trans, obj_id, library_id=None, **kwd):
- """Catches the dataset id and displays file contents as directed"""
- # id must refer to a LibraryDatasetDatasetAssociation object
- ldda = trans.app.model.LibraryDatasetDatasetAssociation.get( obj_id )
- if not ldda.dataset:
- msg = 'Invalid LibraryDatasetDatasetAssociation id %s received for file downlaod' % str( obj_id )
- return trans.response.send_redirect( web.url_for( controller='library',
- action='browse_library',
- obj_id=library_id,
- msg=msg,
- messagetype='error' ) )
- mime = trans.app.datatypes_registry.get_mimetype_by_extension( ldda.extension.lower() )
- trans.response.set_content_type( mime )
- fStat = os.stat( ldda.file_name )
- trans.response.headers[ 'Content-Length' ] = int( fStat.st_size )
- valid_chars = '.,^_-()[]0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ'
- fname = ldda.name
- fname = ''.join( c in valid_chars and c or '_' for c in fname )[ 0:150 ]
- trans.response.headers[ "Content-Disposition" ] = "attachment; filename=GalaxyLibraryDataset-%s-[%s]" % ( str( obj_id ), fname )
- try:
- return open( ldda.file_name )
- except:
- msg = 'This dataset contains no content'
- return trans.response.send_redirect( web.url_for( controller='library',
- action='browse_library',
- obj_id=library_id,
- msg=msg,
- messagetype='error' ) )
- @web.expose
def datasets( self, trans, library_id, ldda_ids='', **kwd ):
# This method is used by the select list labeled "Perform action on selected datasets"
# on the analysis library browser.
diff -r 65f28c1f1226 -r 2c0c81150dbd lib/galaxy/web/controllers/library_admin.py
--- a/lib/galaxy/web/controllers/library_admin.py Fri Oct 02 10:55:21 2009 -0400
+++ b/lib/galaxy/web/controllers/library_admin.py Fri Oct 02 11:00:30 2009 -0400
@@ -673,8 +673,11 @@
replace_dataset = trans.app.model.LibraryDataset.get( int( replace_id ) )
if not last_used_build:
last_used_build = replace_dataset.library_dataset_dataset_association.dbkey
+ # Don't allow multiple datasets to be uploaded when replacing a dataset with a new version
+ upload_option = 'upload_file'
else:
replace_dataset = None
+ upload_option = params.get( 'upload_option', 'upload_file' )
if params.get( 'runtool_btn', False ) or params.get( 'ajax_upload', False ):
# See if we have any inherited templates, but do not inherit contents.
info_association, inherited = folder.get_info_association( inherited=True )
@@ -684,15 +687,14 @@
else:
template_id = 'None'
widgets = []
- upload_option = params.get( 'upload_option', 'upload_file' )
created_outputs = trans.webapp.controllers[ 'library_common' ].upload_dataset( trans,
- controller='library_admin',
- library_id=library_id,
- folder_id=folder_id,
- template_id=template_id,
- widgets=widgets,
- replace_dataset=replace_dataset,
- **kwd )
+ controller='library_admin',
+ library_id=library_id,
+ folder_id=folder_id,
+ template_id=template_id,
+ widgets=widgets,
+ replace_dataset=replace_dataset,
+ **kwd )
if created_outputs:
total_added = len( created_outputs.values() )
if replace_dataset:
@@ -851,36 +853,6 @@
messagetype=messagetype )
@web.expose
@web.require_admin
- def download_dataset_from_folder(self, trans, obj_id, library_id=None, **kwd):
- """Catches the dataset id and displays file contents as directed"""
- # id must refer to a LibraryDatasetDatasetAssociation object
- ldda = trans.app.model.LibraryDatasetDatasetAssociation.get( obj_id )
- if not ldda.dataset:
- msg = 'Invalid LibraryDatasetDatasetAssociation id %s received for file downlaod' % str( obj_id )
- return trans.response.send_redirect( web.url_for( controller='library_admin',
- action='browse_library',
- obj_id=library_id,
- msg=util.sanitize_text( msg ),
- messagetype='error' ) )
- mime = trans.app.datatypes_registry.get_mimetype_by_extension( ldda.extension.lower() )
- trans.response.set_content_type( mime )
- fStat = os.stat( ldda.file_name )
- trans.response.headers[ 'Content-Length' ] = int( fStat.st_size )
- valid_chars = '.,^_-()[]0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ'
- fname = ldda.name
- fname = ''.join( c in valid_chars and c or '_' for c in fname )[ 0:150 ]
- trans.response.headers[ "Content-Disposition" ] = "attachment; filename=GalaxyLibraryDataset-%s-[%s]" % ( str( obj_id ), fname )
- try:
- return open( ldda.file_name )
- except:
- msg = 'This dataset contains no content'
- return trans.response.send_redirect( web.url_for( controller='library_admin',
- action='browse_library',
- obj_id=library_id,
- msg=util.sanitize_text( msg ),
- messagetype='error' ) )
- @web.expose
- @web.require_admin
def datasets( self, trans, library_id, **kwd ):
# This method is used by the select list labeled "Perform action on selected datasets"
# on the admin library browser.
diff -r 65f28c1f1226 -r 2c0c81150dbd lib/galaxy/web/controllers/library_common.py
--- a/lib/galaxy/web/controllers/library_common.py Fri Oct 02 10:55:21 2009 -0400
+++ b/lib/galaxy/web/controllers/library_common.py Fri Oct 02 11:00:30 2009 -0400
@@ -39,7 +39,7 @@
tool_id = 'upload1'
tool = trans.app.toolbox.tools_by_id[ tool_id ]
state = tool.new_state( trans )
- errors = tool.update_state( trans, tool.inputs_by_page[0], state.inputs, kwd, changed_dependencies={} )
+ errors = tool.update_state( trans, tool.inputs_by_page[0], state.inputs, kwd )
tool_params = state.inputs
dataset_upload_inputs = []
for input_name, input in tool.inputs.iteritems():
@@ -81,7 +81,9 @@
tool_params = upload_common.persist_uploads( tool_params )
uploaded_datasets = upload_common.get_uploaded_datasets( trans, tool_params, precreated_datasets, dataset_upload_inputs, library_bunch=library_bunch )
elif upload_option == 'upload_directory':
- uploaded_datasets = self.get_server_dir_uploaded_datasets( trans, params, full_dir, import_dir_desc, library_bunch, err_redirect, msg )
+ uploaded_datasets, err_redirect, msg = self.get_server_dir_uploaded_datasets( trans, params, full_dir, import_dir_desc, library_bunch, err_redirect, msg )
+ elif upload_option == 'upload_paths':
+ uploaded_datasets, err_redirect, msg = self.get_path_paste_uploaded_datasets( trans, params, library_bunch, err_redirect, msg )
upload_common.cleanup_unused_precreated_datasets( precreated_datasets )
if upload_option == 'upload_file' and not uploaded_datasets:
msg = 'Select a file, enter a URL or enter text'
@@ -98,37 +100,115 @@
json_file_path = upload_common.create_paramfile( uploaded_datasets )
data_list = [ ud.data for ud in uploaded_datasets ]
return upload_common.create_job( trans, tool_params, tool, json_file_path, data_list, folder=library_bunch.folder )
+ def make_library_uploaded_dataset( self, trans, params, name, path, type, library_bunch, in_folder=None ):
+ library_bunch.replace_dataset = None # not valid for these types of upload
+ uploaded_dataset = util.bunch.Bunch()
+ uploaded_dataset.name = name
+ uploaded_dataset.path = path
+ uploaded_dataset.type = type
+ uploaded_dataset.ext = None
+ uploaded_dataset.file_type = params.file_type
+ uploaded_dataset.dbkey = params.dbkey
+ uploaded_dataset.space_to_tab = params.space_to_tab
+ if in_folder:
+ uploaded_dataset.in_folder = in_folder
+ uploaded_dataset.data = upload_common.new_upload( trans, uploaded_dataset, library_bunch )
+ if params.get( 'link_data_only', False ):
+ uploaded_dataset.link_data_only = True
+ uploaded_dataset.data.file_name = os.path.abspath( path )
+ uploaded_dataset.data.flush()
+ return uploaded_dataset
def get_server_dir_uploaded_datasets( self, trans, params, full_dir, import_dir_desc, library_bunch, err_redirect, msg ):
files = []
try:
for entry in os.listdir( full_dir ):
# Only import regular files
- if os.path.isfile( os.path.join( full_dir, entry ) ):
- files.append( entry )
+ path = os.path.join( full_dir, entry )
+ if os.path.islink( path ) and os.path.isfile( path ) and params.get( 'link_data_only', False ):
+ # If we're linking instead of copying, link the file the link points to, not the link itself.
+ link_path = os.readlink( path )
+ if os.path.isabs( link_path ):
+ path = link_path
+ else:
+ path = os.path.abspath( os.path.join( os.path.dirname( path ), link_path ) )
+ if os.path.isfile( path ):
+ files.append( path )
except Exception, e:
msg = "Unable to get file list for configured %s, error: %s" % ( import_dir_desc, str( e ) )
err_redirect = True
- return None
+ return None, err_redirect, msg
if not files:
msg = "The directory '%s' contains no valid files" % full_dir
err_redirect = True
- return None
+ return None, err_redirect, msg
uploaded_datasets = []
for file in files:
- library_bunch.replace_dataset = None
- uploaded_dataset = util.bunch.Bunch()
- uploaded_dataset.path = os.path.join( full_dir, file )
- if not os.path.isfile( uploaded_dataset.path ):
+ name = os.path.basename( file )
+ uploaded_datasets.append( self.make_library_uploaded_dataset( trans, params, name, file, 'server_dir', library_bunch ) )
+ return uploaded_datasets, None, None
+ def get_path_paste_uploaded_datasets( self, trans, params, library_bunch, err_redirect, msg ):
+ if params.get( 'filesystem_paths', '' ) == '':
+ msg = "No paths entered in the upload form"
+ err_redirect = True
+ return None, err_redirect, msg
+ preserve_dirs = True
+ if params.get( 'dont_preserve_dirs', False ):
+ preserve_dirs = False
+ # locate files
+ bad_paths = []
+ uploaded_datasets = []
+ for line in [ l.strip() for l in params.filesystem_paths.splitlines() if l.strip() ]:
+ path = os.path.abspath( line )
+ if not os.path.exists( path ):
+ bad_paths.append( path )
continue
- uploaded_dataset.type = 'server_dir'
- uploaded_dataset.name = file
- uploaded_dataset.ext = None
- uploaded_dataset.file_type = params.file_type
- uploaded_dataset.dbkey = params.dbkey
- uploaded_dataset.space_to_tab = params.space_to_tab
- uploaded_dataset.data = upload_common.new_upload( trans, uploaded_dataset, library_bunch )
- uploaded_datasets.append( uploaded_dataset )
- return uploaded_datasets
+ # don't bother processing if we're just going to return an error
+ if not bad_paths:
+ if os.path.isfile( path ):
+ name = os.path.basename( path )
+ uploaded_datasets.append( self.make_library_uploaded_dataset( trans, params, name, path, 'path_paste', library_bunch ) )
+ for basedir, dirs, files in os.walk( line ):
+ for file in files:
+ file_path = os.path.abspath( os.path.join( basedir, file ) )
+ if preserve_dirs:
+ in_folder = os.path.dirname( file_path.replace( path, '', 1 ).lstrip( '/' ) )
+ else:
+ in_folder = None
+ uploaded_datasets.append( self.make_library_uploaded_dataset( trans, params, file, file_path, 'path_paste', library_bunch, in_folder ) )
+ if bad_paths:
+ msg = "Invalid paths:<br><ul><li>%s</li></ul>" % "</li><li>".join( bad_paths )
+ err_redirect = True
+ return None, err_redirect, msg
+ return uploaded_datasets, None, None
+ @web.expose
+ def download_dataset_from_folder( self, trans, cntrller, obj_id, library_id=None, **kwd ):
+ """Catches the dataset id and displays file contents as directed"""
+ # id must refer to a LibraryDatasetDatasetAssociation object
+ ldda = trans.app.model.LibraryDatasetDatasetAssociation.get( obj_id )
+ if not ldda.dataset:
+ msg = 'Invalid LibraryDatasetDatasetAssociation id %s received for file downlaod' % str( obj_id )
+ return trans.response.send_redirect( web.url_for( controller=cntrller,
+ action='browse_library',
+ obj_id=library_id,
+ msg=util.sanitize_text( msg ),
+ messagetype='error' ) )
+ mime = trans.app.datatypes_registry.get_mimetype_by_extension( ldda.extension.lower() )
+ trans.response.set_content_type( mime )
+ fStat = os.stat( ldda.file_name )
+ trans.response.headers[ 'Content-Length' ] = int( fStat.st_size )
+ valid_chars = '.,^_-()[]0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ'
+ fname = ldda.name
+ fname = ''.join( c in valid_chars and c or '_' for c in fname )[ 0:150 ]
+ trans.response.headers[ "Content-Disposition" ] = "attachment; filename=GalaxyLibraryDataset-%s-[%s]" % ( str( obj_id ), fname )
+ try:
+ return open( ldda.file_name )
+ except:
+ msg = 'This dataset contains no content'
+ return trans.response.send_redirect( web.url_for( controller=cntrller,
+ action='browse_library',
+ obj_id=library_id,
+ msg=util.sanitize_text( msg ),
+ messagetype='error' ) )
@web.expose
def info_template( self, trans, cntrller, library_id, response_action='library', obj_id=None, folder_id=None, ldda_id=None, **kwd ):
# Only adding a new templAte to a library or folder is currently allowed. Editing an existing template is
@@ -186,7 +266,7 @@
if cntrller == 'library_admin':
tmplt = '/admin/library/select_info_template.mako'
else:
- tmplt = '/ibrary/select_info_template.mako'
+ tmplt = '/library/select_info_template.mako'
return trans.fill_template( tmplt,
library_item_name=library_item.name,
library_item_desc=library_item_desc,
diff -r 65f28c1f1226 -r 2c0c81150dbd lib/galaxy/web/framework/base.py
--- a/lib/galaxy/web/framework/base.py Fri Oct 02 10:55:21 2009 -0400
+++ b/lib/galaxy/web/framework/base.py Fri Oct 02 11:00:30 2009 -0400
@@ -122,7 +122,7 @@
# Special key for AJAX debugging, remove to avoid confusing methods
kwargs.pop( '_', None )
try:
- body = method( trans, **kwargs )
+ body = self.call_body_method( method, trans, kwargs )
except Exception, e:
body = self.handle_controller_exception( e, trans, **kwargs )
if not body:
@@ -139,6 +139,9 @@
start_response( trans.response.wsgi_status(),
trans.response.wsgi_headeritems() )
return self.make_body_iterable( trans, body )
+
+ def call_body_method( self, method, trans, kwargs ):
+ return method( trans, **kwargs )
def make_body_iterable( self, trans, body ):
if isinstance( body, ( types.GeneratorType, list, tuple ) ):
diff -r 65f28c1f1226 -r 2c0c81150dbd lib/galaxy/web/framework/memdebug.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/lib/galaxy/web/framework/memdebug.py Fri Oct 02 11:00:30 2009 -0400
@@ -0,0 +1,26 @@
+"""
+Implementation of WebApplication that logs memory usage before and after
+calling each controller method.
+"""
+
+import pkg_resources
+pkg_resources.require( "PSI" )
+import psi.process
+
+import os
+import logging
+
+from galaxy.web.framework import WebApplication
+
+log = logging.getLogger( __name__ )
+pid = os.getpid()
+
+class MemoryLoggingWebApplication( WebApplication ):
+ def call_body_method( self, method, trans, kwargs ):
+ cls = method.im_class
+ process = psi.process.Process( pid )
+ log.debug( "before controller=%s.%s method=%s rss=%d vsz=%d", cls.__module__, cls.__name__, method.__name__, process.rss, process.vsz )
+ rval = method( trans, **kwargs )
+ process = psi.process.Process( pid )
+ log.debug( "after controller=%s.%s method=%s rss=%d vsz=%d", cls.__module__, cls.__name__, method.__name__, process.rss, process.vsz )
+ return rval
\ No newline at end of file
diff -r 65f28c1f1226 -r 2c0c81150dbd static/scripts/lrucache.js
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/static/scripts/lrucache.js Fri Oct 02 11:00:30 2009 -0400
@@ -0,0 +1,238 @@
+/*
+MIT LICENSE
+Copyright (c) 2007 Monsur Hossain (http://www.monsur.com)
+
+Permission is hereby granted, free of charge, to any person
+obtaining a copy of this software and associated documentation
+files (the "Software"), to deal in the Software without
+restriction, including without limitation the rights to use,
+copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the
+Software is furnished to do so, subject to the following
+conditions:
+
+The above copyright notice and this permission notice shall be
+included in all copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
+OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
+NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
+HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
+WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
+FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
+OTHER DEALINGS IN THE SOFTWARE.
+*/
+
+// ****************************************************************************
+// CachePriority ENUM
+// An easier way to refer to the priority of a cache item
+var CachePriority = {
+ Low: 1,
+ Normal: 2,
+ High: 4
+}
+
+// ****************************************************************************
+// Cache constructor
+// Creates a new cache object
+// INPUT: maxSize (optional) - indicates how many items the cache can hold.
+// default is -1, which means no limit on the
+// number of items.
+function Cache(maxSize) {
+ this.items = {};
+ this.count = 0;
+ if (maxSize == null)
+ maxSize = -1;
+ this.maxSize = maxSize;
+ this.fillFactor = .75;
+ this.purgeSize = Math.round(this.maxSize * this.fillFactor);
+
+ this.stats = {}
+ this.stats.hits = 0;
+ this.stats.misses = 0;
+}
+
+// ****************************************************************************
+// Cache.getItem
+// retrieves an item from the cache, returns null if the item doesn't exist
+// or it is expired.
+// INPUT: key - the key to load from the cache
+Cache.prototype.getItem = function(key) {
+
+ // retrieve the item from the cache
+ var item = this.items[key];
+
+ if (item != null) {
+ if (!this._isExpired(item)) {
+ // if the item is not expired
+ // update its last accessed date
+ item.lastAccessed = new Date().getTime();
+ } else {
+ // if the item is expired, remove it from the cache
+ this._removeItem(key);
+ item = null;
+ }
+ }
+
+ // return the item value (if it exists), or null
+ var returnVal = null;
+ if (item != null) {
+ returnVal = item.value;
+ this.stats.hits++;
+ } else {
+ this.stats.misses++;
+ }
+ return returnVal;
+}
+
+// ****************************************************************************
+// Cache.setItem
+// sets an item in the cache
+// parameters: key - the key to refer to the object
+// value - the object to cache
+// options - an optional parameter described below
+// the last parameter accepts an object which controls various caching options:
+// expirationAbsolute: the datetime when the item should expire
+// expirationSliding: an integer representing the seconds since
+// the last cache access after which the item
+// should expire
+// priority: How important it is to leave this item in the cache.
+// You can use the values CachePriority.Low, .Normal, or
+// .High, or you can just use an integer. Note that
+// placing a priority on an item does not guarantee
+// it will remain in cache. It can still be purged if
+// an expiration is hit, or if the cache is full.
+// callback: A function that gets called when the item is purged
+// from cache. The key and value of the removed item
+// are passed as parameters to the callback function.
+Cache.prototype.setItem = function(key, value, options) {
+
+ function CacheItem(k, v, o) {
+ if ((k == null) || (k == ''))
+ throw new Error("key cannot be null or empty");
+ this.key = k;
+ this.value = v;
+ if (o == null)
+ o = {};
+ if (o.expirationAbsolute != null)
+ o.expirationAbsolute = o.expirationAbsolute.getTime();
+ if (o.priority == null)
+ o.priority = CachePriority.Normal;
+ this.options = o;
+ this.lastAccessed = new Date().getTime();
+ }
+
+ // add a new cache item to the cache
+ if (this.items[key] != null)
+ this._removeItem(key);
+ this._addItem(new CacheItem(key, value, options));
+
+ // if the cache is full, purge it
+ if ((this.maxSize > 0) && (this.count > this.maxSize)) {
+ this._purge();
+ }
+}
+
+// ****************************************************************************
+// Cache.clear
+// Remove all items from the cache
+Cache.prototype.clear = function() {
+
+ // loop through each item in the cache and remove it
+ for (var key in this.items) {
+ this._removeItem(key);
+ }
+}
+
+// ****************************************************************************
+// Cache._purge (PRIVATE FUNCTION)
+// remove old elements from the cache
+Cache.prototype._purge = function() {
+
+ var tmparray = new Array();
+
+ // loop through the cache, expire items that should be expired
+ // otherwise, add the item to an array
+ for (var key in this.items) {
+ var item = this.items[key];
+ if (this._isExpired(item)) {
+ this._removeItem(key);
+ } else {
+ tmparray.push(item);
+ }
+ }
+
+ if (tmparray.length > this.purgeSize) {
+
+ // sort this array based on cache priority and the last accessed date
+ tmparray = tmparray.sort(function(a, b) {
+ if (a.options.priority != b.options.priority) {
+ return b.options.priority - a.options.priority;
+ } else {
+ return b.lastAccessed - a.lastAccessed;
+ }
+ });
+
+ // remove items from the end of the array
+ while (tmparray.length > this.purgeSize) {
+ var ritem = tmparray.pop();
+ this._removeItem(ritem.key);
+ }
+ }
+}
+
+// ****************************************************************************
+// Cache._addItem (PRIVATE FUNCTION)
+// add an item to the cache
+Cache.prototype._addItem = function(item) {
+ this.items[item.key] = item;
+ this.count++;
+}
+
+// ****************************************************************************
+// Cache._removeItem (PRIVATE FUNCTION)
+// Remove an item from the cache and call its callback function (if one was provided)
+Cache.prototype._removeItem = function(key) {
+ var item = this.items[key];
+ delete this.items[key];
+ this.count--;
+
+ // if there is a callback function, call it at the end of execution
+ if (item.options.callback != null) {
+ var callback = function() {
+ item.options.callback(item.key, item.value);
+ }
+ setTimeout(callback, 0);
+ }
+}
+
+// ****************************************************************************
+// Cache._isExpired (PRIVATE FUNCTION)
+// Returns true if the item should be expired based on its expiration options
+Cache.prototype._isExpired = function(item) {
+ var now = new Date().getTime();
+ var expired = false;
+ if ((item.options.expirationAbsolute) && (item.options.expirationAbsolute < now)) {
+ // if the absolute expiration has passed, expire the item
+ expired = true;
+ }
+ if ((expired == false) && (item.options.expirationSliding)) {
+ // if the sliding expiration has passed, expire the item
+ var lastAccess = item.lastAccessed + (item.options.expirationSliding * 1000);
+ if (lastAccess < now) {
+ expired = true;
+ }
+ }
+ return expired;
+}
+
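+// ****************************************************************************
+// Cache.toHtmlString
+// Returns an HTML listing of all cached items; mainly useful for debugging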
+Cache.prototype.toHtmlString = function() {
+ var returnStr = this.count + " item(s) in cache<br /><ul>";
+ for (var key in this.items) {
+ var item = this.items[key];
+ returnStr = returnStr + "<li>" + item.key.toString() + " = " + item.value.toString() + "</li>";
+ }
+ returnStr = returnStr + "</ul>";
+ return returnStr;
+}
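
For reference, a minimal usage sketch of the cache added above. The API names (Cache, setItem, getItem, CachePriority, stats) come straight from the source; the keys, values, and the five-item limit are illustrative only.

    // Illustrative only: exercises the Cache API defined in lrucache.js above.
    var cache = new Cache(5);  // hold at most 5 items before purging kicks in

    // Absolute expiration: the item expires at a fixed datetime.
    cache.setItem("a", "alpha", {
        expirationAbsolute: new Date(new Date().getTime() + 60 * 1000)
    });

    // Sliding expiration: the item expires 30 seconds after its last access.
    // Low priority also makes it a preferred victim when _purge runs.
    cache.setItem("b", "beta", {
        expirationSliding: 30,
        priority: CachePriority.Low,
        callback: function(key, value) { /* runs asynchronously when "b" is removed */ }
    });

    cache.getItem("a");    // "alpha" -- counts as a hit and refreshes lastAccessed
    cache.getItem("zzz");  // null -- counts as a miss in cache.stats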
diff -r 65f28c1f1226 -r 2c0c81150dbd static/scripts/packed/lrucache.js
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/static/scripts/packed/lrucache.js Fri Oct 02 11:00:30 2009 -0400
@@ -0,0 +1,1 @@
+var CachePriority={Low:1,Normal:2,High:4};function Cache(a){this.items={};this.count=0;if(a==null){a=-1}this.maxSize=a;this.fillFactor=0.75;this.purgeSize=Math.round(this.maxSize*this.fillFactor);this.stats={};this.stats.hits=0;this.stats.misses=0}Cache.prototype.getItem=function(a){var c=this.items[a];if(c!=null){if(!this._isExpired(c)){c.lastAccessed=new Date().getTime()}else{this._removeItem(a);c=null}}var b=null;if(c!=null){b=c.value;this.stats.hits++}else{this.stats.misses++}return b};Cache.prototype.setItem=function(c,d,b){function a(f,e,g){if((f==null)||(f=="")){throw new Error("key cannot be null or empty")}this.key=f;this.value=e;if(g==null){g={}}if(g.expirationAbsolute!=null){g.expirationAbsolute=g.expirationAbsolute.getTime()}if(g.priority==null){g.priority=CachePriority.Normal}this.options=g;this.lastAccessed=new Date().getTime()}if(this.items[c]!=null){this._removeItem(c)}this._addItem(new a(c,d,b));if((this.maxSize>0)&&(this.count>this.maxSize)){this._purge()}}
;Cache.prototype.clear=function(){for(var a in this.items){this._removeItem(a)}};Cache.prototype._purge=function(){var d=new Array();for(var a in this.items){var b=this.items[a];if(this._isExpired(b)){this._removeItem(a)}else{d.push(b)}}if(d.length>this.purgeSize){d=d.sort(function(f,e){if(f.options.priority!=e.options.priority){return e.options.priority-f.options.priority}else{return e.lastAccessed-f.lastAccessed}});while(d.length>this.purgeSize){var c=d.pop();this._removeItem(c.key)}}};Cache.prototype._addItem=function(a){this.items[a.key]=a;this.count++};Cache.prototype._removeItem=function(a){var b=this.items[a];delete this.items[a];this.count--;if(b.options.callback!=null){var c=function(){b.options.callback(b.key,b.value)};setTimeout(c,0)}};Cache.prototype._isExpired=function(c){var a=new Date().getTime();var b=false;if((c.options.expirationAbsolute)&&(c.options.expirationAbsolute<a)){b=true}if((b==false)&&(c.options.expirationSliding)){var d=c.lastAccessed+(c.options.
expirationSliding*1000);if(d<a){b=true}}return b};Cache.prototype.toHtmlString=function(){var b=this.count+" item(s) in cache<br /><ul>";for(var a in this.items){var c=this.items[a];b=b+"<li>"+c.key.toString()+" = "+c.value.toString()+"</li>"}b=b+"</ul>";return b};
\ No newline at end of file
diff -r 65f28c1f1226 -r 2c0c81150dbd static/scripts/packed/trackster.js
--- a/static/scripts/packed/trackster.js Fri Oct 02 10:55:21 2009 -0400
+++ b/static/scripts/packed/trackster.js Fri Oct 02 11:00:30 2009 -0400
@@ -1,1 +1,1 @@
-var DENSITY=1000;var DataCache=function(b,a){this.type=b;this.track=a;this.cache=Object()};$.extend(DataCache.prototype,{get:function(d,b){var c=this.cache;if(!(c[d]&&c[d][b])){if(!c[d]){c[d]=Object()}var a=b*DENSITY*d;var e=(b+1)*DENSITY*d;c[d][b]={state:"loading"};$.getJSON(data_url,{track_type:this.track.track_type,chrom:this.track.view.chrom,low:a,high:e,dataset_id:this.track.dataset_id},function(f){if(f=="pending"){setTimeout(fetcher,5000)}else{c[d][b]={state:"loaded",values:f}}$(document).trigger("redraw")})}return c[d][b]}});var View=function(a,b){this.chrom=a;this.tracks=[];this.max_low=0;this.max_high=b;this.low=this.max_low;this.high=this.max_high;this.length=this.max_high-this.max_low};$.extend(View.prototype,{add_track:function(a){a.view=this;this.tracks.push(a);if(a.init){a.init()}},redraw:function(){$("#overview-box").css({left:(this.low/this.length)*$("#overview-viewport").width(),width:Math.max(4,((this.high-this.low)/this.length)*$("#overview-viewport").widt
h())}).show();$("#low").text(this.low);$("#high").text(this.high);for(var a in this.tracks){this.tracks[a].draw()}$("#bottom-spacer").remove();$("#viewport").append('<div id="bottom-spacer" style="height: 200px;"></div>')},move:function(b,a){this.low=Math.max(this.max_low,Math.floor(b));this.high=Math.min(this.length,Math.ceil(a))},zoom_in:function(d,b){if(this.max_high==0){return}var c=this.high-this.low;var e=c/d/2;if(b==undefined){var a=(this.low+this.high)/2}else{var a=this.low+c*b/$(document).width()}this.low=Math.floor(a-e);this.high=Math.ceil(a+e);if(this.low<this.max_low){this.low=this.max_low;this.high=c/d}else{if(this.high>this.max_high){this.high=this.max_high;this.low=this.max_high-c/d}}if(this.high-this.low<1){this.high=this.low+1}},zoom_out:function(c){if(this.max_high==0){return}var a=(this.low+this.high)/2;var b=this.high-this.low;var d=b*c/2;this.low=Math.floor(Math.max(0,a-d));this.high=Math.ceil(Math.min(this.length,a+d))},left:function(b){var a=this.high-
this.low;var c=Math.floor(a/b);if(this.low-c<0){this.low=0;this.high=this.low+a}else{this.low-=c;this.high-=c}},right:function(b){var a=this.high-this.low;var c=Math.floor(a/b);if(this.high+c>this.length){this.high=this.length;this.low=this.high-a}else{this.low+=c;this.high+=c}}});var Track=function(a,b){this.name=a;this.parent_element=b;this.make_container()};$.extend(Track.prototype,{make_container:function(){this.header_div=$("<div class='track-header'>").text(this.name);this.content_div=$("<div class='track-content'>");this.container_div=$("<div class='track'></div>").append(this.header_div).append(this.content_div);this.parent_element.append(this.container_div)}});var TiledTrack=function(){this.last_resolution=null;this.last_w_scale=null;this.tile_cache={}};$.extend(TiledTrack.prototype,Track.prototype,{draw:function(){var i=this.view.low,c=this.view.high,e=c-i;var b=Math.pow(10,Math.ceil(Math.log(e/DENSITY)/Math.log(10)));b=Math.max(b,1);b=Math.min(b,100000);var n=$("<
div style='position: relative;'></div>");this.content_div.children(":first").remove();this.content_div.append(n);var l=this.content_div.width(),d=this.content_div.height(),o=l/e,k={},m={};if(this.last_resolution==b&&this.last_w_scale==o){k=this.tile_cache}var g;var a=Math.floor(i/b/DENSITY);while((a*1000*b)<c){if(a in k){g=k[a];var f=a*DENSITY*b;g.css({left:(f-this.view.low)*o});n.append(g)}else{g=this.draw_tile(b,a,n,o,d)}if(g){m[a]=g}a+=1}this.last_resolution=b;this.last_w_scale=o;this.tile_cache=m}});var LabelTrack=function(a){Track.call(this,null,a);this.container_div.addClass("label-track")};$.extend(LabelTrack.prototype,Track.prototype,{draw:function(){var c=this.view,d=c.high-c.low,g=Math.floor(Math.pow(10,Math.floor(Math.log(d)/Math.log(10)))),a=Math.floor(c.low/g)*g,e=this.content_div.width(),b=$("<div style='position: relative; height: 1.3em;'></div>");while(a<c.high){var f=(a-c.low)/d*e;b.append($("<div class='label'>"+a+"</div>").css({position:"absolute",left:f-1
}));a+=g}this.content_div.children(":first").remove();this.content_div.append(b)}});var LineTrack=function(c,b,a){Track.call(this,c,$("#viewport"));this.track_type="line";this.height_px=(a?a:100);this.container_div.addClass("line-track");this.content_div.css("height",this.height_px+"px");this.dataset_id=b;this.cache=new DataCache("",this)};$.extend(LineTrack.prototype,TiledTrack.prototype,{init:function(){var a=this;$.getJSON(data_url,{stats:true,track_type:a.track_type,chrom:a.view.chrom,low:null,high:null,dataset_id:a.dataset_id},function(b){if(b){if(b=="error"){a.content_div.addClass("error").text("There was an error in indexing this dataset.")}else{if(b=="no data"){a.content_div.addClass("nodata").text("No data for this chrom/contig.")}else{a.min_value=b.min;a.max_value=b.max;a.vertical_range=a.max_value-a.min_value;a.view.redraw()}}}})},draw_tile:function(d,a,o,s,p){if(!this.vertical_range){return}var k=a*DENSITY*d,r=(a+1)*DENSITY*d,c=DENSITY*d;var n=this.cache.get(d,a)
;var h;if(n.state=="loading"){h=$("<div class='loading tile'></div>")}else{h=$("<canvas class='tile'></canvas>")}h.css({position:"absolute",top:0,left:(k-this.view.low)*s,});o.append(h);if(n.state=="loading"){e=false;return null}var b=h;b.get(0).width=Math.ceil(c*s);b.get(0).height=this.height_px;var q=b.get(0).getContext("2d");var e=false;q.beginPath();var g=n.values;if(!g){return}for(var f=0;f<g.length-1;f++){var m=g[f][0]-k;var l=g[f][1];if(isNaN(l)){e=false}else{m=m*s;y_above_min=l-this.min_value;l=y_above_min/this.vertical_range*this.height_px;if(e){q.lineTo(m,l)}else{q.moveTo(m,l);e=true}}}q.stroke();return h}});var FeatureTrack=function(c,b,a){Track.call(this,c,$("#viewport"));this.track_type="feature";this.height_px=(a?a:100);this.container_div.addClass("feature-track");this.content_div.css("height",this.height_px+"px");this.dataset_id=b;this.zo_slots={};this.show_labels_scale=0.01;this.showing_labels=false};$.extend(FeatureTrack.prototype,TiledTrack.prototype,{calc_
slots:function(e){var a=new Array();var d=this.container_div.width()/(this.view.high-this.view.low);if(e){this.zi_slots=new Object()}var c=$("<canvas></canvas>").get(0).getContext("2d");for(var b in this.values){feature=this.values[b];f_start=Math.floor(Math.max(this.view.max_low,(feature.start-this.view.max_low)*d));if(e){f_start-=c.measureText(feature.name).width}f_end=Math.ceil(Math.min(this.view.max_high,(feature.end-this.view.max_low)*d));j=0;while(true){if(a[j]==undefined||a[j]<f_start){a[j]=f_end;if(e){this.zi_slots[feature.name]=j}else{this.zo_slots[feature.name]=j}break}j++}}},init:function(){var a=this;$.getJSON(data_url,{track_type:a.track_type,low:a.view.max_low,high:a.view.max_high,dataset_id:a.dataset_id,chrom:a.view.chrom},function(b){a.values=b;a.calc_slots();a.slots=a.zo_slots;a.draw()})},draw_tile:function(q,t,e,g,f){if(!this.values){return null}if(g>this.show_labels_scale&&!this.showing_labels){this.showing_labels=true;if(!this.zi_slots){this.calc_slots(tr
ue)}this.slots=this.zi_slots}else{if(g<=this.show_labels_scale&&this.showing_labels){this.showing_labels=false;this.slots=this.zo_slots}}var u=t*DENSITY*q,c=(t+1)*DENSITY*q,b=DENSITY*q;var k=this.view,m=k.high-k.low,o=Math.ceil(b*g),h=new Array(),n=200,l=$("<canvas class='tile'></canvas>");l.css({position:"absolute",top:0,left:(u-this.view.low)*g,});l.get(0).width=o;l.get(0).height=n;var p=l.get(0).getContext("2d");var r=0;for(var s in this.values){feature=this.values[s];if(feature.start<=c&&feature.end>=u){f_start=Math.floor(Math.max(0,(feature.start-u)*g));f_end=Math.ceil(Math.min(o,(feature.end-u)*g));p.fillStyle="#000";p.fillRect(f_start,this.slots[feature.name]*10+5,f_end-f_start,1);if(this.showing_labels&&p.fillText){p.font="10px monospace";p.textAlign="right";p.fillText(feature.name,f_start,this.slots[feature.name]*10+8)}if(feature.exon_start&&feature.exon_end){var d=Math.floor(Math.max(0,(feature.exon_start-u)*g));var w=Math.ceil(Math.min(o,(feature.exon_end-u)*g))}f
or(var s in feature.blocks){block=feature.blocks[s];block_start=Math.floor(Math.max(0,(block[0]-u)*g));block_end=Math.ceil(Math.min(o,(block[1]-u)*g));var a=3,v=4;if(d&&block_start>=d&&block_end<=w){a=5,v=3}p.fillRect(d,this.slots[feature.name]*10+v,block_end-block_start,a)}r++}}e.append(l);return l},});
\ No newline at end of file
+var DENSITY=1000,DATA_ERROR="There was an error in indexing this dataset.",DATA_NONE="No data for this chrom/contig.";var DataCache=function(b,a){this.type=b;this.track=a;this.cache=Object()};$.extend(DataCache.prototype,{get:function(d,b){var c=this.cache;if(!(c[d]&&c[d][b])){if(!c[d]){c[d]=Object()}var a=b*DENSITY*d;var e=(b+1)*DENSITY*d;c[d][b]={state:"loading"};$.getJSON(data_url,{track_type:this.track.track_type,chrom:this.track.view.chrom,low:a,high:e,dataset_id:this.track.dataset_id},function(f){if(f=="pending"){setTimeout(fetcher,5000)}else{c[d][b]={state:"loaded",values:f}}$(document).trigger("redraw")})}return c[d][b]}});var View=function(a,b){this.chrom=a;this.tracks=[];this.max_low=0;this.max_high=b;this.low=this.max_low;this.high=this.max_high;this.length=this.max_high-this.max_low};$.extend(View.prototype,{add_track:function(a){a.view=this;this.tracks.push(a);if(a.init){a.init()}},redraw:function(){$("#overview-box").css({left:(this.low/this.length)*$("#overvie
w-viewport").width(),width:Math.max(4,((this.high-this.low)/this.length)*$("#overview-viewport").width())}).show();$("#low").text(this.low);$("#high").text(this.high);for(var a in this.tracks){this.tracks[a].draw()}$("#bottom-spacer").remove();$("#viewport").append('<div id="bottom-spacer" style="height: 200px;"></div>')},move:function(b,a){this.low=Math.max(this.max_low,Math.floor(b));this.high=Math.min(this.length,Math.ceil(a))},zoom_in:function(d,b){if(this.max_high==0){return}var c=this.high-this.low;var e=c/d/2;if(b==undefined){var a=(this.low+this.high)/2}else{var a=this.low+c*b/$(document).width()}this.low=Math.floor(a-e);this.high=Math.ceil(a+e);if(this.low<this.max_low){this.low=this.max_low;this.high=c/d}else{if(this.high>this.max_high){this.high=this.max_high;this.low=this.max_high-c/d}}if(this.high-this.low<1){this.high=this.low+1}},zoom_out:function(c){if(this.max_high==0){return}var a=(this.low+this.high)/2;var b=this.high-this.low;var d=b*c/2;this.low=Math.flo
or(Math.max(0,a-d));this.high=Math.ceil(Math.min(this.length,a+d))},left:function(b){var a=this.high-this.low;var c=Math.floor(a/b);if(this.low-c<0){this.low=0;this.high=this.low+a}else{this.low-=c;this.high-=c}},right:function(b){var a=this.high-this.low;var c=Math.floor(a/b);if(this.high+c>this.length){this.high=this.length;this.low=this.high-a}else{this.low+=c;this.high+=c}}});var Track=function(a,b){this.name=a;this.parent_element=b;this.make_container()};$.extend(Track.prototype,{make_container:function(){this.header_div=$("<div class='track-header'>").text(this.name);this.content_div=$("<div class='track-content'>");this.container_div=$("<div class='track'></div>").append(this.header_div).append(this.content_div);this.parent_element.append(this.container_div)}});var TiledTrack=function(){this.last_resolution=null;this.last_w_scale=null;this.tile_cache={}};$.extend(TiledTrack.prototype,Track.prototype,{draw:function(){var i=this.view.low,c=this.view.high,e=c-i;var b=Mat
h.pow(10,Math.ceil(Math.log(e/DENSITY)/Math.log(10)));b=Math.max(b,1);b=Math.min(b,100000);var m=$("<div style='position: relative;'></div>");this.content_div.children(":first").remove();this.content_div.append(m);var k=this.content_div.width(),d=this.content_div.height(),n=k/e,j={},l={};if(this.last_resolution==b&&this.last_w_scale==n){j=this.tile_cache}var g;var a=Math.floor(i/b/DENSITY);while((a*1000*b)<c){if(a in j){g=j[a];var f=a*DENSITY*b;g.css({left:(f-this.view.low)*n});m.append(g)}else{g=this.draw_tile(b,a,m,n,d)}if(g){l[a]=g}a+=1}this.last_resolution=b;this.last_w_scale=n;this.tile_cache=l}});var LabelTrack=function(a){Track.call(this,null,a);this.container_div.addClass("label-track")};$.extend(LabelTrack.prototype,Track.prototype,{draw:function(){var c=this.view,d=c.high-c.low,g=Math.floor(Math.pow(10,Math.floor(Math.log(d)/Math.log(10)))),a=Math.floor(c.low/g)*g,e=this.content_div.width(),b=$("<div style='position: relative; height: 1.3em;'></div>");while(a<c.hig
h){var f=(a-c.low)/d*e;b.append($("<div class='label'>"+a+"</div>").css({position:"absolute",left:f-1}));a+=g}this.content_div.children(":first").remove();this.content_div.append(b)}});var LineTrack=function(c,b,a){Track.call(this,c,$("#viewport"));this.track_type="line";this.height_px=(a?a:100);this.container_div.addClass("line-track");this.content_div.css("height",this.height_px+"px");this.dataset_id=b;this.cache=new DataCache("",this)};$.extend(LineTrack.prototype,TiledTrack.prototype,{init:function(){var a=this;$.getJSON(data_url,{stats:true,track_type:a.track_type,chrom:a.view.chrom,low:null,high:null,dataset_id:a.dataset_id},function(b){if(!b||b=="error"){a.content_div.addClass("error").text(DATA_ERROR)}else{if(b=="no data"){a.content_div.addClass("nodata").text(DATA_NONE)}else{a.min_value=b.min;a.max_value=b.max;a.vertical_range=a.max_value-a.min_value;a.view.redraw()}}})},draw_tile:function(d,a,n,r,o){if(!this.vertical_range){return}var j=a*DENSITY*d,q=(a+1)*DENSITY*
d,c=DENSITY*d;var m=this.cache.get(d,a);var h;if(m.state=="loading"){h=$("<div class='loading tile'></div>")}else{h=$("<canvas class='tile'></canvas>")}h.css({position:"absolute",top:0,left:(j-this.view.low)*r,});n.append(h);if(m.state=="loading"){e=false;return null}var b=h;b.get(0).width=Math.ceil(c*r);b.get(0).height=this.height_px;var p=b.get(0).getContext("2d");var e=false;p.beginPath();var g=m.values;if(!g){return}for(var f=0;f<g.length-1;f++){var l=g[f][0]-j;var k=g[f][1];if(isNaN(k)){e=false}else{l=l*r;y_above_min=k-this.min_value;k=y_above_min/this.vertical_range*this.height_px;if(e){p.lineTo(l,k)}else{p.moveTo(l,k);e=true}}}p.stroke();return h}});var FeatureTrack=function(c,b,a){Track.call(this,c,$("#viewport"));this.track_type="feature";this.height_px=(a?a:100);this.container_div.addClass("feature-track");this.content_div.css("height",this.height_px+"px");this.dataset_id=b;this.zo_slots={};this.show_labels_scale=0.01;this.showing_labels=false;this.vertical_gap=10}
;$.extend(FeatureTrack.prototype,TiledTrack.prototype,{calc_slots:function(h){var b=[];var a=this.container_div.width()/(this.view.high-this.view.low);if(h){this.zi_slots={}}var g=$("<canvas></canvas>").get(0).getContext("2d");for(var d in this.values){var k=this.values[d];var e=Math.floor(Math.max(this.view.max_low,(k.start-this.view.max_low)*a));if(h){e-=g.measureText(k.name).width;e-=10}var f=Math.ceil(Math.min(this.view.max_high,(k.end-this.view.max_low)*a));var c=0;while(true){if(b[c]==undefined||b[c]<e){b[c]=f;if(h){this.zi_slots[k.name]=c}else{this.zo_slots[k.name]=c}break}c++}}this.height_px=b.length*this.vertical_gap+15;this.content_div.css("height",this.height_px+"px")},init:function(){var a=this;$.getJSON(data_url,{track_type:a.track_type,low:a.view.max_low,high:a.view.max_high,dataset_id:a.dataset_id,chrom:a.view.chrom},function(b){if(b=="error"){a.content_div.addClass("error").text(DATA_ERROR)}else{if(b.length==0||b=="no data"){a.content_div.addClass("nodata").t
ext(DATA_NONE)}else{a.values=b;a.calc_slots();a.slots=a.zo_slots;a.draw()}}})},draw_tile:function(q,t,e,g,f){if(!this.values){return null}if(g>this.show_labels_scale&&!this.showing_labels){this.showing_labels=true;if(!this.zi_slots){this.calc_slots(true)}this.slots=this.zi_slots}else{if(g<=this.show_labels_scale&&this.showing_labels){this.showing_labels=false;this.slots=this.zo_slots}}var u=t*DENSITY*q,c=(t+1)*DENSITY*q,b=DENSITY*q;var k=this.view,m=k.high-k.low,o=Math.ceil(b*g),h=new Array(),n=this.height_px,l=$("<canvas class='tile'></canvas>");l.css({position:"absolute",top:0,left:(u-this.view.low)*g,});l.get(0).width=o;l.get(0).height=n;var p=l.get(0).getContext("2d");var r=0;for(var s in this.values){feature=this.values[s];if(feature.start<=c&&feature.end>=u){f_start=Math.floor(Math.max(0,(feature.start-u)*g));f_end=Math.ceil(Math.min(o,(feature.end-u)*g));p.fillStyle="#000";p.fillRect(f_start,this.slots[feature.name]*this.vertical_gap+5,f_end-f_start,1);if(this.showing
_labels&&p.fillText){p.font="10px monospace";p.textAlign="right";p.fillText(feature.name,f_start,this.slots[feature.name]*10+8)}if(feature.exon_start&&feature.exon_end){var d=Math.floor(Math.max(0,(feature.exon_start-u)*g));var w=Math.ceil(Math.min(o,(feature.exon_end-u)*g))}for(var s in feature.blocks){block=feature.blocks[s];block_start=Math.floor(Math.max(0,(block[0]-u)*g));block_end=Math.ceil(Math.min(o,(block[1]-u)*g));var a=3,v=4;if(d&&block_start>=d&&block_end<=w){a=5,v=3}p.fillRect(d,this.slots[feature.name]*this.vertical_gap+v,block_end-block_start,a)}r++}}e.append(l);return l},});
\ No newline at end of file
diff -r 65f28c1f1226 -r 2c0c81150dbd static/scripts/trackster.js
--- a/static/scripts/trackster.js Fri Oct 02 10:55:21 2009 -0400
+++ b/static/scripts/trackster.js Fri Oct 02 11:00:30 2009 -0400
@@ -2,7 +2,9 @@
2009, James Taylor, Kanwei Li
*/
-var DENSITY = 1000;
+var DENSITY = 1000,
+ DATA_ERROR = "There was an error in indexing this dataset.",
+ DATA_NONE = "No data for this chrom/contig.";
var DataCache = function( type, track ) {
this.type = type;
@@ -239,18 +241,16 @@
var track = this;
$.getJSON( data_url, { stats: true, track_type: track.track_type, chrom: track.view.chrom,
low: null, high: null, dataset_id: track.dataset_id }, function ( data ) {
- if (data) {
- if (data == "error") {
- track.content_div.addClass("error").text("There was an error in indexing this dataset.");
- } else if (data == "no data") {
- // console.log(track.content_div);
- track.content_div.addClass("nodata").text("No data for this chrom/contig.");
- } else {
- track.min_value = data['min'];
- track.max_value = data['max'];
- track.vertical_range = track.max_value - track.min_value;
- track.view.redraw();
- }
+ if (!data || data == "error") {
+ track.content_div.addClass("error").text(DATA_ERROR);
+ } else if (data == "no data") {
+ // console.log(track.content_div);
+ track.content_div.addClass("nodata").text(DATA_NONE);
+ } else {
+ track.min_value = data['min'];
+ track.max_value = data['max'];
+ track.vertical_range = track.max_value - track.min_value;
+ track.view.redraw();
}
});
},
@@ -321,27 +321,28 @@
this.zo_slots = {};
this.show_labels_scale = 0.01;
this.showing_labels = false;
+ this.vertical_gap = 10;
};
$.extend( FeatureTrack.prototype, TiledTrack.prototype, {
-
calc_slots: function( include_labels ) {
// console.log("num vals: " + this.values.length);
- var end_ary = new Array();
+ var end_ary = [];
var scale = this.container_div.width() / (this.view.high - this.view.low);
- // console.log(scale);
- if (include_labels) this.zi_slots = new Object();
+ // console.log(scale, this.view.high, this.view.low);
+ if (include_labels) this.zi_slots = {};
var dummy_canvas = $("<canvas></canvas>").get(0).getContext("2d");
for (var i in this.values) {
- feature = this.values[i];
- f_start = Math.floor( Math.max(this.view.max_low, (feature.start - this.view.max_low) * scale) );
+ var feature = this.values[i];
+ var f_start = Math.floor( Math.max(this.view.max_low, (feature.start - this.view.max_low) * scale) );
if (include_labels) {
f_start -= dummy_canvas.measureText(feature.name).width;
+ f_start -= 10; // Spacing between text and line
}
- f_end = Math.ceil( Math.min(this.view.max_high, (feature.end - this.view.max_low) * scale) );
+ var f_end = Math.ceil( Math.min(this.view.max_high, (feature.end - this.view.max_low) * scale) );
// if (include_labels) { console.log(f_start, f_end); }
- j = 0;
+ var j = 0;
while (true) {
if (end_ary[j] == undefined || end_ary[j] < f_start) {
end_ary[j] = f_end;
@@ -355,17 +356,26 @@
j++;
}
}
+ this.height_px = end_ary.length * this.vertical_gap + 15;
+ this.content_div.css( "height", this.height_px + "px" );
},
init: function() {
var track = this;
$.getJSON( data_url, { track_type: track.track_type, low: track.view.max_low, high: track.view.max_high,
dataset_id: track.dataset_id, chrom: track.view.chrom }, function ( data ) {
- track.values = data;
- track.calc_slots();
- track.slots = track.zo_slots;
- // console.log(track.zo_slots);
- track.draw();
+ if (data == "error") {
+ track.content_div.addClass("error").text(DATA_ERROR);
+ } else if (data.length == 0 || data == "no data") {
+ // console.log(track.content_div);
+ track.content_div.addClass("nodata").text(DATA_NONE);
+ } else {
+ track.values = data;
+ track.calc_slots();
+ track.slots = track.zo_slots;
+ // console.log(track.zo_slots);
+ track.draw();
+ }
});
},
@@ -391,7 +401,7 @@
range = view.high - view.low,
width = Math.ceil( tile_length * w_scale ),
slots = new Array(),
- height = 200,
+ height = this.height_px,
new_canvas = $("<canvas class='tile'></canvas>");
new_canvas.css({
@@ -412,7 +422,7 @@
f_end = Math.ceil( Math.min(width, (feature.end - tile_low) * w_scale) );
// console.log(feature.start, feature.end, f_start, f_end, j);
ctx.fillStyle = "#000";
- ctx.fillRect(f_start, this.slots[feature.name] * 10 + 5, f_end - f_start, 1);
+ ctx.fillRect(f_start, this.slots[feature.name] * this.vertical_gap + 5, f_end - f_start, 1);
if (this.showing_labels && ctx.fillText) {
ctx.font = "10px monospace";
@@ -435,7 +445,7 @@
if (exon_start && block_start >= exon_start && block_end <= exon_end) {
thickness = 5, y_start = 3;
}
- ctx.fillRect(exon_start, this.slots[feature.name] * 10 + y_start, block_end - block_start, thickness);
+ ctx.fillRect(exon_start, this.slots[feature.name] * this.vertical_gap + y_start, block_end - block_start, thickness);
// console.log(block_start, block_end);
}
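
To make the height change above concrete: calc_slots packs features into horizontal rows greedily, and the new height_px assignment sizes the track by the number of rows actually used, replacing the fixed 200px tile height. Distilled into a standalone sketch (names simplified; the real code additionally widens label-bearing features by their measured text width and records slots per feature name):

    // Greedy interval packing as in calc_slots: each feature goes into the
    // first row whose right-most end lies left of the feature's start.
    function assignSlots(features) {  // features: [{start, end}, ...] in pixel coords
        var end_ary = [], slots = [];
        for (var i = 0; i < features.length; i++) {
            var j = 0;
            while (true) {
                if (end_ary[j] == undefined || end_ary[j] < features[i].start) {
                    end_ary[j] = features[i].end;  // row j now extends to this feature's end
                    slots[i] = j;                  // draw feature i in row j
                    break;
                }
                j++;
            }
        }
        return { slots: slots, rows: end_ary.length };
    }

With vertical_gap = 10, a track whose features need N rows now gets height_px = N * 10 + 15, so dense feature tracks grow to fit rather than overflowing the tile.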
diff -r 65f28c1f1226 -r 2c0c81150dbd static/trackster.css
--- a/static/trackster.css Fri Oct 02 10:55:21 2009 -0400
+++ b/static/trackster.css Fri Oct 02 11:00:30 2009 -0400
@@ -86,7 +86,7 @@
.track-content.error {
text-align: center;
padding-top: 30px;
- background-color: #600;
+ background-color: #ECB4AF;
}
.track-content.nodata {
text-align: center;
diff -r 65f28c1f1226 -r 2c0c81150dbd templates/admin/library/browse_library.mako
--- a/templates/admin/library/browse_library.mako Fri Oct 02 10:55:21 2009 -0400
+++ b/templates/admin/library/browse_library.mako Fri Oct 02 11:00:30 2009 -0400
@@ -73,7 +73,7 @@
// Make ajax call
$.ajax( {
type: "POST",
- url: "${h.url_for( controller='library_dataset', action='library_item_updates' )}",
+ url: "${h.url_for( controller='library_common', action='library_item_updates' )}",
dataType: "json",
data: { ids: ids.join( "," ), states: states.join( "," ) },
success : function ( data ) {
@@ -137,7 +137,7 @@
<a class="action-button" href="${h.url_for( controller='library_admin', action='ldda_manage_permissions', library_id=library.id, folder_id=folder.id, obj_id=ldda.id, permissions=True )}">Edit this dataset's permissions</a>
<a class="action-button" href="${h.url_for( controller='library_admin', action='upload_library_dataset', library_id=library.id, folder_id=folder.id, replace_id=library_dataset.id )}">Upload a new version of this dataset</a>
%if ldda.has_data:
- <a class="action-button" href="${h.url_for( controller='library_admin', action='download_dataset_from_folder', obj_id=ldda.id, library_id=library.id )}">Download this dataset</a>
+ <a class="action-button" href="${h.url_for( controller='library_admin', action='download_dataset_from_folder', cntrller='library_admin', obj_id=ldda.id, library_id=library.id )}">Download this dataset</a>
%endif
<a class="action-button" confirm="Click OK to delete dataset '${ldda.name}'." href="${h.url_for( controller='library_admin', action='delete_library_item', library_id=library.id, library_item_id=library_dataset.id, library_item_type='library_dataset' )}">Delete this dataset</a>
</div>
diff -r 65f28c1f1226 -r 2c0c81150dbd templates/admin/library/ldda_info.mako
--- a/templates/admin/library/ldda_info.mako Fri Oct 02 10:55:21 2009 -0400
+++ b/templates/admin/library/ldda_info.mako Fri Oct 02 11:00:30 2009 -0400
@@ -47,7 +47,7 @@
<a class="action-button" href="${h.url_for( controller='library_admin', action='upload_library_dataset', library_id=library_id, folder_id=ldda.library_dataset.folder.id, replace_id=ldda.library_dataset.id )}">Upload a new version of this dataset</a>
%endif
%if ldda.has_data:
- <a class="action-button" href="${h.url_for( controller='library_admin', action='download_dataset_from_folder', obj_id=ldda.id, library_id=library_id )}">Download this dataset</a>
+ <a class="action-button" href="${h.url_for( controller='library_admin', action='download_dataset_from_folder', cntrller='library_admin', obj_id=ldda.id, library_id=library_id )}">Download this dataset</a>
%endif
%if not library.deleted and not ldda.library_dataset.folder.deleted and not ldda.library_dataset.deleted:
<a class="action-button" confirm="Click OK to remove dataset '${ldda.name}'?" href="${h.url_for( controller='library_admin', action='delete_library_item', library_id=library_id, folder_id=ldda.library_dataset.folder.id, library_item_id=ldda.library_dataset.id, library_item_type='library_dataset' )}">Delete this dataset</a>
diff -r 65f28c1f1226 -r 2c0c81150dbd templates/admin/library/upload.mako
--- a/templates/admin/library/upload.mako Fri Oct 02 10:55:21 2009 -0400
+++ b/templates/admin/library/upload.mako Fri Oct 02 11:00:30 2009 -0400
@@ -12,14 +12,20 @@
%>
<b>Create new data library datasets</b>
-<a id="upload-librarydataset--popup" class="popup-arrow" style="display: none;">▼</a>
-<div popupmenu="upload-librarydataset--popup">
- <a class="action-button" href="${h.url_for( controller='library_admin', action='upload_library_dataset', library_id=library_id, folder_id=folder_id, replace_id=replace_id, upload_option='upload_file' )}">Upload files</a>
- %if trans.app.config.library_import_dir and os.path.exists( trans.app.config.library_import_dir ):
- <a class="action-button" href="${h.url_for( controller='library_admin', action='upload_library_dataset', library_id=library_id, folder_id=folder_id, replace_id=replace_id, upload_option='upload_directory' )}">Upload directory of files</a>
- %endif
- <a class="action-button" href="${h.url_for( controller='library_admin', action='upload_library_dataset', library_id=library_id, folder_id=folder_id, replace_id=replace_id, upload_option='import_from_history' )}">Import datasets from your current history</a>
-</div>
+%if replace_dataset in [ None, 'None' ]:
+ ## Don't allow multiple datasets to be uploaded when replacing a dataset with a new version
+ <a id="upload-librarydataset--popup" class="popup-arrow" style="display: none;">▼</a>
+ <div popupmenu="upload-librarydataset--popup">
+ <a class="action-button" href="${h.url_for( controller='library_admin', action='upload_library_dataset', library_id=library_id, folder_id=folder_id, replace_id=replace_id, upload_option='upload_file' )}">Upload files</a>
+ %if trans.app.config.library_import_dir and os.path.exists( trans.app.config.library_import_dir ):
+ <a class="action-button" href="${h.url_for( controller='library_admin', action='upload_library_dataset', library_id=library_id, folder_id=folder_id, replace_id=replace_id, upload_option='upload_directory' )}">Upload directory of files</a>
+ %endif
+ %if trans.app.config.allow_library_path_paste:
+ <a class="action-button" href="${h.url_for( controller='library_admin', action='upload_library_dataset', library_id=library_id, folder_id=folder_id, replace_id=replace_id, upload_option='upload_paths' )}">Upload files from filesystem paths</a>
+ %endif
+ <a class="action-button" href="${h.url_for( controller='library_admin', action='upload_library_dataset', library_id=library_id, folder_id=folder_id, replace_id=replace_id, upload_option='import_from_history' )}">Import datasets from your current history</a>
+ </div>
+%endif
<br/><br/>
<ul class="manage-table-actions">
<li>
diff -r 65f28c1f1226 -r 2c0c81150dbd templates/base_panels.mako
--- a/templates/base_panels.mako Fri Oct 02 10:55:21 2009 -0400
+++ b/templates/base_panels.mako Fri Oct 02 11:00:30 2009 -0400
@@ -113,7 +113,7 @@
$(this).ajaxSubmit( { iframe: true } );
if ( $(this).find("input[name='folder_id']").val() != undefined ) {
var library_id = $(this).find("input[name='library_id']").val();
- if ( location.pathname.indexOf( 'library_admin' ) ) {
+ if ( location.pathname.indexOf( 'library_admin' ) != -1 ) {
$("iframe#galaxy_main").attr("src","${h.url_for( controller='library_admin', action='browse_library' )}?obj_id=" + library_id + "&created_ldda_ids=" + async_datasets);
} else {
$("iframe#galaxy_main").attr("src","${h.url_for( controller='library', action='browse_library' )}?obj_id=" + library_id + "&created_ldda_ids=" + async_datasets);
diff -r 65f28c1f1226 -r 2c0c81150dbd templates/library/browse_library.mako
--- a/templates/library/browse_library.mako Fri Oct 02 10:55:21 2009 -0400
+++ b/templates/library/browse_library.mako Fri Oct 02 11:00:30 2009 -0400
@@ -105,7 +105,7 @@
// Make ajax call
$.ajax( {
type: "POST",
- url: "${h.url_for( controller='library_dataset', action='library_item_updates' )}",
+ url: "${h.url_for( controller='library_common', action='library_item_updates' )}",
dataType: "json",
data: { ids: ids.join( "," ), states: states.join( "," ) },
success : function ( data ) {
@@ -178,7 +178,7 @@
%endif
%if ldda.has_data:
<a class="action-button" href="${h.url_for( controller='library', action='datasets', library_id=library.id, ldda_ids=str( ldda.id ), do_action='add' )}">Import this dataset into your current history</a>
- <a class="action-button" href="${h.url_for( controller='library', action='download_dataset_from_folder', obj_id=ldda.id, library_id=library.id )}">Download this dataset</a>
+ <a class="action-button" href="${h.url_for( controller='library', action='download_dataset_from_folder', cntrller='library', obj_id=ldda.id, library_id=library.id )}">Download this dataset</a>
%endif
</div>
</td>
diff -r 65f28c1f1226 -r 2c0c81150dbd templates/library/ldda_info.mako
--- a/templates/library/ldda_info.mako Fri Oct 02 10:55:21 2009 -0400
+++ b/templates/library/ldda_info.mako Fri Oct 02 11:00:30 2009 -0400
@@ -53,7 +53,7 @@
%endif
%if ldda.has_data:
<a class="action-button" href="${h.url_for( controller='library', action='datasets', library_id=library_id, ldda_ids=str( ldda.id ), do_action='add' )}">Import this dataset into your current history</a>
- <a class="action-button" href="${h.url_for( controller='library', action='download_dataset_from_folder', obj_id=ldda.id, library_id=library_id )}">Download this dataset</a>
+ <a class="action-button" href="${h.url_for( controller='library', action='download_dataset_from_folder', cntrller='library', obj_id=ldda.id, library_id=library_id )}">Download this dataset</a>
%endif
</div>
</div>
diff -r 65f28c1f1226 -r 2c0c81150dbd templates/library/library_dataset_common.mako
--- a/templates/library/library_dataset_common.mako Fri Oct 02 10:55:21 2009 -0400
+++ b/templates/library/library_dataset_common.mako Fri Oct 02 11:00:30 2009 -0400
@@ -1,11 +1,13 @@
<%def name="render_upload_form( controller, upload_option, action, library_id, folder_id, replace_dataset, file_formats, dbkeys, roles, history )">
<% import os, os.path %>
- %if upload_option in [ 'upload_file', 'upload_directory' ]:
+ %if upload_option in [ 'upload_file', 'upload_directory', 'upload_paths' ]:
<div class="toolForm" id="upload_library_dataset">
- %if upload_option == 'upload_file':
+ %if upload_option == 'upload_directory':
+ <div class="toolFormTitle">Upload a directory of files</div>
+ %elif upload_option == 'upload_paths':
+ <div class="toolFormTitle">Upload files from filesystem paths</div>
+ %else:
<div class="toolFormTitle">Upload files</div>
- %else:
- <div class="toolFormTitle">Upload a directory of files</div>
%endif
<div class="toolFormBody">
<form name="upload_library_dataset" action="${action}" enctype="multipart/form-data" method="post">
@@ -103,6 +105,44 @@
%endif
</div>
<div style="clear: both"></div>
+ </div>
+ %elif upload_option == 'upload_paths':
+ <div class="form-row">
+ <label>Paths to upload</label>
+ <div class="form-row-input">
+ <textarea name="filesystem_paths" rows="10" cols="35"></textarea>
+ </div>
+ <div class="toolParamHelp" style="clear: both;">
+ Upload all files pasted in the box. The (recursive) contents of any pasted directories will be added as well.
+ </div>
+ </div>
+ <div class="form-row">
+ <label>Preserve directory structure?</label>
+ <div class="form-row-input">
+ <input type="checkbox" name="dont_preserve_dirs" value="No"/>No
+ </div>
+ <div class="toolParamHelp" style="clear: both;">
+ If checked, all files in subdirectories on the filesystem will be placed at the top level of the folder, instead of into subfolders.
+ </div>
+ </div>
+ %endif
+ %if upload_option in ( 'upload_directory', 'upload_paths' ):
+ <div class="form-row">
+ <label>Copy data into Galaxy?</label>
+ <div class="form-row-input">
+ <input type="checkbox" name="link_data_only" value="No"/>No
+ </div>
+ <div class="toolParamHelp" style="clear: both;">
+ Normally data uploaded with this tool is copied into Galaxy's "files" directory
+ so any later changes to the data will not affect Galaxy. However, this may not
+ be desired (especially for large NGS datasets), so use of this option will
+ force Galaxy to always read the data from its original path.
+ %if upload_option == 'upload_directory':
+ Any symlinks encountered in the upload directory will be dereferenced once -
+ that is, Galaxy will point directly to the file that is linked, but no other
+ symlinks further down the line will be dereferenced.
+ %endif
+ </div>
</div>
%endif
<div class="form-row">
diff -r 65f28c1f1226 -r 2c0c81150dbd templates/library/upload.mako
--- a/templates/library/upload.mako Fri Oct 02 10:55:21 2009 -0400
+++ b/templates/library/upload.mako Fri Oct 02 11:00:30 2009 -0400
@@ -12,14 +12,17 @@
%>
<b>Create new data library datasets</b>
-<a id="upload-librarydataset--popup" class="popup-arrow" style="display: none;">▼</a>
-<div popupmenu="upload-librarydataset--popup">
- <a class="action-button" href="${h.url_for( controller='library', action='upload_library_dataset', library_id=library_id, folder_id=folder_id, replace_id=replace_id, upload_option='upload_file' )}">Upload files</a>
- %if trans.app.config.user_library_import_dir and os.path.exists( os.path.join( trans.app.config.user_library_import_dir, trans.user.email ) ):
- <a class="action-button" href="${h.url_for( controller='library', action='upload_library_dataset', library_id=library_id, folder_id=folder_id, replace_id=replace_id, upload_option='upload_directory' )}">Upload directory of files</a>
- %endif
- <a class="action-button" href="${h.url_for( controller='library', action='upload_library_dataset', library_id=library_id, folder_id=folder_id, replace_id=replace_id, upload_option='import_from_history' )}">Import datasets from your current history</a>
-</div>
+%if replace_dataset in [ None, 'None' ]:
+ ## Don't allow multiple datasets to be uploaded when replacing a dataset with a new version
+ <a id="upload-librarydataset--popup" class="popup-arrow" style="display: none;">▼</a>
+ <div popupmenu="upload-librarydataset--popup">
+ <a class="action-button" href="${h.url_for( controller='library', action='upload_library_dataset', library_id=library_id, folder_id=folder_id, replace_id=replace_id, upload_option='upload_file' )}">Upload files</a>
+ %if trans.app.config.user_library_import_dir and os.path.exists( os.path.join( trans.app.config.user_library_import_dir, trans.user.email ) ):
+ <a class="action-button" href="${h.url_for( controller='library', action='upload_library_dataset', library_id=library_id, folder_id=folder_id, replace_id=replace_id, upload_option='upload_directory' )}">Upload directory of files</a>
+ %endif
+ <a class="action-button" href="${h.url_for( controller='library', action='upload_library_dataset', library_id=library_id, folder_id=folder_id, replace_id=replace_id, upload_option='import_from_history' )}">Import datasets from your current history</a>
+ </div>
+%endif
<br/><br/>
<ul class="manage-table-actions">
<li>
diff -r 65f28c1f1226 -r 2c0c81150dbd templates/workflow/run_complete.mako
--- a/templates/workflow/run_complete.mako Fri Oct 02 10:55:21 2009 -0400
+++ b/templates/workflow/run_complete.mako Fri Oct 02 11:00:30 2009 -0400
@@ -14,7 +14,7 @@
<body>
<div class="donemessage">
<p>
- Sucesfully ran workflow "${workflow.name}", the following datasets have
+ Successfully ran workflow "${workflow.name}", the following datasets have
been added to the queue.
</p>
@@ -28,4 +28,4 @@
</div>
</body>
-</html>
\ No newline at end of file
+</html>
diff -r 65f28c1f1226 -r 2c0c81150dbd test/base/twilltestcase.py
--- a/test/base/twilltestcase.py Fri Oct 02 10:55:21 2009 -0400
+++ b/test/base/twilltestcase.py Fri Oct 02 11:00:30 2009 -0400
@@ -1144,14 +1144,15 @@
tc.fv( "1", "2", description ) # form field 1 is the field named name...
tc.submit( "create_library_button" )
self.home()
- def set_library_permissions( self, library_id, library_name, role_id, permissions_in, permissions_out ):
+ def set_library_permissions( self, library_id, library_name, role_ids_str, permissions_in, permissions_out ):
+ # role_ids_str must be a comma-separated string of role ids
url = "library_admin/library?obj_id=%s&permissions=True&update_roles_button=Save" % ( library_id )
for po in permissions_out:
key = '%s_out' % po
- url ="%s&%s=%s" % ( url, key, str( role_id ) )
+ url ="%s&%s=%s" % ( url, key, role_ids_str )
for pi in permissions_in:
key = '%s_in' % pi
- url ="%s&%s=%s" % ( url, key, str( role_id ) )
+ url ="%s&%s=%s" % ( url, key, role_ids_str )
self.home()
self.visit_url( "%s/%s" % ( self.url, url ) )
check_str = "Permissions updated for library '%s'" % library_name
@@ -1208,12 +1209,12 @@
check_str = "Information template '%s' has been updated" % name
self.check_page_for_string( check_str )
self.home()
- def add_folder_info_template( self, cntrller, library_id, library_name, folder_id, folder_name, num_fields='2',
+ def add_folder_info_template( self, controller, cntrller, library_id, library_name, folder_id, folder_name, num_fields='2',
name='Folder Template 1', ele_name_0='Fu', ele_help_0='', ele_name_1='Bar', ele_help_1='' ):
"""Add a new info template to a folder"""
self.home()
- url = "%s/library_admin/info_template?cntrller=%s&library_id=%s&response_action='folder'&create_info_template_button=Go" % \
- ( self.url, cntrller, library_id, folder_id )
+ url = "%s/%s/info_template?cntrller=%s&library_id=%s&response_action='folder'&create_info_template_button=Go" % \
+ ( self.url, controller, cntrller, library_id, folder_id )
self.home()
self.visit_url( url )
check_str = "Create a new information template for folder '%s'" % folder_name
@@ -1228,14 +1229,27 @@
tc.fv( '1', 'new_element_description_1', ele_help_1.replace( '+', ' ' ) )
tc.submit( 'new_info_template_button' )
self.home()
- def add_folder( self, library_id, folder_id, name='Folder One', description='This is Folder One' ):
+ def add_folder( self, controller, library_id, folder_id, name='Folder One', description='This is Folder One' ):
"""Create a new folder"""
self.home()
- self.visit_url( "%s/library_admin/folder?library_id=%s&obj_id=%s&new=True" % ( self.url, library_id, folder_id ) )
+ self.visit_url( "%s/%s/folder?library_id=%s&obj_id=%s&new=True" % \
+ ( self.url, controller, library_id, folder_id ) )
self.check_page_for_string( 'Create a new folder' )
tc.fv( "1", "name", name ) # form field 1 is the field named name...
tc.fv( "1", "description", description ) # form field 2 is the field named description...
tc.submit( "new_folder_button" )
+ self.home()
+ def edit_folder_info( self, controller, folder_id, library_id, name, new_name, description ):
+ """Add information to a library using an existing template with 2 elements"""
+ self.home()
+ self.visit_url( "%s/%s/folder?obj_id=%s&library_id=%s&information=True" % \
+ ( self.url, controller, folder_id, library_id) )
+ self.check_page_for_string( "Edit folder name and description" )
+ tc.fv( '1', "name", new_name )
+ tc.fv( '1', "description", description )
+ tc.submit( 'rename_folder_button' )
+ check_str = "Folder '%s' has been renamed to '%s'" % ( name, new_name )
+ self.check_page_for_string( check_str )
self.home()
def rename_folder( self, library_id, folder_id, old_name, name='Folder One Renamed', description='This is Folder One Re-described' ):
"""Rename a Folder"""
@@ -1250,13 +1264,14 @@
check_str = "Folder '%s' has been renamed to '%s'" % ( old_name, name )
self.check_page_for_string( check_str )
self.home()
- def add_library_dataset( self, filename, library_id, folder_id, folder_name, file_type='auto',
- dbkey='hg18', roles=[], message='', root=False, template_field_name1='', template_field_contents1='' ):
+ def add_library_dataset( self, controller, filename, library_id, folder_id, folder_name,
+ file_type='auto', dbkey='hg18', roles=[], message='', root=False,
+ template_field_name1='', template_field_contents1='' ):
"""Add a dataset to a folder"""
filename = self.get_filename( filename )
self.home()
- self.visit_url( "%s/library_admin/upload_library_dataset?upload_option=upload_file&library_id=%s&folder_id=%s&message=%s" % \
- ( self.url, library_id, folder_id, message ) )
+ self.visit_url( "%s/%s/upload_library_dataset?upload_option=upload_file&library_id=%s&folder_id=%s&message=%s" % \
+ ( self.url, controller, library_id, folder_id, message ) )
self.check_page_for_string( 'Upload files' )
tc.fv( "1", "folder_id", folder_id )
tc.formfile( "1", "files_0|file_data", filename )
@@ -1273,20 +1288,19 @@
check_str = "Added 1 datasets to the library '%s' ( each is selected )." % folder_name
else:
check_str = "Added 1 datasets to the folder '%s' ( each is selected )." % folder_name
- self.check_page_for_string( check_str )
+ data = self.last_page()
self.library_wait( library_id )
self.home()
- def set_library_dataset_permissions( self, library_id, folder_id, ldda_id, ldda_name, role_id, permissions_in, permissions_out ):
+ def set_library_dataset_permissions( self, library_id, folder_id, ldda_id, ldda_name, role_ids_str, permissions_in, permissions_out ):
+ # role_ids_str must be a comma-separated string of role ids
url = "library_admin/ldda_manage_permissions?library_id=%s&folder_id=%s&obj_id=%s&update_roles_button=Save" % \
( library_id, folder_id, ldda_id )
- #role_ids = util.listify( role_ids )
- #for role_id in role_ids:
for po in permissions_out:
key = '%s_out' % po
- url ="%s&%s=%s" % ( url, key, str( role_id ) )
+ url ="%s&%s=%s" % ( url, key, role_ids_str )
for pi in permissions_in:
key = '%s_in' % pi
- url ="%s&%s=%s" % ( url, key, str( role_id ) )
+ url ="%s&%s=%s" % ( url, key, role_ids_str )
self.home()
self.visit_url( "%s/%s" % ( self.url, url ) )
check_str = "Permissions have been updated on 1 datasets"
diff -r 65f28c1f1226 -r 2c0c81150dbd test/functional/test_forms_and_requests.py
--- a/test/functional/test_forms_and_requests.py Fri Oct 02 10:55:21 2009 -0400
+++ b/test/functional/test_forms_and_requests.py Fri Oct 02 11:00:30 2009 -0400
@@ -156,7 +156,7 @@
# create a folder in the library
root_folder = library_one.root_folder
name = "Folder One"
- self.add_folder( str( library_one.id ), str( root_folder.id ), name=name, description='' )
+ self.add_folder( 'library_admin', str( library_one.id ), str( root_folder.id ), name=name, description='' )
global folder_one
folder_one = galaxy.model.LibraryFolder.filter( and_( galaxy.model.LibraryFolder.table.c.parent_id==root_folder.id,
galaxy.model.LibraryFolder.table.c.name==name ) ).first()
diff -r 65f28c1f1226 -r 2c0c81150dbd test/functional/test_security_and_libraries.py
--- a/test/functional/test_security_and_libraries.py Fri Oct 02 10:55:21 2009 -0400
+++ b/test/functional/test_security_and_libraries.py Fri Oct 02 11:00:30 2009 -0400
@@ -176,6 +176,7 @@
actions_in = [ 'manage permissions' ]
permissions_out = [ 'DATASET_ACCESS' ]
actions_out = [ 'access' ]
+ global regular_user2_private_role
regular_user2_private_role = None
for role in regular_user2.all_roles():
if role.name == regular_user2.email and role.description == 'Private Role for %s' % regular_user2.email:
@@ -539,7 +540,8 @@
message = 'Testing adding a public dataset to the root folder'
# The form_one template should be inherited to the library dataset upload form.
template_contents = "%s contents for root folder 1.bed" % form_one_field_label
- self.add_library_dataset( '1.bed',
+ self.add_library_dataset( 'library_admin',
+ '1.bed',
str( library_one.id ),
str( library_one.root_folder.id ),
library_one.root_folder.name,
@@ -593,7 +595,11 @@
root_folder = library_one.root_folder
name = "Root Folder's Folder One"
description = "This is the root folder's Folder One"
- self.add_folder( str( library_one.id ), str( root_folder.id ), name=name, description=description )
+ self.add_folder( 'library_admin',
+ str( library_one.id ),
+ str( root_folder.id ),
+ name=name,
+ description=description )
global folder_one
folder_one = galaxy.model.LibraryFolder.filter( and_( galaxy.model.LibraryFolder.table.c.parent_id==root_folder.id,
galaxy.model.LibraryFolder.table.c.name==name,
@@ -625,7 +631,7 @@
"""Testing adding a folder to a library folder"""
name = "Folder One's Subfolder"
description = "This is the Folder One's subfolder"
- self.add_folder( str( library_one.id ), str( folder_one.id ), name=name, description=description )
+ self.add_folder( 'library_admin', str( library_one.id ), str( folder_one.id ), name=name, description=description )
global subfolder_one
subfolder_one = galaxy.model.LibraryFolder.filter( and_( galaxy.model.LibraryFolder.table.c.parent_id==folder_one.id,
galaxy.model.LibraryFolder.table.c.name==name,
@@ -658,7 +664,7 @@
root_folder = library_one.root_folder
name = "Folder Two"
description = "This is the root folder's Folder Two"
- self.add_folder( str( library_one.id ), str( root_folder.id ), name=name, description=description )
+ self.add_folder( 'library_admin', str( library_one.id ), str( root_folder.id ), name=name, description=description )
global folder_two
folder_two = galaxy.model.LibraryFolder.filter( and_( galaxy.model.LibraryFolder.table.c.parent_id==root_folder.id,
galaxy.model.LibraryFolder.table.c.name==name,
@@ -686,7 +692,8 @@
message = "Testing adding a public dataset to the folder named %s" % folder_two.name
# The form_one template should be inherited to the library dataset upload form.
template_contents = "%s contents for %s 2.bed" % ( form_one_field_label, folder_two.name )
- self.add_library_dataset( '2.bed',
+ self.add_library_dataset( 'library_admin',
+ '2.bed',
str( library_one.id ),
str( folder_two.id ),
folder_two.name,
@@ -717,7 +724,8 @@
message = "Testing adding a 2nd public dataset to the folder named %s" % folder_two.name
# The form_one template should be inherited to the library dataset upload form.
template_contents = "%s contents for %s 3.bed" % ( form_one_field_label, folder_two.name )
- self.add_library_dataset( '3.bed',
+ self.add_library_dataset( 'library_admin',
+ '3.bed',
str( library_one.id ),
str( folder_two.id ),
folder_two.name,
@@ -759,7 +767,8 @@
message ='This is a test of the fourth dataset uploaded'
# The form_one template should be inherited to the library dataset upload form.
template_contents = "%s contents for %s 4.bed" % ( form_one_field_label, folder_one.name )
- self.add_library_dataset( '4.bed',
+ self.add_library_dataset( 'library_admin',
+ '4.bed',
str( library_one.id ),
str( folder_one.id ),
folder_one.name,
@@ -849,9 +858,9 @@
permissions_in = [ k for k, v in galaxy.model.Dataset.permitted_actions.items() ] + \
[ k for k, v in galaxy.model.Library.permitted_actions.items() ]
permissions_out = []
- role_ids = "%s,%s" % ( str( role_one.id ), str( admin_user_private_role.id ) )
+ role_ids_str = '%s,%s' % ( str( role_one.id ), str( admin_user_private_role.id ) )
self.set_library_dataset_permissions( str( library_one.id ), str( folder_one.id ), str( ldda_four.id ), ldda_four.name,
- role_ids, permissions_in, permissions_out )
+ role_ids_str, permissions_in, permissions_out )
# admin_user should now be able to see 4.bed from the analysis view's access libraries
self.home()
self.visit_url( '%s/library/browse_library?obj_id=%s' % ( self.url, str( library_one.id ) ) )
@@ -866,7 +875,8 @@
message = 'Testing adding a dataset with a role that is associated with a group and users'
# The form_one template should be inherited to the library dataset upload form.
template_contents = "%s contents for %s 5.bed" % ( form_one_field_label, folder_one.name )
- self.add_library_dataset( '5.bed',
+ self.add_library_dataset( 'library_admin',
+ '5.bed',
str( library_one.id ),
str( folder_one.id ),
folder_one.name,
@@ -1309,6 +1319,7 @@
def test_170_mark_group_deleted( self ):
"""Testing marking a group as deleted"""
+ # Logged in as admin_user
self.home()
self.visit_url( '%s/admin/groups' % self.url )
self.check_page_for_string( group_two.name )
@@ -1323,12 +1334,14 @@
raise AssertionError( '%s incorrectly lost all role associations when it was marked as deleted.' % group_two.name )
def test_175_undelete_group( self ):
"""Testing undeleting a deleted group"""
+ # Logged in as admin_user
self.undelete_group( str( group_two.id ), group_two.name )
group_two.refresh()
if group_two.deleted:
raise AssertionError( '%s was not correctly marked as not deleted.' % group_two.name )
def test_180_mark_role_deleted( self ):
"""Testing marking a role as deleted"""
+ # Logged in as admin_user
self.home()
self.visit_url( '%s/admin/roles' % self.url )
self.check_page_for_string( role_two.name )
@@ -1343,9 +1356,11 @@
raise AssertionError( '%s incorrectly lost all group associations when it was marked as deleted.' % role_two.name )
def test_185_undelete_role( self ):
"""Testing undeleting a deleted role"""
+ # Logged in as admin_user
self.undelete_role( str( role_two.id ), role_two.name )
def test_190_mark_dataset_deleted( self ):
"""Testing marking a library dataset as deleted"""
+ # Logged in as admin_user
self.home()
self.delete_library_item( str( library_one.id ), str( ldda_two.library_dataset.id ), ldda_two.name, library_item_type='library_dataset' )
self.home()
@@ -1359,12 +1374,14 @@
self.home()
def test_195_display_deleted_dataset( self ):
"""Testing displaying deleted dataset"""
+ # Logged in as admin_user
self.home()
self.visit_url( "%s/library_admin/browse_library?obj_id=%s&show_deleted=True" % ( self.url, str( library_one.id ) ) )
self.check_page_for_string( ldda_two.name )
self.home()
def test_200_hide_deleted_dataset( self ):
"""Testing hiding deleted dataset"""
+ # Logged in as admin_user
self.home()
self.visit_url( "%s/library_admin/browse_library?obj_id=%s&show_deleted=False" % ( self.url, str( library_one.id ) ) )
try:
@@ -1375,6 +1392,7 @@
self.home()
def test_205_mark_folder_deleted( self ):
"""Testing marking a library folder as deleted"""
+ # Logged in as admin_user
self.home()
self.delete_library_item( str( library_one.id ), str( folder_two.id ), folder_two.name, library_item_type='folder' )
self.home()
@@ -1387,6 +1405,7 @@
self.home()
def test_210_mark_folder_undeleted( self ):
"""Testing marking a library folder as undeleted"""
+ # Logged in as admin_user
self.home()
self.undelete_library_item( str( library_one.id ), str( folder_two.id ), folder_two.name, library_item_type='folder' )
self.home()
@@ -1402,6 +1421,7 @@
self.home()
def test_215_mark_library_deleted( self ):
"""Testing marking a library as deleted"""
+ # Logged in as admin_user
self.home()
# First mark folder_two as deleted to further test state saving when we undelete the library
self.delete_library_item( str( library_one.id ), str( folder_two.id ), folder_two.name, library_item_type='folder' )
@@ -1412,6 +1432,7 @@
self.home()
def test_220_mark_library_undeleted( self ):
"""Testing marking a library as undeleted"""
+ # Logged in as admin_user
self.home()
self.undelete_library_item( str( library_one.id ), str( library_one.id ), library_one.name, library_item_type='library' )
self.home()
@@ -1426,6 +1447,7 @@
self.home()
def test_225_purge_user( self ):
"""Testing purging a user account"""
+ # Logged in as admin_user
self.mark_user_deleted( user_id=self.security.encode_id( regular_user3.id ), email=regular_user3.email )
regular_user3.refresh()
self.purge_user( self.security.encode_id( regular_user3.id ), regular_user3.email )
@@ -1458,6 +1480,7 @@
raise AssertionError( 'UserRoleAssociations for user %s are not related with the private role.' % regular_user3.email )
def test_230_manually_unpurge_user( self ):
"""Testing manually un-purging a user account"""
+ # Logged in as admin_user
# Reset the user for later test runs. The user's private Role and DefaultUserPermissions for that role
# should have been preserved, so all we need to do is reset purged and deleted.
# TODO: If we decide to implement the GUI feature for un-purging a user, replace this with a method call
@@ -1466,6 +1489,7 @@
regular_user3.flush()
def test_235_purge_group( self ):
"""Testing purging a group"""
+ # Logged in as admin_user
group_id = str( group_two.id )
self.mark_group_deleted( group_id, group_two.name )
self.purge_group( group_id, group_two.name )
@@ -1481,6 +1505,7 @@
self.undelete_group( group_id, group_two.name )
def test_240_purge_role( self ):
"""Testing purging a role"""
+ # Logged in as admin_user
role_id = str( role_two.id )
self.mark_role_deleted( role_id, role_two.name )
self.purge_role( role_id, role_two.name )
@@ -1506,6 +1531,7 @@
raise AssertionError( "Purging the role did not delete the DatasetPermissionss for role_id '%s'" % role_id )
def test_245_manually_unpurge_role( self ):
"""Testing manually un-purging a role"""
+ # Logged in as admin_user
# Manually unpurge, then undelete the role for later test runs
# TODO: If we decide to implement the GUI feature for un-purging a role, replace this with a method call
role_two.purged = False
@@ -1513,6 +1539,7 @@
self.undelete_role( str( role_two.id ), role_two.name )
def test_250_purge_library( self ):
"""Testing purging a library"""
+ # Logged in as admin_user
self.home()
self.delete_library_item( str( library_one.id ), str( library_one.id ), library_one.name, library_item_type='library' )
self.purge_library( str( library_one.id ), library_one.name )
@@ -1549,6 +1576,7 @@
check_folder( library_one.root_folder )
def test_255_no_library_template( self ):
"""Test library features when library has no template"""
+ # Logged in as admin_user
name = "Library Two"
description = "This is Library Two"
# Create a library, adding no template
@@ -1561,7 +1589,8 @@
galaxy.model.Library.table.c.deleted==False ) ).first()
assert library_two is not None, 'Problem retrieving library named "%s" from the database' % name
# Add a dataset to the library
- self.add_library_dataset( '7.bed',
+ self.add_library_dataset( 'library_admin',
+ '7.bed',
str( library_two.id ),
str( library_two.root_folder.id ),
library_two.root_folder.name,
@@ -1585,6 +1614,7 @@
self.home()
def test_260_library_permissions( self ):
"""Test library permissions"""
+ # Logged in as admin_user
name = "Library Three"
description = "This is Library Three"
# Create a library, adding no template
@@ -1596,12 +1626,76 @@
galaxy.model.Library.table.c.description==description,
galaxy.model.Library.table.c.deleted==False ) ).first()
assert library_three is not None, 'Problem retrieving library named "%s" from the database' % name
- # TODO: add tests here...
- self.home()
+ # Set library permissions for regular_user1 and regular_user2. Each of these users will be permitted to
+ # LIBRARY_ADD, LIBRARY_MODIFY, LIBRARY_MANAGE for library items.
+ permissions_in = [ k for k, v in galaxy.model.Library.permitted_actions.items() ]
+ permissions_out = []
+ role_ids_str = '%s,%s' % ( str( regular_user1_private_role.id ), str( regular_user2_private_role.id ) )
+ self.set_library_permissions( str( library_three.id ), library_three.name, role_ids_str, permissions_in, permissions_out )
+ self.logout()
+ # Login as regular_user1 and make sure they can see the library
+ self.login( email=regular_user1.email )
+ self.visit_url( '%s/library/browse_libraries' % self.url )
+ self.check_page_for_string( name )
+ self.logout()
+ # Login as regular_user2 and make sure they can see the library
+ self.login( email=regular_user2.email )
+ self.visit_url( '%s/library/browse_libraries' % self.url )
+ self.check_page_for_string( name )
+ # Add a dataset to the library
+ message = 'Testing adding 1.bed to Library Three root folder'
+ self.add_library_dataset( 'library',
+ '1.bed',
+ str( library_three.id ),
+ str( library_three.root_folder.id ),
+ library_three.root_folder.name,
+ file_type='bed',
+ dbkey='hg18',
+ message=message.replace( ' ', '+' ),
+ root=True )
+ # Add a folder to the library
+ name = "Root Folder's Folder X"
+ description = "This is the root folder's Folder X"
+ self.add_folder( 'library',
+ str( library_three.id ),
+ str( library_three.root_folder.id ),
+ name=name,
+ description=description )
+ folder_x = galaxy.model.LibraryFolder.filter( and_( galaxy.model.LibraryFolder.table.c.parent_id==library_three.root_folder.id,
+ galaxy.model.LibraryFolder.table.c.name==name,
+ galaxy.model.LibraryFolder.table.c.description==description ) ).first()
+ # Modify the folder's information
+ new_name = "Root Folder's Folder Y"
+ new_description = "This is the root folder's Folder Y"
+ self.edit_folder_info( 'library', str( folder_x.id ), str( library_three.id ), name, new_name, new_description )
+ folder_x.refresh()
+ # Add a dataset to the folder
+ name2 = "Folder Y subfolder"
+ description2 = "Folder Y subfolder description"
+ self.add_library_dataset( 'library',
+ '2.bed',
+ str( library_three.id ),
+ str( folder_x.id ),
+ folder_x.name,
+ file_type='bed',
+ dbkey='hg18',
+ message=message.replace( ' ', '+' ),
+ root=False )
+ ldda_x = galaxy.model.LibraryDatasetDatasetAssociation.query() \
+ .order_by( desc( galaxy.model.LibraryDatasetDatasetAssociation.table.c.create_time ) ).first()
+ assert ldda_x is not None, 'Problem retrieving ldda_x from the database'
+ # Log in as regular_user1
+ self.logout()
+ self.login( email=regular_user1.email )
+ self.visit_url( '%s/library/browse_library?obj_id=%s' % ( self.url, str( library_three.id ) ) )
+ self.check_page_for_string( ldda_x.name )
+ self.logout()
+ self.login( email=admin_user.email )
self.delete_library_item( str( library_three.id ), str( library_three.id ), library_three.name, library_item_type='library' )
self.purge_library( str( library_three.id ), library_three.name )
def test_265_reset_data_for_later_test_runs( self ):
"""Reseting data to enable later test runs to pass"""
+ # Logged in as admin_user
##################
# Eliminate all non-private roles
##################
@@ -1633,11 +1727,11 @@
# Reset DefaultHistoryPermissions for regular_user1
#####################
self.logout()
- self.login( email='test1(a)bx.psu.edu' )
+ self.login( email=regular_user1.email )
# Change DefaultHistoryPermissions for regular_user1 back to the default
permissions_in = [ 'DATASET_MANAGE_PERMISSIONS' ]
permissions_out = [ 'DATASET_ACCESS' ]
role_id = str( regular_user1_private_role.id )
self.user_set_default_permissions( permissions_in=permissions_in, permissions_out=permissions_out, role_id=role_id )
self.logout()
- self.login( email='test(a)bx.psu.edu' )
+ self.login( email=admin_user.email )
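A pattern running through the test hunks above: add_library_dataset() now takes the name of the web controller as a new first positional argument, 'library_admin' for uploads performed from the admin view and 'library' for uploads by regular users who have been granted LIBRARY_ADD. A hypothetical call mirroring the hunks above (a sketch only; the keyword values are the ones the patch itself uses):

self.add_library_dataset( 'library_admin',          # controller name (new first argument); 'library' for non-admins
                          '4.bed',                  # test datafile to upload
                          str( library_one.id ),    # target library id
                          str( folder_one.id ),     # target folder id
                          folder_one.name,
                          file_type='bed',
                          dbkey='hg18',
                          message=message.replace( ' ', '+' ),
                          root=False )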
diff -r 65f28c1f1226 -r 2c0c81150dbd tools/data_source/upload.py
--- a/tools/data_source/upload.py Fri Oct 02 10:55:21 2009 -0400
+++ b/tools/data_source/upload.py Fri Oct 02 11:00:30 2009 -0400
@@ -238,7 +238,9 @@
if ext == 'auto':
ext = 'data'
# Move the dataset to its "real" path
- if dataset.type == 'server_dir':
+ if dataset.get( 'link_data_only', False ):
+ pass # data will remain in place
+ elif dataset.type in ( 'server_dir', 'path_paste' ):
shutil.copy( dataset.path, output_path )
else:
shutil.move( dataset.path, output_path )
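Reconstructed from the hunk above, the dispatch that decides how an upload reaches its final location now reads roughly as follows (a sketch; the enclosing function and the rest of upload.py are omitted):

import shutil

if dataset.get( 'link_data_only', False ):
    pass # linked data stays in place; nothing is copied or moved
elif dataset.type in ( 'server_dir', 'path_paste' ):
    # server-side imports and pasted filesystem paths: copy, so the
    # original file on disk is left untouched
    shutil.copy( dataset.path, output_path )
else:
    # ordinary uploads arrive as temporary files, so a move is safe
    shutil.move( dataset.path, output_path )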
diff -r 65f28c1f1226 -r 2c0c81150dbd tools/samtools/sam_pileup.xml
--- a/tools/samtools/sam_pileup.xml Fri Oct 02 10:55:21 2009 -0400
+++ b/tools/samtools/sam_pileup.xml Fri Oct 02 11:00:30 2009 -0400
@@ -42,7 +42,7 @@
</param>
</when>
<when value="history">
- <param name="input1" type="data" format="sam, bam" label="Select the BAM file to generate the pileup file for" />
+ <param name="input1" type="data" format="bam" label="Select the BAM file to generate the pileup file for" />
<param name="ownFile" type="data" format="fasta" metadata_name="dbkey" label="Select a reference genome" />
</when>
</conditional>
diff -r 65f28c1f1226 -r 2c0c81150dbd tools/sr_mapping/bowtie_wrapper.xml
--- a/tools/sr_mapping/bowtie_wrapper.xml Fri Oct 02 10:55:21 2009 -0400
+++ b/tools/sr_mapping/bowtie_wrapper.xml Fri Oct 02 11:00:30 2009 -0400
@@ -152,7 +152,6 @@
<options from_file="bowtie_indices.loc">
<column name="value" index="1" />
<column name="name" index="0" />
- <filter type="sort_by" column="0" />
</options>
</param>
</when>
@@ -540,4 +539,5 @@
--seed <int> Random seed. Use <int> as the seed for the pseudo-random number generator. [off]
</help>
+ <code file="bowtie_wrapper_code.py" />
</tool>
diff -r 65f28c1f1226 -r 2c0c81150dbd tools/sr_mapping/bowtie_wrapper_code.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/tools/sr_mapping/bowtie_wrapper_code.py Fri Oct 02 11:00:30 2009 -0400
@@ -0,0 +1,15 @@
+import os
+
+def exec_before_job(app, inp_data, out_data, param_dict, tool):
+ try:
+ refFile = param_dict['refGenomeSource']['indices'].value
+ dbkey = os.path.split(refFile)[1].split('.')[0]
+ # deal with the one odd case
+ if dbkey.find('chrM') >= 0:
+ dbkey = 'equCab2'
+ out_data['output'].set_dbkey(dbkey)
+ except:
+ try:
+ refFile = param_dict['refGenomeSource']['ownFile'].dbkey
+ except:
+ out_data['output'].set_dbkey('?')
diff -r 65f28c1f1226 -r 2c0c81150dbd tools/sr_mapping/bwa_wrapper.xml
--- a/tools/sr_mapping/bwa_wrapper.xml Fri Oct 02 10:55:21 2009 -0400
+++ b/tools/sr_mapping/bwa_wrapper.xml Fri Oct 02 11:00:30 2009 -0400
@@ -80,7 +80,6 @@
<options from_file="sequence_index_color.loc">
<column name="value" index="1" />
<column name="name" index="0" />
- <filter type="sort_by" column="0" />
</options>
</param>
</when>
@@ -100,7 +99,6 @@
<options from_file="sequence_index_base.loc">
<column name="value" index="1" />
<column name="name" index="0" />
- <filter type="sort_by" column="0" />
</options>
</param>
</when>
diff -r 65f28c1f1226 -r 2c0c81150dbd tools/sr_mapping/bwa_wrapper_code.py
--- a/tools/sr_mapping/bwa_wrapper_code.py Fri Oct 02 10:55:21 2009 -0400
+++ b/tools/sr_mapping/bwa_wrapper_code.py Fri Oct 02 11:00:30 2009 -0400
@@ -4,5 +4,8 @@
try:
refFile = param_dict['solidOrSolexa']['solidRefGenomeSource']['indices'].value
out_data['output'].set_dbkey(os.path.split(refFile)[1].split('.')[0])
- except Exception, eq:
- out_data['output'].set_dbkey(param_dict['dbkey'])
+ except:
+ try:
+ refFile = param_dict['solidOrSolexa']['solidRefGenomeSource']['ownFile'].dbkey
+ except:
+ out_data['output'].set_dbkey('?')
diff -r 65f28c1f1226 -r 2c0c81150dbd universe_wsgi.ini.sample
--- a/universe_wsgi.ini.sample Fri Oct 02 10:55:21 2009 -0400
+++ b/universe_wsgi.ini.sample Fri Oct 02 11:00:30 2009 -0400
@@ -60,13 +60,22 @@
# Galaxy session security
id_secret = changethisinproductiontoo
-# Directories of files contained in the following directory can be uploaded to a library from the Admin view
+# Directories of files contained in the following directory can be uploaded to
+# a library from the Admin view
#library_import_dir = /var/opt/galaxy/import
-# The following can be configured to allow non-admin users to upload a directory of files. The
-# configured directory must contain sub-directories named the same as the non-admin user's Galaxy
-# login ( email ). The non-admin user is restricted to uploading files or sub-directories of files
-# contained in their directory.
-# user_library_import_dir = /var/opt/galaxy/import/users
+
+# The following can be configured to allow non-admin users to upload a
+# directory of files. The configured directory must contain sub-directories
+# named the same as the non-admin user's Galaxy login ( email ). The non-admin
+# user is restricted to uploading files or sub-directories of files contained
+# in their directory.
+#user_library_import_dir = /var/opt/galaxy/import/users
+
+# The admin library upload tool may contain a box allowing admins to paste
+# filesystem paths to files and directories to add to a library. Set to True
+# to enable. Please note the security implication that this will give Galaxy
+# Admins access to anything your Galaxy user has access to.
+#allow_library_path_paste = False
# path to sendmail
sendmail_path = /usr/sbin/sendmail
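Taken together, a minimal universe_wsgi.ini fragment with all three library-upload options switched on might look like this (the paths are the sample's own; an illustration rather than a recommendation, given the security note above):

library_import_dir = /var/opt/galaxy/import
user_library_import_dir = /var/opt/galaxy/import/users
# Gives admins upload access to any path the Galaxy process can read
allow_library_path_paste = True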
02 Oct '09
details: http://www.bx.psu.edu/hg/galaxy/rev/04a753865407
changeset: 2821:04a753865407
user: Kelly Vincent <kpvincent(a)bx.psu.edu>
date: Fri Oct 02 10:49:09 2009 -0400
description:
Corrected problem with assignment of dbkey to output from BWA and Bowtie tools
2 file(s) affected in this change:
tools/sr_mapping/bowtie_wrapper_code.py
tools/sr_mapping/bwa_wrapper_code.py
diffs (39 lines):
diff -r d98b6a559421 -r 04a753865407 tools/sr_mapping/bowtie_wrapper_code.py
--- a/tools/sr_mapping/bowtie_wrapper_code.py Fri Oct 02 09:40:49 2009 -0400
+++ b/tools/sr_mapping/bowtie_wrapper_code.py Fri Oct 02 10:49:09 2009 -0400
@@ -3,13 +3,14 @@
def exec_before_job(app, inp_data, out_data, param_dict, tool):
try:
refFile = param_dict['refGenomeSource']['indices'].value
- dbkey = os.path.split(refFile)[1].split('.')[0]
- # deal with the one odd case
- if dbkey.find('chrM') >= 0:
- dbkey = 'equCab2'
- out_data['output'].set_dbkey(dbkey)
except:
try:
refFile = param_dict['refGenomeSource']['ownFile'].dbkey
except:
- out_data['output'].set_dbkey('?')
+ out_data['output'].set_dbkey('?')
+ return
+ dbkey = os.path.split(refFile)[1].split('.')[0]
+ # deal with the one odd case
+ if dbkey.find('chrM') >= 0:
+ dbkey = 'equCab2'
+ out_data['output'].set_dbkey(dbkey)
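For reference, applying this patch yields the following exec_before_job() in bowtie_wrapper_code.py (reconstructed from the hunk above; the extra inline comments are added here):

import os

def exec_before_job(app, inp_data, out_data, param_dict, tool):
    try:
        # Built-in index selected: derive the dbkey from the index filename
        refFile = param_dict['refGenomeSource']['indices'].value
    except:
        try:
            # Otherwise use the dbkey of the reference taken from the history
            refFile = param_dict['refGenomeSource']['ownFile'].dbkey
        except:
            out_data['output'].set_dbkey('?')
            return
    dbkey = os.path.split(refFile)[1].split('.')[0]
    # deal with the one odd case
    if dbkey.find('chrM') >= 0:
        dbkey = 'equCab2'
    out_data['output'].set_dbkey(dbkey)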
diff -r d98b6a559421 -r 04a753865407 tools/sr_mapping/bwa_wrapper_code.py
--- a/tools/sr_mapping/bwa_wrapper_code.py Fri Oct 02 09:40:49 2009 -0400
+++ b/tools/sr_mapping/bwa_wrapper_code.py Fri Oct 02 10:49:09 2009 -0400
@@ -3,9 +3,10 @@
def exec_before_job(app, inp_data, out_data, param_dict, tool):
try:
refFile = param_dict['solidOrSolexa']['solidRefGenomeSource']['indices'].value
- out_data['output'].set_dbkey(os.path.split(refFile)[1].split('.')[0])
except:
try:
refFile = param_dict['solidOrSolexa']['solidRefGenomeSource']['ownFile'].dbkey
except:
out_data['output'].set_dbkey('?')
+ return
+ out_data['output'].set_dbkey(os.path.split(refFile)[1].split('.')[0])
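And the equivalent reconstruction for bwa_wrapper_code.py, where set_dbkey() now runs after both lookups instead of inside the first try block:

import os

def exec_before_job(app, inp_data, out_data, param_dict, tool):
    try:
        refFile = param_dict['solidOrSolexa']['solidRefGenomeSource']['indices'].value
    except:
        try:
            refFile = param_dict['solidOrSolexa']['solidRefGenomeSource']['ownFile'].dbkey
        except:
            # Neither source is available: fall back to an unknown dbkey
            out_data['output'].set_dbkey('?')
            return
    out_data['output'].set_dbkey(os.path.split(refFile)[1].split('.')[0])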
02 Oct '09
details: http://www.bx.psu.edu/hg/galaxy/rev/d98b6a559421
changeset: 2820:d98b6a559421
user: Greg Von Kuster <greg(a)bx.psu.edu>
date: Fri Oct 02 09:40:49 2009 -0400
description:
Eliminate the changed_dependency code introduced in change set 1048 since it no longer seems to be necessary, and breaks certain tools.
3 file(s) affected in this change:
lib/galaxy/tools/__init__.py
lib/galaxy/tools/parameters/basic.py
lib/galaxy/web/controllers/library_common.py
diffs (144 lines):
diff -r a0a5be919102 -r d98b6a559421 lib/galaxy/tools/__init__.py
--- a/lib/galaxy/tools/__init__.py Fri Oct 02 09:37:57 2009 -0400
+++ b/lib/galaxy/tools/__init__.py Fri Oct 02 09:40:49 2009 -0400
@@ -668,7 +668,6 @@
# form when it changes
for name in param.get_dependencies():
context[ name ].refresh_on_change = True
- context[ name ].dependent_params.append( param.name )
return param
def check_workflow_compatible( self ):
@@ -804,7 +803,7 @@
else:
# Update state for all inputs on the current page taking new
# values from `incoming`.
- errors = self.update_state( trans, self.inputs_by_page[state.page], state.inputs, incoming, changed_dependencies={} )
+ errors = self.update_state( trans, self.inputs_by_page[state.page], state.inputs, incoming )
# If the tool provides a `validate_input` hook, call it.
validate_input = self.get_hook( 'validate_input' )
if validate_input:
@@ -880,8 +879,7 @@
return 'message.mako', dict( message_type='error', message='Your upload was interrupted. If this was unintentional, please retry it.', refresh_frames=[], cont=None )
def update_state( self, trans, inputs, state, incoming, prefix="", context=None,
- update_only=False, old_errors={}, changed_dependencies=None,
- item_callback=None ):
+ update_only=False, old_errors={}, item_callback=None ):
"""
Update the tool state in `state` using the user input in `incoming`.
This is designed to be called recursively: `inputs` contains the
@@ -891,18 +889,10 @@
If `update_only` is True, values that are not in `incoming` will
not be modified. In this case `old_errors` can be provided, and any
errors for parameters which were *not* updated will be preserved.
-
- Parameters in incoming that are 'dependency parameters' are those
- whose value is used by a dependent parameter to dynamically generate
- its options list. When the value of these dependency parameters changes,
- the new value is stored in changed_dependencies.
"""
errors = dict()
# Push this level onto the context stack
context = ExpressionContext( state, context )
- # Initialize dict for changed dependencies (since we write to it)
- if changed_dependencies is None:
- changed_dependencies = {}
# Iterate inputs and update (recursively)
for input in inputs.itervalues():
key = prefix + input.name
@@ -938,7 +928,6 @@
context=context,
update_only=update_only,
old_errors=rep_old_errors,
- changed_dependencies=changed_dependencies,
item_callback=item_callback )
if rep_errors:
any_group_errors = True
@@ -999,7 +988,6 @@
context=context,
update_only=update_only,
old_errors=group_old_errors,
- changed_dependencies=changed_dependencies,
item_callback=item_callback )
if test_param_error:
group_errors[ input.test_param.name ] = test_param_error
@@ -1039,7 +1027,6 @@
context=context,
update_only=update_only,
old_errors=rep_old_errors,
- changed_dependencies=changed_dependencies,
item_callback=item_callback )
if rep_errors:
any_group_errors = True
@@ -1069,45 +1056,8 @@
if input.name in old_errors:
errors[ input.name ] = old_errors[ input.name ]
else:
- # FIXME: This is complicated and buggy.
- # SelectToolParameters and DataToolParameters whose options are dynamically
- # generated based on the current value of a dependency parameter require special
- # handling. When the dependency parameter's value is changed, the form is
- # submitted ( due to the refresh_on_change behavior ). When this occurs, the
- # "dependent" parameter's value has not been reset ( dynamically generated based
- # on the new value of its dependency ) prior to reaching this point, so we need
- # to regenerate it before it is validated in check_param().
- value_generated = False
- value = None
- if not( 'runtool_btn' in incoming or 'URL' in incoming ):
- # Form must have been refreshed, probably due to a refresh_on_change
- try:
- if input.is_dynamic:
- dependencies = input.get_dependencies()
- for dependency_name in dependencies:
- dependency_value = changed_dependencies.get( dependency_name, None )
- if dependency_value:
- # We need to dynamically generate the current input based on
- # the changed dependency parameter
- changed_params = {}
- changed_params[dependency_name] = dependency_value
- changed_params[input.name] = input
- value = input.get_initial_value( trans, changed_params )
- error = None
- value_generated = True
- # Delete the dependency_param from changed_dependencies since its
- # dependent param has been generated based on its new value.
- ## Actually, don't do this. What if there is more than one dependent?
- ## del changed_dependencies[dependency_name]
- break
- except:
- pass
- if not value_generated:
- incoming_value = get_incoming_value( incoming, key, None )
- value, error = check_param( trans, input, incoming_value, context )
- # Should we note a changed dependency?
- if input.dependent_params and state[ input.name ] != value:
- changed_dependencies[ input.name ] = value
+ incoming_value = get_incoming_value( incoming, key, None )
+ value, error = check_param( trans, input, incoming_value, context )
# If a callback was provided, allow it to process the value
if item_callback:
old_value = state.get( input.name, None )
diff -r a0a5be919102 -r d98b6a559421 lib/galaxy/tools/parameters/basic.py
--- a/lib/galaxy/tools/parameters/basic.py Fri Oct 02 09:37:57 2009 -0400
+++ b/lib/galaxy/tools/parameters/basic.py Fri Oct 02 09:40:49 2009 -0400
@@ -32,7 +32,6 @@
self.html = "no html set"
self.repeat = param.get("repeat", None)
self.condition = param.get( "condition", None )
- self.dependent_params = []
self.validators = []
for elem in param.findall("validator"):
self.validators.append( validation.Validator.from_element( self, elem ) )
diff -r a0a5be919102 -r d98b6a559421 lib/galaxy/web/controllers/library_common.py
--- a/lib/galaxy/web/controllers/library_common.py Fri Oct 02 09:37:57 2009 -0400
+++ b/lib/galaxy/web/controllers/library_common.py Fri Oct 02 09:40:49 2009 -0400
@@ -39,7 +39,7 @@
tool_id = 'upload1'
tool = trans.app.toolbox.tools_by_id[ tool_id ]
state = tool.new_state( trans )
- errors = tool.update_state( trans, tool.inputs_by_page[0], state.inputs, kwd, changed_dependencies={} )
+ errors = tool.update_state( trans, tool.inputs_by_page[0], state.inputs, kwd )
tool_params = state.inputs
dataset_upload_inputs = []
for input_name, input in tool.inputs.iteritems():
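The upshot for callers is visible in the library_common.py hunk: the changed_dependencies bookkeeping simply disappears from update_state()'s signature. A minimal sketch of the upload-form path after this change (reconstructed from the hunk above):

tool = trans.app.toolbox.tools_by_id[ 'upload1' ]
state = tool.new_state( trans )
# No changed_dependencies argument; errors maps parameter names to messages
errors = tool.update_state( trans, tool.inputs_by_page[0], state.inputs, kwd )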
02 Oct '09
details: http://www.bx.psu.edu/hg/galaxy/rev/a0a5be919102
changeset: 2819:a0a5be919102
user: Kelly Vincent <kpvincent(a)bx.psu.edu>
date: Fri Oct 02 09:37:57 2009 -0400
description:
Corrected Samtools Pileup tool to accept only BAM files as input, instead of also SAM
1 file(s) affected in this change:
tools/samtools/sam_pileup.xml
diffs (12 lines):
diff -r 72020a46127e -r a0a5be919102 tools/samtools/sam_pileup.xml
--- a/tools/samtools/sam_pileup.xml Fri Oct 02 09:36:07 2009 -0400
+++ b/tools/samtools/sam_pileup.xml Fri Oct 02 09:37:57 2009 -0400
@@ -42,7 +42,7 @@
</param>
</when>
<when value="history">
- <param name="input1" type="data" format="sam, bam" label="Select the BAM file to generate the pileup file for" />
+ <param name="input1" type="data" format="bam" label="Select the BAM file to generate the pileup file for" />
<param name="ownFile" type="data" format="fasta" metadata_name="dbkey" label="Select a reference genome" />
</when>
</conditional>