galaxy-dev
October 2009: 18 participants, 172 discussions
27 Oct '09
details: http://www.bx.psu.edu/hg/galaxy/rev/4448bc8ba587
changeset: 2911:4448bc8ba587
user: James Taylor <james@jamestaylor.org>
date: Fri Oct 23 12:53:25 2009 -0400
description:
Fix for SA session when track_jobs_in_database=False
1 file(s) affected in this change:
lib/galaxy/jobs/__init__.py
diffs (12 lines):
diff -r 811eda31b6b3 -r 4448bc8ba587 lib/galaxy/jobs/__init__.py
--- a/lib/galaxy/jobs/__init__.py Fri Oct 23 12:47:16 2009 -0400
+++ b/lib/galaxy/jobs/__init__.py Fri Oct 23 12:53:25 2009 -0400
@@ -172,7 +172,7 @@
# Pull all new jobs from the queue at once
new_jobs = []
if self.track_jobs_in_database:
- for j in session.query( model.Job ) \
+ for j in self.sa_session.query( model.Job ) \
.options( lazyload( "external_output_metadata" ), lazyload( "parameters" ) ) \
.filter( model.Job.c.state == model.Job.states.NEW ):
job = JobWrapper( j, self.app.toolbox.tools_by_id[ j.tool_id ], self )
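The one-line fix works because the job queue now owns a single scoped session instead of building a throwaway session each monitor cycle. A minimal runnable sketch of that pattern in plain SQLAlchemy (illustrative only, not Galaxy code; JobModel is a stand-in name):

    from sqlalchemy import create_engine, Column, Integer, String
    from sqlalchemy.ext.declarative import declarative_base
    from sqlalchemy.orm import scoped_session, sessionmaker

    Base = declarative_base()

    class JobModel( Base ):
        # Stand-in for galaxy.model.Job; only the state column matters here.
        __tablename__ = 'job'
        id = Column( Integer, primary_key=True )
        state = Column( String )

    engine = create_engine( 'sqlite:///:memory:' )
    Base.metadata.create_all( engine )

    # scoped_session hands each thread its own session from a shared
    # registry, so the queue can keep one self.sa_session attribute and
    # query it safely from the monitor thread.
    sa_session = scoped_session( sessionmaker( bind=engine ) )
    new_jobs = sa_session.query( JobModel ).filter( JobModel.state == 'new' ).all()
    sa_session.remove()  # discard this thread's session when done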
27 Oct '09
details: http://www.bx.psu.edu/hg/galaxy/rev/2c4ed83f76ef
changeset: 2909:2c4ed83f76ef
user: Greg Von Kuster <greg@bx.psu.edu>
date: Thu Oct 22 23:02:28 2009 -0400
description:
Make sqlalchemy queries use sa_session.query( object ) rather than object.query(). This eliminates the need for the _monkeypatch_query_method() in assignmapper.py. Also eliminate the need for the _monkeypatch_session_method() for everything except object.flush() - I just ran out of time and will handle flush() asap. I also eliminated the .all() call on several queries where it was not necessary - should improve performance. I fixed several bugs I found as well.
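In short, the changeset swaps the SQLAlchemy 0.3-style monkeypatched classmethods for explicit queries on the scoped session. A sketch of the before/after pattern, using only names that appear in the diffs below (flush() is unchanged for now, per the note above):

    # Before: query methods monkeypatched onto the model class by assignmapper.py
    job  = model.Job.get( job_id )
    jobs = model.Job.filter( model.Job.c.state == model.Job.states.NEW ).all()

    # After: query through the scoped session (e.g. app.model.context);
    # iterating the query directly also drops the unnecessary .all() calls.
    sa_session = app.model.context
    job  = sa_session.query( model.Job ).get( job_id )
    jobs = sa_session.query( model.Job ).filter( model.Job.state == model.Job.states.NEW )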
62 file(s) affected in this change:
lib/galaxy/datatypes/metadata.py
lib/galaxy/jobs/__init__.py
lib/galaxy/model/__init__.py
lib/galaxy/model/mapping_tests.py
lib/galaxy/model/migrate/versions/0005_cleanup_datasets_fix.py
lib/galaxy/model/orm/ext/assignmapper.py
lib/galaxy/security/__init__.py
lib/galaxy/tags/tag_handler.py
lib/galaxy/tools/__init__.py
lib/galaxy/tools/actions/__init__.py
lib/galaxy/tools/actions/metadata.py
lib/galaxy/tools/actions/upload_common.py
lib/galaxy/tools/parameters/basic.py
lib/galaxy/web/controllers/admin.py
lib/galaxy/web/controllers/async.py
lib/galaxy/web/controllers/dataset.py
lib/galaxy/web/controllers/forms.py
lib/galaxy/web/controllers/genetrack.py
lib/galaxy/web/controllers/history.py
lib/galaxy/web/controllers/library.py
lib/galaxy/web/controllers/library_admin.py
lib/galaxy/web/controllers/library_common.py
lib/galaxy/web/controllers/mobile.py
lib/galaxy/web/controllers/page.py
lib/galaxy/web/controllers/requests.py
lib/galaxy/web/controllers/requests_admin.py
lib/galaxy/web/controllers/root.py
lib/galaxy/web/controllers/tag.py
lib/galaxy/web/controllers/tool_runner.py
lib/galaxy/web/controllers/tracks.py
lib/galaxy/web/controllers/user.py
lib/galaxy/web/controllers/workflow.py
lib/galaxy/web/framework/__init__.py
lib/galaxy/webapps/reports/controllers/users.py
scripts/cleanup_datasets/cleanup_datasets.py
templates/admin/dataset_security/deleted_groups.mako
templates/admin/dataset_security/deleted_roles.mako
templates/admin/dataset_security/groups.mako
templates/admin/dataset_security/roles.mako
templates/admin/library/folder_permissions.mako
templates/admin/library/ldda_info.mako
templates/admin/library/ldda_permissions.mako
templates/admin/library/library_dataset_permissions.mako
templates/admin/library/library_info.mako
templates/admin/library/library_permissions.mako
templates/admin/requests/grid.mako
templates/admin/requests/show_request.mako
templates/library/ldda_permissions.mako
templates/library/library_dataset_permissions.mako
templates/library/library_permissions.mako
templates/mobile/manage_library.mako
templates/requests/show_request.mako
templates/user/address.mako
test/base/twilltestcase.py
test/functional/test_DNAse_flanked_genes.py
test/functional/test_forms_and_requests.py
test/functional/test_get_data.py
test/functional/test_history_functions.py
test/functional/test_metadata_editing.py
test/functional/test_security_and_libraries.py
test/functional/test_sniffing_and_metadata_settings.py
test/functional/test_toolbox.py
diffs (truncated from 6168 to 3000 lines):
diff -r 20b319780138 -r 2c4ed83f76ef lib/galaxy/datatypes/metadata.py
--- a/lib/galaxy/datatypes/metadata.py Wed Oct 21 23:15:22 2009 -0400
+++ b/lib/galaxy/datatypes/metadata.py Thu Oct 22 23:02:28 2009 -0400
@@ -489,15 +489,22 @@
#We will use JSON as the medium of exchange of information, except for the DatasetInstance object which will use pickle (in the future this could be JSONified as well)
def __init__( self, job ):
self.job_id = job.id
- def get_output_filenames_by_dataset( self, dataset ):
+ def get_output_filenames_by_dataset( self, dataset, sa_session ):
if isinstance( dataset, galaxy.model.HistoryDatasetAssociation ):
- return galaxy.model.JobExternalOutputMetadata.filter_by( job_id = self.job_id, history_dataset_association_id = dataset.id ).first() #there should only be one or None
+ return sa_session.query( galaxy.model.JobExternalOutputMetadata ) \
+ .filter_by( job_id = self.job_id, history_dataset_association_id = dataset.id ) \
+ .first() #there should only be one or None
elif isinstance( dataset, galaxy.model.LibraryDatasetDatasetAssociation ):
- return galaxy.model.JobExternalOutputMetadata.filter_by( job_id = self.job_id, library_dataset_dataset_association_id = dataset.id ).first() #there should only be one or None
+ return sa_session.query( galaxy.model.JobExternalOutputMetadata ) \
+ .filter_by( job_id = self.job_id, library_dataset_dataset_association_id = dataset.id ) \
+ .first() #there should only be one or None
return None
def get_dataset_metadata_key( self, dataset ):
- return "%s_%d" % ( dataset.__class__.__name__, dataset.id ) #set meta can be called on library items and history items, need to make different keys for them, since ids can overlap
- def setup_external_metadata( self, datasets, exec_dir = None, tmp_dir = None, dataset_files_path = None, output_fnames = None, config_root = None, datatypes_config = None, kwds = {} ):
+ # Set meta can be called on library items and history items,
+ # need to make different keys for them, since ids can overlap
+ return "%s_%d" % ( dataset.__class__.__name__, dataset.id )
+ def setup_external_metadata( self, datasets, sa_session, exec_dir=None, tmp_dir=None, dataset_files_path=None,
+ output_fnames=None, config_root=None, datatypes_config=None, kwds={} ):
#fill in metadata_files_dict and return the command with args required to set metadata
def __metadata_files_list_to_cmd_line( metadata_files ):
def __get_filename_override():
@@ -527,7 +534,7 @@
#when setting metadata externally, via 'auto-detect' button in edit attributes, etc.,
#we don't want to overwrite (losing the ability to cleanup) our existing dataset keys and files,
#so we will only populate the dictionary once
- metadata_files = self.get_output_filenames_by_dataset( dataset )
+ metadata_files = self.get_output_filenames_by_dataset( dataset, sa_session )
if not metadata_files:
metadata_files = galaxy.model.JobExternalOutputMetadata( dataset = dataset)
metadata_files.job_id = self.job_id
@@ -553,8 +560,8 @@
#return command required to build
return "%s %s %s %s %s %s" % ( os.path.join( exec_dir, 'set_metadata.sh' ), dataset_files_path, tmp_dir, config_root, datatypes_config, " ".join( map( __metadata_files_list_to_cmd_line, metadata_files_list ) ) )
- def external_metadata_set_successfully( self, dataset ):
- metadata_files = self.get_output_filenames_by_dataset( dataset )
+ def external_metadata_set_successfully( self, dataset, sa_session ):
+ metadata_files = self.get_output_filenames_by_dataset( dataset, sa_session )
if not metadata_files:
return False # this file doesn't exist
rval, rstring = simplejson.load( open( metadata_files.filename_results_code ) )
@@ -578,4 +585,3 @@
for metadata_files in galaxy.model.Job.get( self.job_id ).external_output_metadata:
metadata_files.job_runner_external_pid = pid
metadata_files.flush()
-
diff -r 20b319780138 -r 2c4ed83f76ef lib/galaxy/jobs/__init__.py
--- a/lib/galaxy/jobs/__init__.py Wed Oct 21 23:15:22 2009 -0400
+++ b/lib/galaxy/jobs/__init__.py Thu Oct 22 23:02:28 2009 -0400
@@ -1,7 +1,6 @@
import logging, threading, sys, os, time, subprocess, string, tempfile, re, traceback, shutil
from galaxy import util, model
-from galaxy.model import mapping
from galaxy.model.orm import lazyload
from galaxy.datatypes.tabular import *
from galaxy.datatypes.interval import *
@@ -73,6 +72,7 @@
def __init__( self, app, dispatcher ):
"""Start the job manager"""
self.app = app
+ self.sa_session = app.model.context
# Should we read jobs form the database, or use an in memory queue
self.track_jobs_in_database = app.config.get_bool( 'track_jobs_in_database', False )
# Check if any special scheduling policy should be used. If not, default is FIFO.
@@ -126,14 +126,14 @@
job manager starts.
"""
model = self.app.model
- for job in model.Job.filter( model.Job.c.state==model.Job.states.NEW ).all():
+ for job in self.sa_session.query( model.Job ).filter( model.Job.state == model.Job.states.NEW ):
if job.tool_id not in self.app.toolbox.tools_by_id:
log.warning( "Tool '%s' removed from tool config, unable to recover job: %s" % ( job.tool_id, job.id ) )
JobWrapper( job, None, self ).fail( 'This tool was disabled before the job completed. Please contact your Galaxy administrator, or' )
else:
log.debug( "no runner: %s is still in new state, adding to the jobs queue" %job.id )
self.queue.put( ( job.id, job.tool_id ) )
- for job in model.Job.filter( (model.Job.c.state == model.Job.states.RUNNING) | (model.Job.c.state == model.Job.states.QUEUED) ).all():
+ for job in self.sa_session.query( model.Job ).filter( ( model.Job.state == model.Job.states.RUNNING ) | ( model.Job.state == model.Job.states.QUEUED ) ):
if job.tool_id not in self.app.toolbox.tools_by_id:
log.warning( "Tool '%s' removed from tool config, unable to recover job: %s" % ( job.tool_id, job.id ) )
JobWrapper( job, None, self ).fail( 'This tool was disabled before the job completed. Please contact your Galaxy administrator, or' )
@@ -169,12 +169,12 @@
it is marked as having errors and removed from the queue. Otherwise,
the job is dispatched.
"""
- # Get an orm session
- session = mapping.Session()
# Pull all new jobs from the queue at once
new_jobs = []
if self.track_jobs_in_database:
- for j in session.query( model.Job ).options( lazyload( "external_output_metadata" ), lazyload( "parameters" ) ).filter( model.Job.c.state == model.Job.states.NEW ).all():
+ for j in session.query( model.Job ) \
+ .options( lazyload( "external_output_metadata" ), lazyload( "parameters" ) ) \
+ .filter( model.Job.c.state == model.Job.states.NEW ):
job = JobWrapper( j, self.app.toolbox.tools_by_id[ j.tool_id ], self )
new_jobs.append( job )
else:
@@ -186,7 +186,7 @@
# Unpack the message
job_id, tool_id = message
# Create a job wrapper from it
- job_entity = session.query( model.Job ).get( job_id )
+ job_entity = self.sa_session.query( model.Job ).get( job_id )
job = JobWrapper( job_entity, self.app.toolbox.tools_by_id[ tool_id ], self )
# Append to watch queue
new_jobs.append( job )
@@ -199,11 +199,11 @@
try:
# Clear the session for each job so we get fresh states for
# job and all datasets
- session.clear()
+ self.sa_session.clear()
# Get the real job entity corresponding to the wrapper (if we
# are tracking in the database this is probably cached in
# the session from the original query above)
- job_entity = session.query( model.Job ).get( job.job_id )
+ job_entity = self.sa_session.query( model.Job ).get( job.job_id )
# Check the job's dependencies, requeue if they're not done
job_state = self.__check_if_ready_to_run( job, job_entity )
if job_state == JOB_WAIT:
@@ -254,7 +254,7 @@
job.fail( "failure running job %d: %s" % ( sjob.job_id, str( e ) ) )
log.exception( "failure running job %d" % sjob.job_id )
# Done with the session
- mapping.Session.remove()
+ self.sa_session.remove()
def __check_if_ready_to_run( self, job_wrapper, job ):
"""
@@ -319,6 +319,7 @@
self.tool = tool
self.queue = queue
self.app = queue.app
+ self.sa_session = self.app.model.context
self.extra_filenames = []
self.command_line = None
self.galaxy_lib_dir = None
@@ -335,7 +336,7 @@
"""
Restore the dictionary of parameters from the database.
"""
- job = model.Job.get( self.job_id )
+ job = self.sa_session.query( model.Job ).get( self.job_id )
param_dict = dict( [ ( p.name, p.value ) for p in job.parameters ] )
param_dict = self.tool.params_from_strings( param_dict, self.app )
return param_dict
@@ -345,11 +346,11 @@
Prepare the job to run by creating the working directory and the
config files.
"""
- mapping.context.current.clear() #this prevents the metadata reverting that has been seen in conjunction with the PBS job runner
+ self.sa_session.clear() #this prevents the metadata reverting that has been seen in conjunction with the PBS job runner
if not os.path.exists( self.working_directory ):
os.mkdir( self.working_directory )
# Restore parameters from the database
- job = model.Job.get( self.job_id )
+ job = self.sa_session.query( model.Job ).get( self.job_id )
incoming = dict( [ ( p.name, p.value ) for p in job.parameters ] )
incoming = self.tool.params_from_strings( incoming, self.app )
# Do any validation that could not be done at job creation
@@ -376,7 +377,7 @@
# Run the before queue ("exec_before_job") hook
self.tool.call_hook( 'exec_before_job', self.queue.app, inp_data=inp_data,
out_data=out_data, tool=self.tool, param_dict=incoming)
- mapping.context.current.flush()
+ self.sa_session.flush()
# Build any required config files
config_filenames = self.tool.build_config_files( param_dict, self.working_directory )
# FIXME: Build the param file (might return None, DEPRECATED)
@@ -403,8 +404,8 @@
Indicate job failure by setting state and message on all output
datasets.
"""
- job = model.Job.get( self.job_id )
- job.refresh()
+ job = self.sa_session.query( model.Job ).get( self.job_id )
+ self.sa_session.refresh( job )
# if the job was deleted, don't fail it
if not job.state == model.Job.states.DELETED:
# Check if the failure is due to an exception
@@ -427,7 +428,7 @@
log.error( "fail(): Missing output file in working directory: %s" % e )
for dataset_assoc in job.output_datasets + job.output_library_datasets:
dataset = dataset_assoc.dataset
- dataset.refresh()
+ self.sa_session.refresh( dataset )
dataset.state = dataset.states.ERROR
dataset.blurb = 'tool error'
dataset.info = message
@@ -443,11 +444,11 @@
self.cleanup()
def change_state( self, state, info = False ):
- job = model.Job.get( self.job_id )
- job.refresh()
+ job = self.sa_session.query( model.Job ).get( self.job_id )
+ self.sa_session.refresh( job )
for dataset_assoc in job.output_datasets + job.output_library_datasets:
dataset = dataset_assoc.dataset
- dataset.refresh()
+ self.sa_session.refresh( dataset )
dataset.state = state
if info:
dataset.info = info
@@ -458,13 +459,13 @@
job.flush()
def get_state( self ):
- job = model.Job.get( self.job_id )
- job.refresh()
+ job = self.sa_session.query( model.Job ).get( self.job_id )
+ self.sa_session.refresh( job )
return job.state
def set_runner( self, runner_url, external_id ):
- job = model.Job.get( self.job_id )
- job.refresh()
+ job = self.sa_session.query( model.Job ).get( self.job_id )
+ self.sa_session.refresh( job )
job.job_runner_name = runner_url
job.job_runner_external_id = external_id
job.flush()
@@ -476,8 +477,8 @@
the contents of the output files.
"""
# default post job setup
- mapping.context.current.clear()
- job = model.Job.get( self.job_id )
+ self.sa_session.clear()
+ job = self.sa_session.query( model.Job ).get( self.job_id )
# if the job was deleted, don't finish it
if job.state == job.states.DELETED:
self.cleanup()
@@ -523,7 +524,7 @@
#either use the metadata from originating output dataset, or call set_meta on the copies
#it would be quicker to just copy the metadata from the originating output dataset,
#but somewhat trickier (need to recurse up the copied_from tree), for now we'll call set_meta()
- if not self.external_output_metadata.external_metadata_set_successfully( dataset ):
+ if not self.external_output_metadata.external_metadata_set_successfully( dataset, self.sa_session ):
# Only set metadata values if they are missing...
dataset.set_meta( overwrite = False )
else:
@@ -563,7 +564,7 @@
# ERROR. The user will never see that the datasets are in error if
# they were flushed as OK here, since upon doing so, the history
# panel stops checking for updates. So allow the
- # mapping.context.current.flush() at the bottom of this method set
+ # self.sa_session.flush() at the bottom of this method set
# the state instead.
#dataset_assoc.dataset.dataset.flush()
@@ -596,7 +597,7 @@
# TODO
# validate output datasets
job.command_line = self.command_line
- mapping.context.current.flush()
+ self.sa_session.flush()
log.debug( 'job %d ended' % self.job_id )
self.cleanup()
@@ -619,7 +620,7 @@
return self.session_id
def get_input_fnames( self ):
- job = model.Job.get( self.job_id )
+ job = self.sa_session.query( model.Job ).get( self.job_id )
filenames = []
for da in job.input_datasets: #da is JobToInputDatasetAssociation object
if da.dataset:
@@ -646,7 +647,7 @@
else:
return self.false_path
- job = model.Job.get( self.job_id )
+ job = self.sa_session.query( model.Job ).get( self.job_id )
if self.app.config.outputs_to_working_directory:
self.output_paths = []
for name, data in [ ( da.name, da.dataset.dataset ) for da in job.output_datasets + job.output_library_datasets ]:
@@ -709,12 +710,12 @@
return sizes
def setup_external_metadata( self, exec_dir = None, tmp_dir = None, dataset_files_path = None, config_root = None, datatypes_config = None, **kwds ):
# extension could still be 'auto' if this is the upload tool.
- job = model.Job.get( self.job_id )
+ job = self.sa_session.query( model.Job ).get( self.job_id )
for output_dataset_assoc in job.output_datasets:
if output_dataset_assoc.dataset.ext == 'auto':
context = self.get_dataset_finish_context( dict(), output_dataset_assoc.dataset.dataset )
output_dataset_assoc.dataset.extension = context.get( 'ext', 'data' )
- mapping.context.current.flush()
+ self.sa_session.flush()
if tmp_dir is None:
#this dir should should relative to the exec_dir
tmp_dir = self.app.config.new_file_path
@@ -724,7 +725,14 @@
config_root = self.app.config.root
if datatypes_config is None:
datatypes_config = self.app.config.datatypes_config
- return self.external_output_metadata.setup_external_metadata( [ output_dataset_assoc.dataset for output_dataset_assoc in job.output_datasets ], exec_dir = exec_dir, tmp_dir = tmp_dir, dataset_files_path = dataset_files_path, config_root = config_root, datatypes_config = datatypes_config, **kwds )
+ return self.external_output_metadata.setup_external_metadata( [ output_dataset_assoc.dataset for output_dataset_assoc in job.output_datasets ],
+ self.sa_session,
+ exec_dir = exec_dir,
+ tmp_dir = tmp_dir,
+ dataset_files_path = dataset_files_path,
+ config_root = config_root,
+ datatypes_config = datatypes_config,
+ **kwds )
class DefaultJobDispatcher( object ):
def __init__( self, app ):
@@ -772,6 +780,7 @@
STOP_SIGNAL = object()
def __init__( self, app, dispatcher ):
self.app = app
+ self.sa_session = app.model.context
self.dispatcher = dispatcher
# Keep track of the pid that started the job manager, only it
@@ -821,8 +830,8 @@
pass
for job_id, error_msg in jobs:
- job = model.Job.get( job_id )
- job.refresh()
+ job = self.sa_session.query( model.Job ).get( job_id )
+ self.sa_session.refresh( job )
# if desired, error the job so we can inform the user.
if error_msg is not None:
job.state = job.states.ERROR
diff -r 20b319780138 -r 2c4ed83f76ef lib/galaxy/model/__init__.py
--- a/lib/galaxy/model/__init__.py Wed Oct 21 23:15:22 2009 -0400
+++ b/lib/galaxy/model/__init__.py Thu Oct 22 23:02:28 2009 -0400
@@ -1292,8 +1292,7 @@
if self.phone:
html = html + '<br/>' + 'Phone: ' + self.phone
return html
-
-
+
class Page( object ):
def __init__( self ):
self.id = None
@@ -1330,8 +1329,7 @@
def __str__ ( self ):
return "%s(item_id=%s, item_tag=%s, user_tname=%s, value=%s, user_value=%s)" % (self.__class__.__name__, self.item_id, self.tag_id, self.user_tname, self.value. self.user_value )
-
-
+
class HistoryTagAssociation ( ItemTagAssociation ):
pass
diff -r 20b319780138 -r 2c4ed83f76ef lib/galaxy/model/mapping_tests.py
--- a/lib/galaxy/model/mapping_tests.py Wed Oct 21 23:15:22 2009 -0400
+++ b/lib/galaxy/model/mapping_tests.py Thu Oct 22 23:02:28 2009 -0400
@@ -22,13 +22,13 @@
model.context.current.flush()
model.context.current.clear()
# Check
- users = model.User.query().all()
+ users = model.context.current.query( model.User ).all()
assert len( users ) == 1
assert users[0].email == "james@foo.bar.baz"
assert users[0].password == "password"
assert len( users[0].histories ) == 1
assert users[0].histories[0].name == "History 1"
- hists = model.History.query().all()
+ hists = model.context.current.query( model.History ).all()
assert hists[0].name == "History 1"
assert hists[1].name == ( "H" * 255 )
assert hists[0].user == users[0]
@@ -40,7 +40,7 @@
hists[1].name = "History 2b"
model.context.current.flush()
model.context.current.clear()
- hists = model.History.query().all()
+ hists = model.context.current.query( model.History ).all()
assert hists[0].name == "History 1"
assert hists[1].name == "History 2b"
# gvk TODO need to ad test for GalaxySessions, but not yet sure what they should look like.
diff -r 20b319780138 -r 2c4ed83f76ef lib/galaxy/model/migrate/versions/0005_cleanup_datasets_fix.py
--- a/lib/galaxy/model/migrate/versions/0005_cleanup_datasets_fix.py Wed Oct 21 23:15:22 2009 -0400
+++ b/lib/galaxy/model/migrate/versions/0005_cleanup_datasets_fix.py Thu Oct 22 23:02:28 2009 -0400
@@ -662,7 +662,7 @@
log.debug( "Fixing a discrepancy concerning deleted shared history items." )
affected_items = 0
start_time = time.time()
- for dataset in Dataset.filter( and_( Dataset.c.deleted == True, Dataset.c.purged == False ) ).all():
+ for dataset in context.query( Dataset ).filter( and_( Dataset.c.deleted == True, Dataset.c.purged == False ) ):
for dataset_instance in dataset.history_associations + dataset.library_associations:
if not dataset_instance.deleted:
dataset.deleted = False
@@ -679,7 +679,7 @@
dataset_by_filename = {}
changed_associations = 0
start_time = time.time()
- for dataset in Dataset.filter( Dataset.external_filename.like( '%dataset_%.dat' ) ).all():
+ for dataset in context.query( Dataset ).filter( Dataset.external_filename.like( '%dataset_%.dat' ) ):
if dataset.file_name in dataset_by_filename:
guessed_dataset = dataset_by_filename[ dataset.file_name ]
else:
diff -r 20b319780138 -r 2c4ed83f76ef lib/galaxy/model/orm/ext/assignmapper.py
--- a/lib/galaxy/model/orm/ext/assignmapper.py Wed Oct 21 23:15:22 2009 -0400
+++ b/lib/galaxy/model/orm/ext/assignmapper.py Thu Oct 22 23:02:28 2009 -0400
@@ -3,20 +3,12 @@
with some compatibility fixes. It assumes that the session is a ScopedSession,
and thus has the "mapper" method to attach contextual mappers to a class. It
adds additional query and session methods to the class to support the
-SQLAlchemy 0.3 style of access. The following methods which would normally be
-accessed through "Object.query().method()" are available directly through the
-object:
+SQLAlchemy 0.3 style of access.
- 'get', 'filter', 'filter_by', 'select', 'select_by',
- 'selectfirst', 'selectfirst_by', 'selectone', 'selectone_by',
- 'get_by', 'join_to', 'join_via', 'count', 'count_by',
- 'options', 'instances'
-
-Additionally, the following Session methods, which normally accept an instance
+The following Session methods, which normally accept an instance
or list of instances, are available directly through the objects, e.g.
"Session.flush( [instance] )" can be performed as "instance.flush()":
- 'refresh', 'expire', 'delete', 'expunge', 'update'
"""
__all__ = [ 'assign_mapper' ]
@@ -24,17 +16,6 @@
from sqlalchemy import util, exceptions
import types
from sqlalchemy.orm import mapper, Query
-
-def _monkeypatch_query_method( name, session, class_ ):
- def do(self, *args, **kwargs):
- ## util.warn_deprecated('Query methods on the class are deprecated; use %s.query.%s instead' % (class_.__name__, name))
- return getattr( class_.query, name)(*args, **kwargs)
- try:
- do.__name__ = name
- except:
- pass
- if not hasattr(class_, name):
- setattr(class_, name, classmethod(do))
def _monkeypatch_session_method(name, session, class_, make_list=False):
def do(self, *args, **kwargs):
@@ -50,13 +31,6 @@
def assign_mapper( session, class_, *args, **kwargs ):
m = class_.mapper = session.mapper( class_, *args, **kwargs )
- for name in ('get', 'filter', 'filter_by', 'select', 'select_by',
- 'selectfirst', 'selectfirst_by', 'selectone', 'selectone_by',
- 'get_by', 'join_to', 'join_via', 'count', 'count_by',
- 'options', 'instances'):
- _monkeypatch_query_method(name, session, class_)
- for name in ('refresh', 'expire', 'delete', 'expunge', 'update'):
- _monkeypatch_session_method(name, session, class_)
for name in ( 'flush', ):
_monkeypatch_session_method( name, session, class_, make_list=True )
return m
diff -r 20b319780138 -r 2c4ed83f76ef lib/galaxy/security/__init__.py
--- a/lib/galaxy/security/__init__.py Wed Oct 21 23:15:22 2009 -0400
+++ b/lib/galaxy/security/__init__.py Thu Oct 22 23:02:28 2009 -0400
@@ -205,8 +205,10 @@
self.associate_components( role=role, user=user )
return role
def get_private_user_role( self, user, auto_create=False ):
- role = self.model.Role.filter( and_( self.model.Role.table.c.name == user.email,
- self.model.Role.table.c.type == self.model.Role.types.PRIVATE ) ).first()
+ role = self.sa_session.query( self.model.Role ) \
+ .filter( and_( self.model.Role.table.c.name == user.email,
+ self.model.Role.table.c.type == self.model.Role.types.PRIVATE ) ) \
+ .first()
if not role:
if auto_create:
return self.create_private_user_role( user )
@@ -225,7 +227,7 @@
permissions[ self.permitted_actions.DATASET_ACCESS ] = permissions.values()[ 0 ]
# Delete all of the current default permissions for the user
for dup in user.default_permissions:
- dup.delete()
+ self.sa_session.delete( dup )
dup.flush()
# Add the new default permissions for the user
for action, roles in permissions.items():
@@ -255,7 +257,7 @@
permissions = self.user_get_default_permissions( user )
# Delete all of the current default permission for the history
for dhp in history.default_permissions:
- dhp.delete()
+ self.sa_session.delete( dhp )
dhp.flush()
# Add the new default permissions for the history
for action, roles in permissions.items():
@@ -293,7 +295,7 @@
# TODO: If setting ACCESS permission, at least 1 user must have every role associated with this dataset,
# or the dataset is inaccessible. See admin/library_dataset_dataset_association()
for dp in dataset.actions:
- dp.delete()
+ self.sa_session.delete( dp )
dp.flush()
# Add the new permissions on the dataset
for action, roles in permissions.items():
@@ -314,7 +316,7 @@
# Delete the current specific permission on the dataset if one exists
for dp in dataset.actions:
if dp.action == action:
- dp.delete()
+ self.sa_session.delete( dp )
dp.flush()
# Add the new specific permission on the dataset
for dp in [ self.model.DatasetPermissions( action, dataset, role ) for role in roles ]:
@@ -328,7 +330,7 @@
# other actions ( 'manage permissions', 'edit metadata' ) are irrelevant.
for dp in dataset.actions:
if dp.action == self.permitted_actions.DATASET_ACCESS.action:
- dp.delete()
+ self.sa_session.delete( dp )
dp.flush()
def get_dataset_permissions( self, dataset ):
"""
@@ -379,7 +381,7 @@
def set_all_library_permissions( self, library_item, permissions={} ):
# Set new permissions on library_item, eliminating all current permissions
for role_assoc in library_item.actions:
- role_assoc.delete()
+ self.sa_session.delete( role_assoc )
role_assoc.flush()
# Add the new permissions on library_item
for item_class, permission_class in self.library_item_assocs:
@@ -472,9 +474,9 @@
for user in users:
if delete_existing_assocs:
for a in user.non_private_roles + user.groups:
- a.delete()
+ self.sa_session.delete( a )
a.flush()
- user.refresh()
+ self.sa_session.refresh( user )
for role in roles:
# Make sure we are not creating an additional association with a PRIVATE role
if role not in user.roles:
@@ -485,7 +487,7 @@
for group in groups:
if delete_existing_assocs:
for a in group.roles + group.users:
- a.delete()
+ self.sa_session.delete( a )
a.flush()
for role in roles:
self.associate_components( group=group, role=role )
@@ -495,7 +497,7 @@
for role in roles:
if delete_existing_assocs:
for a in role.users + role.groups:
- a.delete()
+ self.sa_session.delete( a )
a.flush()
for user in users:
self.associate_components( user=user, role=role )
@@ -505,15 +507,15 @@
assert len( kwd ) == 2, 'You must specify exactly 2 Galaxy security components to check for associations.'
if 'dataset' in kwd:
if 'action' in kwd:
- return self.model.DatasetPermissions.filter_by( action = kwd['action'].action, dataset_id = kwd['dataset'].id ).first()
+ return self.sa_session.query( self.model.DatasetPermissions ).filter_by( action = kwd['action'].action, dataset_id = kwd['dataset'].id ).first()
elif 'user' in kwd:
if 'group' in kwd:
- return self.model.UserGroupAssociation.filter_by( group_id = kwd['group'].id, user_id = kwd['user'].id ).first()
+ return self.sa_session.query( self.model.UserGroupAssociation ).filter_by( group_id = kwd['group'].id, user_id = kwd['user'].id ).first()
elif 'role' in kwd:
- return self.model.UserRoleAssociation.filter_by( role_id = kwd['role'].id, user_id = kwd['user'].id ).first()
+ return self.sa_session.query( self.model.UserRoleAssociation ).filter_by( role_id = kwd['role'].id, user_id = kwd['user'].id ).first()
elif 'group' in kwd:
if 'role' in kwd:
- return self.model.GroupRoleAssociation.filter_by( role_id = kwd['role'].id, group_id = kwd['group'].id ).first()
+ return self.sa_session.query( self.model.GroupRoleAssociation ).filter_by( role_id = kwd['role'].id, group_id = kwd['group'].id ).first()
raise 'No valid method of associating provided components: %s' % kwd
def check_folder_contents( self, user, roles, folder, hidden_folder_ids='' ):
"""
@@ -526,11 +528,11 @@
"""
action = self.permitted_actions.DATASET_ACCESS
lddas = self.sa_session.query( self.model.LibraryDatasetDatasetAssociation ) \
- .join( "library_dataset" ) \
- .filter( self.model.LibraryDataset.folder == folder ) \
- .join( "dataset" ) \
- .options( eagerload_all( "dataset.actions" ) ) \
- .all()
+ .join( "library_dataset" ) \
+ .filter( self.model.LibraryDataset.folder == folder ) \
+ .join( "dataset" ) \
+ .options( eagerload_all( "dataset.actions" ) ) \
+ .all()
for ldda in lddas:
ldda_access_permissions = self.get_item_actions( action, ldda.dataset )
if not ldda_access_permissions:
@@ -573,7 +575,8 @@
if action == self.permitted_actions.DATASET_ACCESS and action.action not in [ dp.action for dp in hda.dataset.actions ]:
log.debug( 'Allowing access to public dataset with hda: %i.' % hda.id )
return True # dataset has no roles associated with the access permission, thus is already public
- hdadaa = self.model.HistoryDatasetAssociationDisplayAtAuthorization.filter_by( history_dataset_association_id = hda.id ).first()
+ hdadaa = self.sa_session.query( self.model.HistoryDatasetAssociationDisplayAtAuthorization ) \
+ .filter_by( history_dataset_association_id = hda.id ).first()
if not hdadaa:
log.debug( 'Denying access to private dataset with hda: %i. No hdadaa record for this dataset.' % hda.id )
return False # no auth
@@ -602,7 +605,8 @@
else:
raise 'The dataset access permission is the only valid permission in the host security agent.'
def set_dataset_permissions( self, hda, user, site ):
- hdadaa = self.model.HistoryDatasetAssociationDisplayAtAuthorization.filter_by( history_dataset_association_id = hda.id ).first()
+ hdadaa = self.sa_session.query( self.model.HistoryDatasetAssociationDisplayAtAuthorization ) \
+ .filter_by( history_dataset_association_id = hda.id ).first()
if hdadaa:
hdadaa.update_time = datetime.utcnow()
else:
diff -r 20b319780138 -r 2c4ed83f76ef lib/galaxy/tags/tag_handler.py
--- a/lib/galaxy/tags/tag_handler.py Wed Oct 21 23:15:22 2009 -0400
+++ b/lib/galaxy/tags/tag_handler.py Thu Oct 22 23:02:28 2009 -0400
@@ -21,7 +21,7 @@
def get_tag_assoc_class(self, entity_class):
return self.tag_assoc_classes[entity_class]
- def remove_item_tag(self, item, tag_name):
+ def remove_item_tag( self, trans, item, tag_name ):
"""Remove a tag from an item."""
# Get item tag association.
item_tag_assoc = self._get_item_tag_assoc(item, tag_name)
@@ -29,17 +29,17 @@
# Remove association.
if item_tag_assoc:
# Delete association.
- item_tag_assoc.delete()
+ trans.sa_session.delete( item_tag_assoc )
item.tags.remove(item_tag_assoc)
return True
return False
- def delete_item_tags(self, item):
+ def delete_item_tags( self, trans, item ):
"""Delete tags from an item."""
# Delete item-tag associations.
for tag in item.tags:
- tag.delete()
+ trans.sa_session.delete( tag )
# Delete tags from item.
del item.tags[:]
diff -r 20b319780138 -r 2c4ed83f76ef lib/galaxy/tools/__init__.py
--- a/lib/galaxy/tools/__init__.py Wed Oct 21 23:15:22 2009 -0400
+++ b/lib/galaxy/tools/__init__.py Thu Oct 22 23:02:28 2009 -0400
@@ -167,7 +167,7 @@
which is encoded in the tool panel.
"""
id = self.app.security.decode_id( workflow_id )
- stored = self.app.model.StoredWorkflow.get( id )
+ stored = self.app.model.context.query( self.app.model.StoredWorkflow ).get( id )
return stored.latest_workflow
class ToolSection( object ):
@@ -863,7 +863,7 @@
if 'async_datasets' in inputs and inputs['async_datasets'] not in [ 'None', '', None ]:
for id in inputs['async_datasets'].split(','):
try:
- data = trans.model.HistoryDatasetAssociation.get( int( id ) )
+ data = trans.sa_session.query( trans.model.HistoryDatasetAssociation ).get( int( id ) )
except:
log.exception( 'Unable to load precreated dataset (%s) sent in upload form' % id )
continue
@@ -1567,7 +1567,7 @@
def exec_after_process( self, app, inp_data, out_data, param_dict, job = None ):
for name, dataset in inp_data.iteritems():
external_metadata = galaxy.datatypes.metadata.JobExternalOutputMetadataWrapper( job )
- if external_metadata.external_metadata_set_successfully( dataset ):
+ if external_metadata.external_metadata_set_successfully( dataset, app.model.context ):
dataset.metadata.from_JSON_dict( external_metadata.get_output_filenames_by_dataset( dataset ).filename_out )
# If setting external metadata has failed, how can we inform the user?
# For now, we'll leave the default metadata and set the state back to its original.
diff -r 20b319780138 -r 2c4ed83f76ef lib/galaxy/tools/actions/__init__.py
--- a/lib/galaxy/tools/actions/__init__.py Wed Oct 21 23:15:22 2009 -0400
+++ b/lib/galaxy/tools/actions/__init__.py Thu Oct 22 23:02:28 2009 -0400
@@ -32,7 +32,8 @@
def visitor( prefix, input, value, parent = None ):
def process_dataset( data ):
if data and not isinstance( data.datatype, input.formats ):
- data.refresh() #need to refresh in case this conversion just took place, i.e. input above in tool performed the same conversion
+ # Need to refresh in case this conversion just took place, i.e. input above in tool performed the same conversion
+ trans.sa_session.refresh( data )
target_ext, converted_dataset = data.find_conversion_destination( input.formats, converter_safe = input.converter_safe( param_values, trans ) )
if target_ext:
if converted_dataset:
@@ -172,7 +173,7 @@
# this happens i.e. as a result of the async controller
if name in incoming:
dataid = incoming[name]
- data = trans.app.model.HistoryDatasetAssociation.get( dataid )
+ data = trans.sa_session.query( trans.app.model.HistoryDatasetAssociation ).get( dataid )
assert data != None
out_data[name] = data
else:
diff -r 20b319780138 -r 2c4ed83f76ef lib/galaxy/tools/actions/metadata.py
--- a/lib/galaxy/tools/actions/metadata.py Wed Oct 21 23:15:22 2009 -0400
+++ b/lib/galaxy/tools/actions/metadata.py Thu Oct 22 23:02:28 2009 -0400
@@ -29,9 +29,18 @@
job.flush() #ensure job.id is available
#add parameters to job_parameter table
- incoming[ '__ORIGINAL_DATASET_STATE__' ] = dataset.state #store original dataset state, so we can restore it. A seperate table might be better (no chance of 'loosing' the original state)?
+ # Store original dataset state, so we can restore it. A separate table might be better (no chance of 'losing' the original state)?
+ incoming[ '__ORIGINAL_DATASET_STATE__' ] = dataset.state
external_metadata_wrapper = JobExternalOutputMetadataWrapper( job )
- cmd_line = external_metadata_wrapper.setup_external_metadata( dataset, exec_dir = None, tmp_dir = trans.app.config.new_file_path, dataset_files_path = trans.app.model.Dataset.file_path, output_fnames = None, config_root = None, datatypes_config = None, kwds = { 'overwrite' : True } )
+ cmd_line = external_metadata_wrapper.setup_external_metadata( dataset,
+ trans.sa_session,
+ exec_dir = None,
+ tmp_dir = trans.app.config.new_file_path,
+ dataset_files_path = trans.app.model.Dataset.file_path,
+ output_fnames = None,
+ config_root = None,
+ datatypes_config = None,
+ kwds = { 'overwrite' : True } )
incoming[ '__SET_EXTERNAL_METADATA_COMMAND_LINE__' ] = cmd_line
for name, value in tool.params_to_strings( incoming, trans.app ).iteritems():
job.add_parameter( name, value )
diff -r 20b319780138 -r 2c4ed83f76ef lib/galaxy/tools/actions/upload_common.py
--- a/lib/galaxy/tools/actions/upload_common.py Wed Oct 21 23:15:22 2009 -0400
+++ b/lib/galaxy/tools/actions/upload_common.py Thu Oct 22 23:02:28 2009 -0400
@@ -41,12 +41,12 @@
# See if we have any template field contents
library_bunch.template_field_contents = []
template_id = params.get( 'template_id', None )
- library_bunch.folder = trans.app.model.LibraryFolder.get( folder_id )
+ library_bunch.folder = trans.sa_session.query( trans.app.model.LibraryFolder ).get( folder_id )
# We are inheriting the folder's info_association, so we did not
# receive any inherited contents, but we may have redirected here
# after the user entered template contents ( due to errors ).
if template_id not in [ None, 'None' ]:
- library_bunch.template = trans.app.model.FormDefinition.get( template_id )
+ library_bunch.template = trans.sa_session.query( trans.app.model.FormDefinition ).get( template_id )
for field_index in range( len( library_bunch.template.fields ) ):
field_name = 'field_%i' % field_index
if params.get( field_name, False ):
@@ -56,7 +56,8 @@
library_bunch.template = None
library_bunch.roles = []
for role_id in util.listify( params.get( 'roles', [] ) ):
- library_bunch.roles.append( trans.app.model.Role.get( role_id ) )
+ role = trans.sa_session.query( trans.app.model.Role ).get( role_id )
+ library_bunch.roles.append( role )
return library_bunch
def get_precreated_datasets( trans, params, data_obj, controller='root' ):
@@ -132,7 +133,7 @@
if uploaded_dataset.get( 'in_folder', False ):
# Create subfolders if desired
for name in uploaded_dataset.in_folder.split( os.path.sep ):
- folder.refresh()
+ trans.sa_session.refresh( folder )
matches = filter( lambda x: x.name == name, active_folders( trans, folder ) )
if matches:
folder = matches[0]
diff -r 20b319780138 -r 2c4ed83f76ef lib/galaxy/tools/parameters/basic.py
--- a/lib/galaxy/tools/parameters/basic.py Wed Oct 21 23:15:22 2009 -0400
+++ b/lib/galaxy/tools/parameters/basic.py Thu Oct 22 23:02:28 2009 -0400
@@ -1239,7 +1239,7 @@
elif isinstance( value, trans.app.model.HistoryDatasetAssociation ):
return value
else:
- return trans.app.model.HistoryDatasetAssociation.get( value )
+ return trans.sa_session.query( trans.app.model.HistoryDatasetAssociation ).get( value )
def to_string( self, value, app ):
if value is None or isinstance( value, str ):
@@ -1253,7 +1253,7 @@
# indicates that the dataset is optional, while '' indicates that it is not.
if value is None or value == '' or value == 'None':
return value
- return app.model.HistoryDatasetAssociation.get( int( value ) )
+ return app.model.context.query( app.model.HistoryDatasetAssociation ).get( int( value ) )
def to_param_dict_string( self, value, other_values={} ):
if value is None: return "None"
diff -r 20b319780138 -r 2c4ed83f76ef lib/galaxy/web/controllers/admin.py
--- a/lib/galaxy/web/controllers/admin.py Wed Oct 21 23:15:22 2009 -0400
+++ b/lib/galaxy/web/controllers/admin.py Thu Oct 22 23:02:28 2009 -0400
@@ -119,9 +119,9 @@
params = util.Params( kwd )
msg = util.restore_text( params.get( 'msg', '' ) )
messagetype = params.get( 'messagetype', 'done' )
- roles = trans.app.model.Role.filter( and_( trans.app.model.Role.table.c.deleted==False,
- trans.app.model.Role.table.c.type != trans.app.model.Role.types.PRIVATE ) ) \
- .order_by( trans.app.model.Role.table.c.name ).all()
+ roles = trans.sa_session.query( trans.app.model.Role ).filter( and_( trans.app.model.Role.table.c.deleted==False,
+ trans.app.model.Role.table.c.type != trans.app.model.Role.types.PRIVATE ) ) \
+ .order_by( trans.app.model.Role.table.c.name )
return trans.fill_template( '/admin/dataset_security/roles.mako',
roles=roles,
msg=msg,
@@ -140,18 +140,18 @@
create_group_for_role = params.get( 'create_group_for_role', 'no' )
if not name or not description:
msg = "Enter a valid name and a description"
- elif trans.app.model.Role.filter( trans.app.model.Role.table.c.name==name ).first():
+ elif trans.sa_session.query( trans.app.model.Role ).filter( trans.app.model.Role.table.c.name==name ).first():
msg = "A role with that name already exists"
else:
# Create the role
role = trans.app.model.Role( name=name, description=description, type=trans.app.model.Role.types.ADMIN )
role.flush()
# Create the UserRoleAssociations
- for user in [ trans.app.model.User.get( x ) for x in in_users ]:
+ for user in [ trans.sa_session.query( trans.app.model.User ).get( x ) for x in in_users ]:
ura = trans.app.model.UserRoleAssociation( user, role )
ura.flush()
# Create the GroupRoleAssociations
- for group in [ trans.app.model.Group.get( x ) for x in in_groups ]:
+ for group in [ trans.sa_session.query( trans.app.model.Group ).get( x ) for x in in_groups ]:
gra = trans.app.model.GroupRoleAssociation( group, role )
gra.flush()
if create_group_for_role == 'yes':
@@ -165,10 +165,14 @@
trans.response.send_redirect( web.url_for( controller='admin', action='roles', msg=util.sanitize_text( msg ), messagetype='done' ) )
trans.response.send_redirect( web.url_for( controller='admin', action='create_role', msg=util.sanitize_text( msg ), messagetype='error' ) )
out_users = []
- for user in trans.app.model.User.filter( trans.app.model.User.table.c.deleted==False ).order_by( trans.app.model.User.table.c.email ).all():
+ for user in trans.sa_session.query( trans.app.model.User ) \
+ .filter( trans.app.model.User.table.c.deleted==False ) \
+ .order_by( trans.app.model.User.table.c.email ):
out_users.append( ( user.id, user.email ) )
out_groups = []
- for group in trans.app.model.Group.filter( trans.app.model.Group.table.c.deleted==False ).order_by( trans.app.model.Group.table.c.name ).all():
+ for group in trans.sa_session.query( trans.app.model.Group ) \
+ .filter( trans.app.model.Group.table.c.deleted==False ) \
+ .order_by( trans.app.model.Group.table.c.name ):
out_groups.append( ( group.id, group.name ) )
return trans.fill_template( '/admin/dataset_security/role_create.mako',
in_users=[],
@@ -183,26 +187,26 @@
params = util.Params( kwd )
msg = util.restore_text( params.get( 'msg', '' ) )
messagetype = params.get( 'messagetype', 'done' )
- role = trans.app.model.Role.get( int( params.role_id ) )
+ role = trans.sa_session.query( trans.app.model.Role ).get( int( params.role_id ) )
if params.get( 'role_members_edit_button', False ):
- in_users = [ trans.app.model.User.get( x ) for x in util.listify( params.in_users ) ]
+ in_users = [ trans.sa_session.query( trans.app.model.User ).get( x ) for x in util.listify( params.in_users ) ]
for ura in role.users:
- user = trans.app.model.User.get( ura.user_id )
+ user = trans.sa_session.query( trans.app.model.User ).get( ura.user_id )
if user not in in_users:
# Delete DefaultUserPermissions for previously associated users that have been removed from the role
for dup in user.default_permissions:
if role == dup.role:
- dup.delete()
+ trans.sa_session.delete( dup )
dup.flush()
# Delete DefaultHistoryPermissions for previously associated users that have been removed from the role
for history in user.histories:
for dhp in history.default_permissions:
if role == dhp.role:
- dhp.delete()
+ trans.sa_session.delete( dhp )
dhp.flush()
- in_groups = [ trans.app.model.Group.get( x ) for x in util.listify( params.in_groups ) ]
+ in_groups = [ trans.sa_session.query( trans.app.model.Group ).get( x ) for x in util.listify( params.in_groups ) ]
trans.app.security_agent.set_entity_role_associations( roles=[ role ], users=in_users, groups=in_groups )
- role.refresh()
+ trans.sa_session.refresh( role )
msg = "Role '%s' has been updated with %d associated users and %d associated groups" % ( role.name, len( in_users ), len( in_groups ) )
trans.response.send_redirect( web.url_for( action='roles', msg=util.sanitize_text( msg ), messagetype=messagetype ) )
elif params.get( 'rename', False ):
@@ -213,7 +217,7 @@
if not new_name:
msg = 'Enter a valid name'
return trans.fill_template( '/admin/dataset_security/role_rename.mako', role=role, msg=msg, messagetype='error' )
- elif trans.app.model.Role.filter( trans.app.model.Role.table.c.name==new_name ).first():
+ elif trans.sa_session.query( trans.app.model.Role ).filter( trans.app.model.Role.table.c.name==new_name ).first():
msg = 'A role with that name already exists'
return trans.fill_template( '/admin/dataset_security/role_rename.mako', role=role, msg=msg, messagetype='error' )
else:
@@ -227,12 +231,16 @@
out_users = []
in_groups = []
out_groups = []
- for user in trans.app.model.User.filter( trans.app.model.User.table.c.deleted==False ).order_by( trans.app.model.User.table.c.email ).all():
+ for user in trans.sa_session.query( trans.app.model.User ) \
+ .filter( trans.app.model.User.table.c.deleted==False ) \
+ .order_by( trans.app.model.User.table.c.email ):
if user in [ x.user for x in role.users ]:
in_users.append( ( user.id, user.email ) )
else:
out_users.append( ( user.id, user.email ) )
- for group in trans.app.model.Group.filter( trans.app.model.Group.table.c.deleted==False ).order_by( trans.app.model.Group.table.c.name ).all():
+ for group in trans.sa_session.query( trans.app.model.Group ) \
+ .filter( trans.app.model.Group.table.c.deleted==False ) \
+ .order_by( trans.app.model.Group.table.c.name ):
if group in [ x.group for x in role.groups ]:
in_groups.append( ( group.id, group.name ) )
else:
@@ -242,9 +250,8 @@
# [ ( LibraryDatasetDatasetAssociation [ action, action ] ) ]
library_dataset_actions = {}
for dp in role.dataset_actions:
- for ldda in trans.app.model.LibraryDatasetDatasetAssociation \
- .filter( trans.app.model.LibraryDatasetDatasetAssociation.dataset_id==dp.dataset_id ) \
- .all():
+ for ldda in trans.sa_session.query( trans.app.model.LibraryDatasetDatasetAssociation ) \
+ .filter( trans.app.model.LibraryDatasetDatasetAssociation.dataset_id==dp.dataset_id ):
root_found = False
folder_path = ''
folder = ldda.library_dataset.folder
@@ -255,7 +262,9 @@
else:
folder = folder.parent
folder_path = '%s %s' % ( folder_path, ldda.name )
- library = trans.app.model.Library.filter( trans.app.model.Library.table.c.root_folder_id == folder.id ).first()
+ library = trans.sa_session.query( trans.app.model.Library ) \
+ .filter( trans.app.model.Library.table.c.root_folder_id == folder.id ) \
+ .first()
if library not in library_dataset_actions:
library_dataset_actions[ library ] = {}
try:
@@ -275,7 +284,7 @@
@web.require_admin
def mark_role_deleted( self, trans, **kwd ):
params = util.Params( kwd )
- role = trans.app.model.Role.get( int( params.role_id ) )
+ role = trans.sa_session.query( trans.app.model.Role ).get( int( params.role_id ) )
role.deleted = True
role.flush()
msg = "Role '%s' has been marked as deleted." % role.name
@@ -286,10 +295,9 @@
params = util.Params( kwd )
msg = util.restore_text( params.get( 'msg', '' ) )
messagetype = params.get( 'messagetype', 'done' )
- roles = trans.app.model.Role.query() \
- .filter( trans.app.model.Role.table.c.deleted==True ) \
- .order_by( trans.app.model.Role.table.c.name ) \
- .all()
+ roles = trans.sa_session.query( trans.app.model.Role ) \
+ .filter( trans.app.model.Role.table.c.deleted==True ) \
+ .order_by( trans.app.model.Role.table.c.name )
return trans.fill_template( '/admin/dataset_security/deleted_roles.mako',
roles=roles,
msg=msg,
@@ -298,7 +306,7 @@
@web.require_admin
def undelete_role( self, trans, **kwd ):
params = util.Params( kwd )
- role = trans.app.model.Role.get( int( params.role_id ) )
+ role = trans.sa_session.query( trans.app.model.Role ).get( int( params.role_id ) )
role.deleted = False
role.flush()
msg = "Role '%s' has been marked as not deleted." % role.name
@@ -314,34 +322,34 @@
# - GroupRoleAssociations where role_id == Role.id
# - DatasetPermissionss where role_id == Role.id
params = util.Params( kwd )
- role = trans.app.model.Role.get( int( params.role_id ) )
+ role = trans.sa_session.query( trans.app.model.Role ).get( int( params.role_id ) )
if not role.deleted:
# We should never reach here, but just in case there is a bug somewhere...
msg = "Role '%s' has not been deleted, so it cannot be purged." % role.name
trans.response.send_redirect( web.url_for( action='roles', msg=util.sanitize_text( msg ), messagetype='error' ) )
# Delete UserRoleAssociations
for ura in role.users:
- user = trans.app.model.User.get( ura.user_id )
+ user = trans.sa_session.query( trans.app.model.User ).get( ura.user_id )
# Delete DefaultUserPermissions for associated users
for dup in user.default_permissions:
if role == dup.role:
- dup.delete()
+ trans.sa_session.delete( dup )
dup.flush()
# Delete DefaultHistoryPermissions for associated users
for history in user.histories:
for dhp in history.default_permissions:
if role == dhp.role:
- dhp.delete()
+ trans.sa_session.delete( dhp )
dhp.flush()
- ura.delete()
+ trans.sa_session.delete( ura )
ura.flush()
# Delete GroupRoleAssociations
for gra in role.groups:
- gra.delete()
+ trans.sa_session.delete( gra )
gra.flush()
# Delete DatasetPermissionss
for dp in role.dataset_actions:
- dp.delete()
+ trans.sa_session.delete( dp )
dp.flush()
msg = "The following have been purged from the database for role '%s': " % role.name
msg += "DefaultUserPermissions, DefaultHistoryPermissions, UserRoleAssociations, GroupRoleAssociations, DatasetPermissionss."
@@ -354,10 +362,9 @@
params = util.Params( kwd )
msg = util.restore_text( params.get( 'msg', '' ) )
messagetype = params.get( 'messagetype', 'done' )
- groups = trans.app.model.Group.query() \
- .filter( trans.app.model.Group.table.c.deleted==False ) \
- .order_by( trans.app.model.Group.table.c.name ) \
- .all()
+ groups = trans.sa_session.query( trans.app.model.Group ) \
+ .filter( trans.app.model.Group.table.c.deleted==False ) \
+ .order_by( trans.app.model.Group.table.c.name )
return trans.fill_template( '/admin/dataset_security/groups.mako',
groups=groups,
msg=msg,
@@ -368,12 +375,12 @@
params = util.Params( kwd )
msg = util.restore_text( params.get( 'msg', '' ) )
messagetype = params.get( 'messagetype', 'done' )
- group = trans.app.model.Group.get( int( params.group_id ) )
+ group = trans.sa_session.query( trans.app.model.Group ).get( int( params.group_id ) )
if params.get( 'group_roles_users_edit_button', False ):
- in_roles = [ trans.app.model.Role.get( x ) for x in util.listify( params.in_roles ) ]
- in_users = [ trans.app.model.User.get( x ) for x in util.listify( params.in_users ) ]
+ in_roles = [ trans.sa_session.query( trans.app.model.Role ).get( x ) for x in util.listify( params.in_roles ) ]
+ in_users = [ trans.sa_session.query( trans.app.model.User ).get( x ) for x in util.listify( params.in_users ) ]
trans.app.security_agent.set_entity_group_associations( groups=[ group ], roles=in_roles, users=in_users )
- group.refresh()
+ trans.sa_session.refresh( group )
msg += "Group '%s' has been updated with %d associated roles and %d associated users" % ( group.name, len( in_roles ), len( in_users ) )
trans.response.send_redirect( web.url_for( action='groups', msg=util.sanitize_text( msg ), messagetype=messagetype ) )
if params.get( 'rename', False ):
@@ -383,7 +390,7 @@
if not new_name:
msg = 'Enter a valid name'
return trans.fill_template( '/admin/dataset_security/group_rename.mako', group=group, msg=msg, messagetype='error' )
- elif trans.app.model.Group.filter( trans.app.model.Group.table.c.name==new_name ).first():
+ elif trans.sa_session.query( trans.app.model.Group ).filter( trans.app.model.Group.table.c.name==new_name ).first():
msg = 'A group with that name already exists'
return trans.fill_template( '/admin/dataset_security/group_rename.mako', group=group, msg=msg, messagetype='error' )
else:
@@ -396,12 +403,16 @@
out_roles = []
in_users = []
out_users = []
- for role in trans.app.model.Role.filter( trans.app.model.Role.table.c.deleted==False ).order_by( trans.app.model.Role.table.c.name ).all():
+ for role in trans.sa_session.query(trans.app.model.Role ) \
+ .filter( trans.app.model.Role.table.c.deleted==False ) \
+ .order_by( trans.app.model.Role.table.c.name ):
if role in [ x.role for x in group.roles ]:
in_roles.append( ( role.id, role.name ) )
else:
out_roles.append( ( role.id, role.name ) )
- for user in trans.app.model.User.filter( trans.app.model.User.table.c.deleted==False ).order_by( trans.app.model.User.table.c.email ).all():
+ for user in trans.sa_session.query( trans.app.model.User ) \
+ .filter( trans.app.model.User.table.c.deleted==False ) \
+ .order_by( trans.app.model.User.table.c.email ):
if user in [ x.user for x in group.users ]:
in_users.append( ( user.id, user.email ) )
else:
@@ -427,28 +438,32 @@
in_roles = util.listify( params.get( 'in_roles', [] ) )
if not name:
msg = "Enter a valid name"
- elif trans.app.model.Group.filter( trans.app.model.Group.table.c.name==name ).first():
+ elif trans.sa_session.query( trans.app.model.Group ).filter( trans.app.model.Group.table.c.name==name ).first():
msg = "A group with that name already exists"
else:
# Create the group
group = trans.app.model.Group( name=name )
group.flush()
# Create the UserRoleAssociations
- for user in [ trans.app.model.User.get( x ) for x in in_users ]:
+ for user in [ trans.sa_session.query( trans.app.model.User ).get( x ) for x in in_users ]:
uga = trans.app.model.UserGroupAssociation( user, group )
uga.flush()
# Create the GroupRoleAssociations
- for role in [ trans.app.model.Role.get( x ) for x in in_roles ]:
+ for role in [ trans.sa_session.query( trans.app.model.Role ).get( x ) for x in in_roles ]:
gra = trans.app.model.GroupRoleAssociation( group, role )
gra.flush()
msg = "Group '%s' has been created with %d associated users and %d associated roles" % ( name, len( in_users ), len( in_roles ) )
trans.response.send_redirect( web.url_for( controller='admin', action='groups', msg=util.sanitize_text( msg ), messagetype='done' ) )
trans.response.send_redirect( web.url_for( controller='admin', action='create_group', msg=util.sanitize_text( msg ), messagetype='error' ) )
out_users = []
- for user in trans.app.model.User.filter( trans.app.model.User.table.c.deleted==False ).order_by( trans.app.model.User.table.c.email ).all():
+ for user in trans.sa_session.query( trans.app.model.User ) \
+ .filter( trans.app.model.User.table.c.deleted==False ) \
+ .order_by( trans.app.model.User.table.c.email ):
out_users.append( ( user.id, user.email ) )
out_roles = []
- for role in trans.app.model.Role.filter( trans.app.model.Role.table.c.deleted==False ).order_by( trans.app.model.Role.table.c.name ).all():
+ for role in trans.sa_session.query( trans.app.model.Role ) \
+ .filter( trans.app.model.Role.table.c.deleted==False ) \
+ .order_by( trans.app.model.Role.table.c.name ):
out_roles.append( ( role.id, role.name ) )
return trans.fill_template( '/admin/dataset_security/group_create.mako',
in_users=[],
@@ -461,7 +476,7 @@
@web.require_admin
def mark_group_deleted( self, trans, **kwd ):
params = util.Params( kwd )
- group = trans.app.model.Group.get( int( params.group_id ) )
+ group = trans.sa_session.query( trans.app.model.Group ).get( int( params.group_id ) )
group.deleted = True
group.flush()
msg = "Group '%s' has been marked as deleted." % group.name
@@ -472,10 +487,9 @@
params = util.Params( kwd )
msg = util.restore_text( params.get( 'msg', '' ) )
messagetype = params.get( 'messagetype', 'done' )
- groups = trans.app.model.Group.query() \
- .filter( trans.app.model.Group.table.c.deleted==True ) \
- .order_by( trans.app.model.Group.table.c.name ) \
- .all()
+ groups = trans.sa_session.query( trans.app.model.Group ) \
+ .filter( trans.app.model.Group.table.c.deleted==True ) \
+ .order_by( trans.app.model.Group.table.c.name )
return trans.fill_template( '/admin/dataset_security/deleted_groups.mako',
groups=groups,
msg=msg,
@@ -484,7 +498,7 @@
@web.require_admin
def undelete_group( self, trans, **kwd ):
params = util.Params( kwd )
- group = trans.app.model.Group.get( int( params.group_id ) )
+ group = trans.sa_session.query( trans.app.model.Group ).get( int( params.group_id ) )
group.deleted = False
group.flush()
msg = "Group '%s' has been marked as not deleted." % group.name
@@ -495,18 +509,18 @@
# This method should only be called for a Group that has previously been deleted.
# Purging a deleted Group simply deletes all UserGroupAssociations and GroupRoleAssociations.
params = util.Params( kwd )
- group = trans.app.model.Group.get( int( params.group_id ) )
+ group = trans.sa_session.query( trans.app.model.Group ).get( int( params.group_id ) )
if not group.deleted:
# We should never reach here, but just in case there is a bug somewhere...
msg = "Group '%s' has not been deleted, so it cannot be purged." % group.name
trans.response.send_redirect( web.url_for( action='groups', msg=util.sanitize_text( msg ), messagetype='error' ) )
# Delete UserGroupAssociations
for uga in group.users:
- uga.delete()
+ trans.sa_session.delete( uga )
uga.flush()
# Delete GroupRoleAssociations
for gra in group.roles:
- gra.delete()
+ trans.sa_session.delete( gra )
gra.flush()
# Delete the Group
msg = "The following have been purged from the database for group '%s': UserGroupAssociations, GroupRoleAssociations." % group.name
@@ -538,7 +552,7 @@
message = 'Enter a real email address'
elif len( email ) > 255:
message = 'Email address exceeds maximum allowable length'
- elif trans.app.model.User.filter_by( email=email ).all():
+ elif trans.sa_session.query( trans.app.model.User ).filter_by( email=email ).first():
message = 'User with that email already exists'
elif len( password ) < 6:
message = 'Use a password of at least 6 characters'
@@ -683,10 +697,10 @@
private_role = trans.app.security_agent.get_private_user_role( user )
# Delete History
for h in user.active_histories:
- h.refresh()
+ trans.sa_session.refresh( h )
for hda in h.active_datasets:
# Delete HistoryDatasetAssociation
- d = trans.app.model.Dataset.get( hda.dataset_id )
+ d = trans.sa_session.query( trans.app.model.Dataset ).get( hda.dataset_id )
# Delete Dataset
if not d.deleted:
d.deleted = True
@@ -697,12 +711,12 @@
h.flush()
# Delete UserGroupAssociations
for uga in user.groups:
- uga.delete()
+ trans.sa_session.delete( uga )
uga.flush()
# Delete UserRoleAssociations EXCEPT FOR THE PRIVATE ROLE
for ura in user.roles:
if ura.role_id != private_role.id:
- ura.delete()
+ trans.sa_session.delete( ura )
ura.flush()
# Purge the user
user.purged = True
@@ -749,22 +763,22 @@
# Make sure the user is not dis-associating himself from his private role
out_roles = kwd.get( 'out_roles', [] )
if out_roles:
- out_roles = [ trans.app.model.Role.get( x ) for x in util.listify( out_roles ) ]
+ out_roles = [ trans.sa_session.query( trans.app.model.Role ).get( x ) for x in util.listify( out_roles ) ]
if private_role in out_roles:
message += "You cannot eliminate a user's private role association. "
status = 'error'
in_roles = kwd.get( 'in_roles', [] )
if in_roles:
- in_roles = [ trans.app.model.Role.get( x ) for x in util.listify( in_roles ) ]
+ in_roles = [ trans.sa_session.query( trans.app.model.Role ).get( x ) for x in util.listify( in_roles ) ]
out_groups = kwd.get( 'out_groups', [] )
if out_groups:
- out_groups = [ trans.app.model.Group.get( x ) for x in util.listify( out_groups ) ]
+ out_groups = [ trans.sa_session.query( trans.app.model.Group ).get( x ) for x in util.listify( out_groups ) ]
in_groups = kwd.get( 'in_groups', [] )
if in_groups:
- in_groups = [ trans.app.model.Group.get( x ) for x in util.listify( in_groups ) ]
+ in_groups = [ trans.sa_session.query( trans.app.model.Group ).get( x ) for x in util.listify( in_groups ) ]
if in_roles:
trans.app.security_agent.set_entity_user_associations( users=[ user ], roles=in_roles, groups=in_groups )
- user.refresh()
+ trans.sa_session.refresh( user )
message += "User '%s' has been updated with %d associated roles and %d associated groups (private roles are not displayed)" % \
( user.email, len( in_roles ), len( in_groups ) )
trans.response.send_redirect( web.url_for( action='users',
@@ -774,8 +788,8 @@
out_roles = []
in_groups = []
out_groups = []
- for role in trans.app.model.Role.filter( trans.app.model.Role.table.c.deleted==False ) \
- .order_by( trans.app.model.Role.table.c.name ).all():
+ for role in trans.sa_session.query( trans.app.model.Role ).filter( trans.app.model.Role.table.c.deleted==False ) \
+ .order_by( trans.app.model.Role.table.c.name ):
if role in [ x.role for x in user.roles ]:
in_roles.append( ( role.id, role.name ) )
elif role.type != trans.app.model.Role.types.PRIVATE:
@@ -784,8 +798,8 @@
# role, which should always be in in_roles. The check above is added as an additional
# precaution, since for a period of time we were including private roles in the form fields.
out_roles.append( ( role.id, role.name ) )
- for group in trans.app.model.Group.filter( trans.app.model.Group.table.c.deleted==False ) \
- .order_by( trans.app.model.Group.table.c.name ).all():
+ for group in trans.sa_session.query( trans.app.model.Group ).filter( trans.app.model.Group.table.c.deleted==False ) \
+ .order_by( trans.app.model.Group.table.c.name ):
if group in [ x.group for x in user.groups ]:
in_groups.append( ( group.id, group.name ) )
else:
@@ -867,15 +881,13 @@
msg += ', '.join( deleted )
messagetype = 'done'
cutoff_time = datetime.utcnow() - timedelta( seconds=int( cutoff ) )
- jobs = trans.app.model.Job.filter(
- and_( trans.app.model.Job.table.c.update_time < cutoff_time,
- or_( trans.app.model.Job.c.state == trans.app.model.Job.states.NEW,
- trans.app.model.Job.c.state == trans.app.model.Job.states.QUEUED,
- trans.app.model.Job.c.state == trans.app.model.Job.states.RUNNING,
- trans.app.model.Job.c.state == trans.app.model.Job.states.UPLOAD,
- )
- )
- ).order_by(trans.app.model.Job.c.update_time.desc()).all()
+ jobs = trans.sa_session.query( trans.app.model.Job ) \
+ .filter( and_( trans.app.model.Job.table.c.update_time < cutoff_time,
+ or_( trans.app.model.Job.state == trans.app.model.Job.states.NEW,
+ trans.app.model.Job.state == trans.app.model.Job.states.QUEUED,
+ trans.app.model.Job.state == trans.app.model.Job.states.RUNNING,
+ trans.app.model.Job.state == trans.app.model.Job.states.UPLOAD ) ) ) \
+ .order_by( trans.app.model.Job.table.c.update_time.desc() )
last_updated = {}
for job in jobs:
delta = datetime.utcnow() - job.update_time
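
The rewrite applied above to admin.py, and repeated in every file below, is mechanical: finders attached to the mapped class become explicit session calls, and the trailing .all() is dropped wherever the result is only iterated, since a Query is itself lazily iterable. A minimal before/after sketch (not from the diff), assuming sa_session is the object exposed as trans.sa_session:

    # Before: finders monkeypatched onto the mapped class
    group  = model.Group.get( group_id )
    groups = model.Group.filter( model.Group.table.c.deleted == False ) \
                        .order_by( model.Group.table.c.name ).all()

    # After: the same operations through the session; the bare Query is
    # returned when the caller only iterates the result
    group  = sa_session.query( model.Group ).get( group_id )
    groups = sa_session.query( model.Group ) \
                       .filter( model.Group.table.c.deleted == False ) \
                       .order_by( model.Group.table.c.name )
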
diff -r 20b319780138 -r 2c4ed83f76ef lib/galaxy/web/controllers/async.py
--- a/lib/galaxy/web/controllers/async.py Wed Oct 21 23:15:22 2009 -0400
+++ b/lib/galaxy/web/controllers/async.py Thu Oct 22 23:02:28 2009 -0400
@@ -52,7 +52,7 @@
if data_id:
if not URL:
return "No URL parameter was submitted for data %s" % data_id
- data = trans.model.HistoryDatasetAssociation.get( data_id )
+ data = trans.sa_session.query( trans.model.HistoryDatasetAssociation ).get( data_id )
if not data:
return "Data %s does not exist or has already been deleted" % data_id
diff -r 20b319780138 -r 2c4ed83f76ef lib/galaxy/web/controllers/dataset.py
--- a/lib/galaxy/web/controllers/dataset.py Wed Oct 21 23:15:22 2009 -0400
+++ b/lib/galaxy/web/controllers/dataset.py Thu Oct 22 23:02:28 2009 -0400
@@ -138,12 +138,12 @@
@web.expose
def errors( self, trans, id ):
- dataset = model.HistoryDatasetAssociation.get( id )
+ dataset = trans.sa_session.query( model.HistoryDatasetAssociation ).get( id )
return trans.fill_template( "dataset/errors.mako", dataset=dataset )
@web.expose
def stderr( self, trans, id ):
- dataset = model.HistoryDatasetAssociation.get( id )
+ dataset = trans.sa_session.query( model.HistoryDatasetAssociation ).get( id )
job = dataset.creating_job_associations[0].job
trans.response.set_content_type( 'text/plain' )
return job.stderr
@@ -157,7 +157,7 @@
if to_address is None:
return trans.show_error_message( "Sorry, error reporting has been disabled for this galaxy instance" )
# Get the dataset and associated job
- dataset = model.HistoryDatasetAssociation.get( id )
+ dataset = trans.sa_session.query( model.HistoryDatasetAssociation ).get( id )
job = dataset.creating_job_associations[0].job
# Get the name of the server hosting the Galaxy instance from which this report originated
host = trans.request.host
@@ -207,7 +207,7 @@
dataset_id = int( dataset_id )
except ValueError:
dataset_id = trans.security.decode_id( dataset_id )
- data = trans.app.model.HistoryDatasetAssociation.get( dataset_id )
+ data = trans.sa_session.query( trans.app.model.HistoryDatasetAssociation ).get( dataset_id )
if not data:
raise paste.httpexceptions.HTTPRequestRangeNotSatisfiable( "Invalid reference dataset id: %s." % str( dataset_id ) )
user, roles = trans.get_user_and_roles()
@@ -305,7 +305,7 @@
def display_at( self, trans, dataset_id, filename=None, **kwd ):
"""Sets up a dataset permissions so it is viewable at an external site"""
site = filename
- data = trans.app.model.HistoryDatasetAssociation.get( dataset_id )
+ data = trans.sa_session.query( trans.app.model.HistoryDatasetAssociation ).get( dataset_id )
if not data:
raise paste.httpexceptions.HTTPRequestRangeNotSatisfiable( "Invalid reference dataset id: %s." % str( dataset_id ) )
if 'display_url' not in kwd or 'redirect_url' not in kwd:
@@ -326,7 +326,7 @@
except ValueError, e:
return False
history = trans.get_history()
- data = self.app.model.HistoryDatasetAssociation.get( id )
+ data = trans.sa_session.query( self.app.model.HistoryDatasetAssociation ).get( id )
if data and data.undeletable:
# Walk up parent datasets to find the containing history
topmost_parent = data
@@ -390,12 +390,12 @@
new_history.flush()
target_history_ids.append( new_history.id )
if user:
- target_histories = [ hist for hist in map( trans.app.model.History.get, target_history_ids ) if ( hist is not None and hist.user == user )]
+ target_histories = [ hist for hist in map( trans.sa_session.query( trans.app.model.History ).get, target_history_ids ) if ( hist is not None and hist.user == user )]
else:
target_histories = [ history ]
if len( target_histories ) != len( target_history_ids ):
error_msg = error_msg + "You do not have permission to add datasets to %i requested histories. " % ( len( target_history_ids ) - len( target_histories ) )
- for data in map( trans.app.model.HistoryDatasetAssociation.get, source_dataset_ids ):
+ for data in map( trans.sa_session.query( trans.app.model.HistoryDatasetAssociation ).get, source_dataset_ids ):
if data is None:
error_msg = error_msg + "You tried to copy a dataset that does not exist. "
invalid_datasets += 1
@@ -409,7 +409,7 @@
refresh_frames = ['history']
trans.app.model.flush()
done_msg = "%i datasets copied to %i histories." % ( len( source_dataset_ids ) - invalid_datasets, len( target_histories ) )
- history.refresh()
+ trans.sa_session.refresh( history )
elif create_new_history:
target_history_ids.append( "create_new_history" )
source_datasets = history.active_datasets
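
One detail from the copy-datasets hunk above: trans.sa_session.query( ... ).get is an ordinary bound method, so handing it to map() works exactly as the old class-level get did. A sketch with illustrative names:

    # Query.get is bound to the query, so it can be mapped over id lists
    get_history = sa_session.query( model.History ).get
    target_histories = [ hist for hist in map( get_history, target_history_ids )
                         if hist is not None and hist.user == user ]
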
diff -r 20b319780138 -r 2c4ed83f76ef lib/galaxy/web/controllers/forms.py
--- a/lib/galaxy/web/controllers/forms.py Wed Oct 21 23:15:22 2009 -0400
+++ b/lib/galaxy/web/controllers/forms.py Thu Oct 22 23:02:28 2009 -0400
@@ -39,7 +39,7 @@
show_filter = params.get( 'show_filter', 'Active' )
return self._show_forms_list(trans, msg, messagetype, show_filter)
def _show_forms_list(self, trans, msg, messagetype, show_filter='Active'):
- all_forms = trans.app.model.FormDefinitionCurrent.query().all()
+ all_forms = trans.sa_session.query( trans.app.model.FormDefinitionCurrent )
if show_filter == 'All':
forms_list = all_forms
elif show_filter == 'Deleted':
@@ -109,7 +109,7 @@
params = util.Params( kwd )
msg = util.restore_text( params.get( 'msg', '' ) )
messagetype = params.get( 'messagetype', 'done' )
- fd = trans.app.model.FormDefinition.get(int(util.restore_text( params.form_id )))
+ fd = trans.sa_session.query( trans.app.model.FormDefinition ).get( int( util.restore_text( params.form_id ) ) )
fd.form_definition_current.deleted = True
fd.form_definition_current.flush()
return self._show_forms_list(trans,
@@ -121,7 +121,7 @@
params = util.Params( kwd )
msg = util.restore_text( params.get( 'msg', '' ) )
messagetype = params.get( 'messagetype', 'done' )
- fd = trans.app.model.FormDefinition.get(int(util.restore_text( params.form_id )))
+ fd = trans.sa_session.query( trans.app.model.FormDefinition ).get( int( util.restore_text( params.form_id ) ) )
fd.form_definition_current.deleted = False
fd.form_definition_current.flush()
return self._show_forms_list(trans,
@@ -138,7 +138,7 @@
msg = util.restore_text( params.get( 'msg', '' ) )
messagetype = params.get( 'messagetype', 'done' )
try:
- fd = trans.app.model.FormDefinition.get( int( params.get( 'form_id', None ) ) )
+ fd = trans.sa_session.query( trans.app.model.FormDefinition ).get( int( params.get( 'form_id', None ) ) )
except:
return trans.response.send_redirect( web.url_for( controller='forms',
action='manage',
@@ -444,7 +444,7 @@
if fdc_id: # save changes to the existing form
# change the pointer in the form_definition_current table to point
# to this new record
- fdc = trans.app.model.FormDefinitionCurrent.get(fdc_id)
+ fdc = trans.sa_session.query( trans.app.model.FormDefinitionCurrent ).get( fdc_id )
else: # create a new form
fdc = trans.app.model.FormDefinitionCurrent()
# create corresponding row in the form_definition_current table
@@ -578,11 +578,11 @@
of all the forms from the form_definition table.
'''
if all_versions:
- return trans.app.model.FormDefinition.query().all()
+ return trans.sa_session.query( trans.app.model.FormDefinition )
if filter:
- fdc_list = trans.app.model.FormDefinitionCurrent.query().filter_by(**filter)
+ fdc_list = trans.sa_session.query( trans.app.model.FormDefinitionCurrent ).filter_by( **filter )
else:
- fdc_list = trans.app.model.FormDefinitionCurrent.query().all()
+ fdc_list = trans.sa_session.query( trans.app.model.FormDefinitionCurrent )
if form_type == 'All':
return [ fdc.latest_form for fdc in fdc_list ]
else:
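
One caveat with returning the bare Query, relevant to _show_forms_list() above: every iteration re-executes the SELECT, so a result that is traversed more than once is cheaper to materialize explicitly. A sketch under that assumption:

    all_forms = sa_session.query( model.FormDefinitionCurrent )
    deleted = [ f for f in all_forms if f.deleted ]        # SELECT runs here
    active  = [ f for f in all_forms if not f.deleted ]    # and runs again here
    all_forms = list( all_forms )                          # materialize once if reused
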
diff -r 20b319780138 -r 2c4ed83f76ef lib/galaxy/web/controllers/genetrack.py
--- a/lib/galaxy/web/controllers/genetrack.py Wed Oct 21 23:15:22 2009 -0400
+++ b/lib/galaxy/web/controllers/genetrack.py Thu Oct 22 23:02:28 2009 -0400
@@ -154,7 +154,7 @@
"""
Default search page
"""
- data = trans.app.model.HistoryDatasetAssociation.get( dataset_id )
+ data = trans.sa_session.query( trans.app.model.HistoryDatasetAssociation ).get( dataset_id )
if not data:
raise paste.httpexceptions.HTTPRequestRangeNotSatisfiable( "Invalid reference dataset id: %s." % str( dataset_id ) )
# the main configuration file
@@ -202,7 +202,7 @@
Main request handler
"""
color = cycle( [LIGHT, WHITE] )
- data = trans.app.model.HistoryDatasetAssociation.get( dataset_id )
+ data = trans.sa_session.query( trans.app.model.HistoryDatasetAssociation ).get( dataset_id )
if not data:
raise paste.httpexceptions.HTTPRequestRangeNotSatisfiable( "Invalid reference dataset id: %s." % str( dataset_id ) )
# the main configuration file
diff -r 20b319780138 -r 2c4ed83f76ef lib/galaxy/web/controllers/history.py
--- a/lib/galaxy/web/controllers/history.py Wed Oct 21 23:15:22 2009 -0400
+++ b/lib/galaxy/web/controllers/history.py Thu Oct 22 23:02:28 2009 -0400
@@ -233,9 +233,9 @@
status, message = self._list_undelete( trans, histories )
elif operation == "unshare":
for history in histories:
- husas = trans.app.model.HistoryUserShareAssociation.filter_by( history=history ).all()
- for husa in husas:
- husa.delete()
+ for husa in trans.sa_session.query( trans.app.model.HistoryUserShareAssociation ) \
+ .filter_by( history=history ):
+ trans.sa_session.delete( husa )
elif operation == "enable import via link":
for history in histories:
if not history.importable:
@@ -306,8 +306,9 @@
new_history = histories[0]
galaxy_session = trans.get_galaxy_session()
try:
- association = trans.app.model.GalaxySessionToHistoryAssociation \
- .filter_by( session_id=galaxy_session.id, history_id=trans.security.decode_id( new_history.id ) ).first()
+ association = trans.sa_session.query( trans.app.model.GalaxySessionToHistoryAssociation ) \
+ .filter_by( session_id=galaxy_session.id, history_id=trans.security.decode_id( new_history.id ) ) \
+ .first()
except:
association = None
new_history.add_galaxy_session( galaxy_session, association=association )
@@ -338,8 +339,8 @@
histories = [ get_history( trans, history_id ) for history_id in ids ]
for history in histories:
# Current user is the user with which the histories were shared
- association = trans.app.model.HistoryUserShareAssociation.filter_by( user=trans.user, history=history ).one()
- association.delete()
+ association = trans.sa_session.query( trans.app.model.HistoryUserShareAssociation ).filter_by( user=trans.user, history=history ).one()
+ trans.sa_session.delete( association )
association.flush()
message = "Unshared %d shared histories" % len( ids )
status = 'done'
@@ -360,7 +361,7 @@
return trans.show_ok_message( "History deleted, a new history is active", refresh_frames=['history'] )
@web.expose
def rename_async( self, trans, id=None, new_name=None ):
- history = model.History.get( id )
+ history = trans.sa_session.query( model.History ).get( id )
# Check that the history exists, and is either owned by the current
# user (if logged in) or the current history
assert history is not None
@@ -393,8 +394,9 @@
new_history.user_id = user.id
galaxy_session = trans.get_galaxy_session()
try:
- association = trans.app.model.GalaxySessionToHistoryAssociation \
- .filter_by( session_id=galaxy_session.id, history_id=new_history.id ).first()
+ association = trans.sa_session.query( trans.app.model.GalaxySessionToHistoryAssociation ) \
+ .filter_by( session_id=galaxy_session.id, history_id=new_history.id ) \
+ .first()
except:
association = None
new_history.add_galaxy_session( galaxy_session, association=association )
@@ -410,8 +412,9 @@
new_history.user_id = None
galaxy_session = trans.get_galaxy_session()
try:
- association = trans.app.model.GalaxySessionToHistoryAssociation \
- .filter_by( session_id=galaxy_session.id, history_id=new_history.id ).first()
+ association = trans.sa_session.query( trans.app.model.GalaxySessionToHistoryAssociation ) \
+ .filter_by( session_id=galaxy_session.id, history_id=new_history.id ) \
+ .first()
except:
association = None
new_history.add_galaxy_session( galaxy_session, association=association )
@@ -443,10 +446,10 @@
# View history.
query = trans.sa_session.query( model.HistoryDatasetAssociation ) \
- .filter( model.HistoryDatasetAssociation.history == history_to_view ) \
- .options( eagerload( "children" ) ) \
- .join( "dataset" ).filter( model.Dataset.purged == False ) \
- .options( eagerload_all( "dataset.actions" ) )
+ .filter( model.HistoryDatasetAssociation.history == history_to_view ) \
+ .options( eagerload( "children" ) ) \
+ .join( "dataset" ).filter( model.Dataset.purged == False ) \
+ .options( eagerload_all( "dataset.actions" ) )
# Do not show deleted datasets.
query = query.filter( model.HistoryDatasetAssociation.deleted == False )
user_owns_history = ( trans.get_user() == history_to_view.user )
@@ -547,10 +550,10 @@
for send_to_user, history_dict in can_change.items():
for history in history_dict:
# Make sure the current history has not already been shared with the current send_to_user
- if trans.app.model.HistoryUserShareAssociation \
- .filter( and_( trans.app.model.HistoryUserShareAssociation.table.c.user_id == send_to_user.id,
- trans.app.model.HistoryUserShareAssociation.table.c.history_id == history.id ) ) \
- .count() > 0:
+ if trans.sa_session.query( trans.app.model.HistoryUserShareAssociation ) \
+ .filter( and_( trans.app.model.HistoryUserShareAssociation.table.c.user_id == send_to_user.id,
+ trans.app.model.HistoryUserShareAssociation.table.c.history_id == history.id ) ) \
+ .count() > 0:
send_to_err += "History (%s) already shared with user (%s)" % ( history.name, send_to_user.email )
else:
# Only deal with datasets that have not been purged
@@ -590,8 +593,10 @@
if email_address == user.email:
send_to_err += "You cannot send histories to yourself. "
else:
- send_to_user = trans.app.model.User.filter( and_( trans.app.model.User.table.c.email==email_address,
- trans.app.model.User.table.c.deleted==False ) ).first()
+ send_to_user = trans.sa_session.query( trans.app.model.User ) \
+ .filter( and_( trans.app.model.User.table.c.email==email_address,
+ trans.app.model.User.table.c.deleted==False ) ) \
+ .first()
if send_to_user:
send_to_users.append( send_to_user )
else:
@@ -608,10 +613,10 @@
for send_to_user, history_dict in other.items():
for history in history_dict:
# Make sure the current history has not already been shared with the current send_to_user
- if trans.app.model.HistoryUserShareAssociation \
- .filter( and_( trans.app.model.HistoryUserShareAssociation.table.c.user_id == send_to_user.id,
- trans.app.model.HistoryUserShareAssociation.table.c.history_id == history.id ) ) \
- .count() > 0:
+ if trans.sa_session.query( trans.app.model.HistoryUserShareAssociation ) \
+ .filter( and_( trans.app.model.HistoryUserShareAssociation.table.c.user_id == send_to_user.id,
+ trans.app.model.HistoryUserShareAssociation.table.c.history_id == history.id ) ) \
+ .count() > 0:
send_to_err += "History (%s) already shared with user (%s)" % ( history.name, send_to_user.email )
else:
# Build the dict that will be used for sharing
@@ -640,10 +645,10 @@
for history in histories:
for send_to_user in send_to_users:
# Make sure the current history has not already been shared with the current send_to_user
- if trans.app.model.HistoryUserShareAssociation \
- .filter( and_( trans.app.model.HistoryUserShareAssociation.table.c.user_id == send_to_user.id,
- trans.app.model.HistoryUserShareAssociation.table.c.history_id == history.id ) ) \
- .count() > 0:
+ if trans.sa_session.query( trans.app.model.HistoryUserShareAssociation ) \
+ .filter( and_( trans.app.model.HistoryUserShareAssociation.table.c.user_id == send_to_user.id,
+ trans.app.model.HistoryUserShareAssociation.table.c.history_id == history.id ) ) \
+ .count() > 0:
send_to_err += "History (%s) already shared with user (%s)" % ( history.name, send_to_user.email )
else:
# Only deal with datasets that have not been purged
@@ -748,14 +753,14 @@
history.importable = False
history.flush()
elif params.get( 'unshare_user', False ):
- user = trans.app.model.User.get( trans.security.decode_id( kwd[ 'unshare_user' ] ) )
+ user = trans.sa_session.query( trans.app.model.User ).get( trans.security.decode_id( kwd[ 'unshare_user' ] ) )
if not user:
msg = 'History (%s) does not seem to be shared with user (%s)' % ( history.name, kwd[ 'unshare_user' ] )
return trans.fill_template( 'history/sharing.mako', histories=histories, msg=msg, messagetype='error' )
- husas = trans.app.model.HistoryUserShareAssociation.filter_by( user=user, history=history ).all()
+ husas = trans.sa_session.query( trans.app.model.HistoryUserShareAssociation ).filter_by( user=user, history=history ).all()
if husas:
for husa in husas:
- husa.delete()
+ trans.sa_session.delete( husa )
husa.flush()
histories = []
# Get all histories that have been shared with others
@@ -763,8 +768,7 @@
.join( "history" ) \
.filter( and_( trans.app.model.History.user == trans.user,
trans.app.model.History.deleted == False ) ) \
- .order_by( trans.app.model.History.table.c.name ) \
- .all()
+ .order_by( trans.app.model.History.table.c.name )
for husa in husas:
history = husa.history
if history not in histories:
@@ -772,8 +776,7 @@
# Get all histories that are importable
importables = trans.sa_session.query( trans.app.model.History ) \
.filter_by( user=trans.user, importable=True, deleted=False ) \
- .order_by( trans.app.model.History.table.c.name ) \
- .all()
+ .order_by( trans.app.model.History.table.c.name )
for importable in importables:
if importable not in histories:
histories.append( importable )
@@ -843,7 +846,8 @@
owner = True
else:
if trans.sa_session.query( trans.app.model.HistoryUserShareAssociation ) \
- .filter_by( user=user, history=history ).count() == 0:
+ .filter_by( user=user, history=history ) \
+ .count() == 0:
return trans.show_error_message( "The history you are attempting to clone is not owned by you or shared with you. " )
owner = False
name = "Clone of '%s'" % history.name
diff -r 20b319780138 -r 2c4ed83f76ef lib/galaxy/web/controllers/library.py
--- a/lib/galaxy/web/controllers/library.py Wed Oct 21 23:15:22 2009 -0400
+++ b/lib/galaxy/web/controllers/library.py Thu Oct 22 23:02:28 2009 -0400
@@ -63,8 +63,9 @@
msg = util.restore_text( params.get( 'msg', '' ) )
messagetype = params.get( 'messagetype', 'done' )
user, roles = trans.get_user_and_roles()
- all_libraries = trans.app.model.Library.filter( trans.app.model.Library.table.c.deleted==False ) \
- .order_by( trans.app.model.Library.name ).all()
+ all_libraries = trans.sa_session.query( trans.app.model.Library ) \
+ .filter( trans.app.model.Library.table.c.deleted==False ) \
+ .order_by( trans.app.model.Library.name )
library_actions = [ trans.app.security_agent.permitted_actions.LIBRARY_ADD,
trans.app.security_agent.permitted_actions.LIBRARY_MODIFY,
trans.app.security_agent.permitted_actions.LIBRARY_MANAGE ]
@@ -102,7 +103,7 @@
default_action=params.get( 'default_action', None ),
msg=util.sanitize_text( msg ),
messagetype='error' ) )
- library = library=trans.app.model.Library.get( library_id )
+ library = trans.sa_session.query( trans.app.model.Library ).get( library_id )
if not library:
# To handle bots
msg = "Invalid library id ( %s )." % str( library_id )
@@ -144,7 +145,7 @@
action='browse_libraries',
msg=util.sanitize_text( msg ),
messagetype='error' ) )
- library = trans.app.model.Library.get( int( library_id ) )
+ library = trans.sa_session.query( trans.app.model.Library ).get( int( library_id ) )
if not library:
msg = "Invalid library id ( %s ) specified." % str( obj_id )
return trans.response.send_redirect( web.url_for( controller='library',
@@ -190,10 +191,10 @@
# The user clicked the Save button on the 'Associate With Roles' form
permissions = {}
for k, v in trans.app.model.Library.permitted_actions.items():
- in_roles = [ trans.app.model.Role.get( x ) for x in util.listify( params.get( k + '_in', [] ) ) ]
+ in_roles = [ trans.sa_session.query( trans.app.model.Role ).get( x ) for x in util.listify( params.get( k + '_in', [] ) ) ]
permissions[ trans.app.security_agent.get_action( v.action ) ] = in_roles
trans.app.security_agent.set_all_library_permissions( library, permissions )
- library.refresh()
+ trans.sa_session.refresh( library )
# Copy the permissions to the root folder
trans.app.security_agent.copy_library_permissions( library, library.root_folder )
msg = "Permissions updated for library '%s'" % library.name
@@ -221,7 +222,7 @@
else:
# 'information' will be the default
action = 'information'
- folder = trans.app.model.LibraryFolder.get( int( obj_id ) )
+ folder = trans.sa_session.query( trans.app.model.LibraryFolder ).get( int( obj_id ) )
if not folder:
msg = "Invalid folder specified, id: %s" % str( obj_id )
return trans.response.send_redirect( web.url_for( controller='library',
@@ -301,10 +302,10 @@
if trans.app.security_agent.can_manage_library_item( user, roles, folder ):
permissions = {}
for k, v in trans.app.model.Library.permitted_actions.items():
- in_roles = [ trans.app.model.Role.get( int( x ) ) for x in util.listify( params.get( k + '_in', [] ) ) ]
+ in_roles = [ trans.sa_session.query( trans.app.model.Role ).get( int( x ) ) for x in util.listify( params.get( k + '_in', [] ) ) ]
permissions[ trans.app.security_agent.get_action( v.action ) ] = in_roles
trans.app.security_agent.set_all_library_permissions( folder, permissions )
- folder.refresh()
+ trans.sa_session.refresh( folder )
msg = 'Permissions updated for folder %s' % folder.name
return trans.response.send_redirect( web.url_for( controller='library',
action='folder',
@@ -336,7 +337,7 @@
action = 'permissions'
else:
action = 'information'
- library_dataset = trans.app.model.LibraryDataset.get( obj_id )
+ library_dataset = trans.sa_session.query( trans.app.model.LibraryDataset ).get( obj_id )
if not library_dataset:
msg = "Invalid library dataset specified, id: %s" %str( obj_id )
return trans.response.send_redirect( web.url_for( controller='library',
@@ -375,15 +376,15 @@
# The user clicked the Save button on the 'Associate With Roles' form
permissions = {}
for k, v in trans.app.model.Library.permitted_actions.items():
- in_roles = [ trans.app.model.Role.get( x ) for x in util.listify( kwd.get( k + '_in', [] ) ) ]
+ in_roles = [ trans.sa_session.query( trans.app.model.Role ).get( x ) for x in util.listify( kwd.get( k + '_in', [] ) ) ]
permissions[ trans.app.security_agent.get_action( v.action ) ] = in_roles
# Set the LIBRARY permissions on the LibraryDataset
# NOTE: the LibraryDataset and LibraryDatasetDatasetAssociation will be set with the same permissions
trans.app.security_agent.set_all_library_permissions( library_dataset, permissions )
- library_dataset.refresh()
+ trans.sa_session.refresh( library_dataset )
# Set the LIBRARY permissions on the LibraryDatasetDatasetAssociation
trans.app.security_agent.set_all_library_permissions( library_dataset.library_dataset_dataset_association, permissions )
- library_dataset.library_dataset_dataset_association.refresh()
+ trans.sa_session.refresh( library_dataset.library_dataset_dataset_association )
msg = 'Permissions and roles have been updated for library dataset %s' % library_dataset.name
messagetype = 'done'
else:
@@ -399,7 +400,7 @@
params = util.Params( kwd )
msg = util.restore_text( params.get( 'msg', '' ) )
messagetype = params.get( 'messagetype', 'done' )
- ldda = trans.app.model.LibraryDatasetDatasetAssociation.get( obj_id )
+ ldda = trans.sa_session.query( trans.app.model.LibraryDatasetDatasetAssociation ).get( obj_id )
if not ldda:
msg = "Invalid LibraryDatasetDatasetAssociation specified, obj_id: %s" % str( obj_id )
return trans.response.send_redirect( web.url_for( controller='library',
@@ -535,7 +536,7 @@
params = util.Params( kwd )
msg = util.restore_text( params.get( 'msg', '' ) )
messagetype = params.get( 'messagetype', 'done' )
- ldda = trans.app.model.LibraryDatasetDatasetAssociation.get( obj_id )
+ ldda = trans.sa_session.query( trans.app.model.LibraryDatasetDatasetAssociation ).get( obj_id )
if not ldda:
msg = "Invalid LibraryDatasetDatasetAssociation specified, id: %s" % str( obj_id )
return trans.response.send_redirect( web.url_for( controller='admin',
@@ -560,7 +561,7 @@
# Display permission form, permissions will be updated for all lddas simultaneously.
lddas = []
for obj_id in [ int( obj_id ) for obj_id in obj_ids ]:
- ldda = trans.app.model.LibraryDatasetDatasetAssociation.get( obj_id )
+ ldda = trans.sa_session.query( trans.app.model.LibraryDatasetDatasetAssociation ).get( obj_id )
if ldda is None:
msg = 'You specified an invalid LibraryDatasetDatasetAssociation id: %s' % str( obj_id )
trans.response.send_redirect( web.url_for( controller='library',
@@ -574,24 +575,24 @@
trans.app.security_agent.can_manage_dataset( roles, ldda.dataset ):
permissions = {}
for k, v in trans.app.model.Dataset.permitted_actions.items():
- in_roles = [ trans.app.model.Role.get( x ) for x in util.listify( params.get( k + '_in', [] ) ) ]
+ in_roles = [ trans.sa_session.query( trans.app.model.Role ).get( x ) for x in util.listify( params.get( k + '_in', [] ) ) ]
permissions[ trans.app.security_agent.get_action( v.action ) ] = in_roles
for ldda in lddas:
# Set the DATASET permissions on the Dataset
trans.app.security_agent.set_all_dataset_permissions( ldda.dataset, permissions )
- ldda.dataset.refresh()
+ trans.sa_session.refresh( ldda.dataset )
permissions = {}
for k, v in trans.app.model.Library.permitted_actions.items():
- in_roles = [ trans.app.model.Role.get( x ) for x in util.listify( kwd.get( k + '_in', [] ) ) ]
+ in_roles = [ trans.sa_session.query( trans.app.model.Role ).get( x ) for x in util.listify( kwd.get( k + '_in', [] ) ) ]
permissions[ trans.app.security_agent.get_action( v.action ) ] = in_roles
for ldda in lddas:
# Set the LIBRARY permissions on the LibraryDataset
# NOTE: the LibraryDataset and LibraryDatasetDatasetAssociation will be set with the same permissions
trans.app.security_agent.set_all_library_permissions( ldda.library_dataset, permissions )
- ldda.library_dataset.refresh()
+ trans.sa_session.refresh( ldda.library_dataset )
# Set the LIBRARY permissions on the LibraryDatasetDatasetAssociation
trans.app.security_agent.set_all_library_permissions( ldda, permissions )
- ldda.refresh()
+ trans.sa_session.refresh( ldda )
msg = 'Permissions and roles have been updated on %d datasets' % len( lddas )
messagetype = 'done'
else:
@@ -645,12 +646,12 @@
last_used_build = dbkey[0]
else:
last_used_build = dbkey
- folder = trans.app.model.LibraryFolder.get( folder_id )
+ folder = trans.sa_session.query( trans.app.model.LibraryFolder ).get( folder_id )
if folder and last_used_build in [ 'None', None, '?' ]:
last_used_build = folder.genome_build
replace_id = params.get( 'replace_id', None )
if replace_id not in [ None, 'None' ]:
- replace_dataset = trans.app.model.LibraryDataset.get( params.get( 'replace_id', None ) )
+ replace_dataset = trans.sa_session.query( trans.app.model.LibraryDataset ).get( replace_id )
if not last_used_build:
last_used_build = replace_dataset.library_dataset_dataset_association.dbkey
# Don't allow multiple datasets to be uploaded when replacing a dataset with a new version
@@ -691,7 +692,7 @@
msg = "Added %d datasets to the folder '%s' ( each is selected ). " % ( total_added, folder.name )
# Since permissions on all LibraryDatasetDatasetAssociations must be the same at this point, we only need
# to check one of them to see if the current user can manage permissions on them.
- check_ldda = trans.app.model.LibraryDatasetDatasetAssociation.get( ldda_id_list[0] )
+ check_ldda = trans.sa_session.query( trans.app.model.LibraryDatasetDatasetAssociation ).get( ldda_id_list[0] )
if trans.app.security_agent.can_manage_library_item( user, roles, check_ldda ):
if replace_dataset:
default_action = ''
@@ -728,10 +729,12 @@
yield build_name, dbkey, ( dbkey==last_used_build )
dbkeys = get_dbkey_options( last_used_build )
# Send list of roles to the form so the dataset can be associated with 1 or more of them.
- roles = trans.app.model.Role.filter( trans.app.model.Role.table.c.deleted==False ).order_by( trans.app.model.Role.c.name ).all()
+ roles = trans.sa_session.query( trans.app.model.Role ) \
+ .filter( trans.app.model.Role.table.c.deleted==False ) \
+ .order_by( trans.app.model.Role.table.c.name )
# Send the current history to the form to enable importing datasets from history to library
history = trans.get_history()
- history.refresh()
+ trans.sa_session.refresh( history )
# If we're using nginx upload, override the form action
action = web.url_for( controller='library', action='upload_library_dataset' )
if upload_option == 'upload_file' and trans.app.config.nginx_upload_path:
@@ -756,7 +759,7 @@
msg = util.restore_text( params.get( 'msg', '' ) )
messagetype = params.get( 'messagetype', 'done' )
try:
- folder = trans.app.model.LibraryFolder.get( int( folder_id ) )
+ folder = trans.sa_session.query( trans.app.model.LibraryFolder ).get( int( folder_id ) )
except:
msg = "Invalid folder id: %s" % str( folder_id )
return trans.response.send_redirect( web.url_for( controller='library',
@@ -766,12 +769,12 @@
messagetype='error' ) )
replace_id = params.get( 'replace_id', None )
if replace_id:
- replace_dataset = trans.app.model.LibraryDataset.get( replace_id )
+ replace_dataset = trans.sa_session.query( trans.app.model.LibraryDataset ).get( replace_id )
else:
replace_dataset = None
# See if the current history is empty
history = trans.get_history()
- history.refresh()
+ trans.sa_session.refresh( history )
if not history.active_datasets:
msg = 'Your current history is empty'
return trans.response.send_redirect( web.url_for( controller='library',
@@ -785,7 +788,7 @@
dataset_names = []
created_ldda_ids = ''
for hda_id in hda_ids:
- hda = trans.app.model.HistoryDatasetAssociation.get( hda_id )
+ hda = trans.sa_session.query( trans.app.model.HistoryDatasetAssociation ).get( hda_id )
if hda:
ldda = hda.to_library_dataset_dataset_association( target_folder=folder, replace_dataset=replace_dataset )
created_ldda_ids = '%s,%s' % ( created_ldda_ids, str( ldda.id ) )
@@ -818,7 +821,7 @@
msg = "Added %d datasets to the folder '%s' ( each is selected ). " % ( total_added, folder.name )
# Since permissions on all LibraryDatasetDatasetAssociations must be the same at this point, we only need
# to check one of them to see if the current user can manage permissions on them.
- check_ldda = trans.app.model.LibraryDatasetDatasetAssociation.get( ldda_id_list[0] )
+ check_ldda = trans.sa_session.query( trans.app.model.LibraryDatasetDatasetAssociation ).get( ldda_id_list[0] )
user, roles = trans.get_user_and_roles()
if trans.app.security_agent.can_manage_library_item( user, roles, check_ldda ):
if replace_dataset:
@@ -848,7 +851,9 @@
yield build_name, dbkey, ( dbkey==last_used_build )
dbkeys = get_dbkey_options( last_used_build )
# Send list of roles to the form so the dataset can be associated with 1 or more of them.
- roles = trans.app.model.Role.filter( trans.app.model.Role.table.c.deleted==False ).order_by( trans.app.model.Role.c.name ).all()
+ roles = trans.sa_session.query( trans.app.model.Role ) \
+ .filter( trans.app.model.Role.table.c.deleted==False ) \
+ .order_by( trans.app.model.Role.table.c.name )
return trans.fill_template( "/library/upload.mako",
upload_option=upload_option,
library_id=library_id,
@@ -887,7 +892,7 @@
if params.do_action == 'add':
history = trans.get_history()
for ldda_id in ldda_ids:
- ldda = trans.app.model.LibraryDatasetDatasetAssociation.get( ldda_id )
+ ldda = trans.sa_session.query( trans.app.model.LibraryDatasetDatasetAssociation ).get( ldda_id )
hda = ldda.to_history_dataset_association( target_history=history, add_to_history = True )
history.flush()
msg = "%i dataset(s) have been imported into your history" % len( ldda_ids )
@@ -898,7 +903,7 @@
messagetype='done' ) )
elif params.do_action == 'manage_permissions':
# We need the folder containing the LibraryDatasetDatasetAssociation(s)
- ldda = trans.app.model.LibraryDatasetDatasetAssociation.get( ldda_ids[0] )
+ ldda = trans.sa_session.query( trans.app.model.LibraryDatasetDatasetAssociation ).get( ldda_ids[0] )
trans.response.send_redirect( web.url_for( controller='library',
action='upload_library_dataset',
library_id=library_id,
@@ -933,7 +938,7 @@
seen = []
user, roles = trans.get_user_and_roles()
for ldda_id in ldda_ids:
- ldda = trans.app.model.LibraryDatasetDatasetAssociation.get( ldda_id )
+ ldda = trans.sa_session.query( trans.app.model.LibraryDatasetDatasetAssociation ).get( ldda_id )
if not ldda or not trans.app.security_agent.can_access_dataset( roles, ldda.dataset ):
continue
path = ""
diff -r 20b319780138 -r 2c4ed83f76ef lib/galaxy/web/controllers/library_admin.py
--- a/lib/galaxy/web/controllers/library_admin.py Wed Oct 21 23:15:22 2009 -0400
+++ b/lib/galaxy/web/controllers/library_admin.py Thu Oct 22 23:02:28 2009 -0400
@@ -19,8 +19,9 @@
msg = util.restore_text( params.get( 'msg', '' ) )
messagetype = params.get( 'messagetype', 'done' )
return trans.fill_template( '/admin/library/browse_libraries.mako',
- libraries=trans.app.model.Library.filter( trans.app.model.Library.table.c.deleted==False ) \
- .order_by( trans.app.model.Library.name ).all(),
+ libraries=trans.sa_session.query( trans.app.model.Library ) \
+ .filter( trans.app.model.Library.table.c.deleted==False ) \
+ .order_by( trans.app.model.Library.name ),
deleted=False,
show_deleted=False,
msg=msg,
@@ -41,7 +42,7 @@
messagetype='error' ) )
deleted = util.string_as_bool( params.get( 'deleted', False ) )
show_deleted = util.string_as_bool( params.get( 'show_deleted', False ) )
- library = library=trans.app.model.Library.get( library_id )
+ library = trans.sa_session.query( trans.app.model.Library ).get( library_id )
if not library:
msg = "Invalid library id ( %s )." % str( library_id )
return trans.response.send_redirect( web.url_for( controller='library_admin',
@@ -82,7 +83,7 @@
msg=util.sanitize_text( msg ),
messagetype='error' ) )
if not action == 'new':
- library = trans.app.model.Library.get( int( library_id ) )
+ library = trans.sa_session.query( trans.app.model.Library ).get( int( library_id ) )
if action == 'new':
if params.new == 'submitted':
library = trans.app.model.Library( name = util.restore_text( params.name ),
@@ -134,14 +135,14 @@
messagetype=messagetype )
elif action == 'delete':
def delete_folder( library_folder ):
- library_folder.refresh()
+ trans.sa_session.refresh( library_folder )
for folder in library_folder.folders:
delete_folder( folder )
for library_dataset in library_folder.datasets:
- library_dataset.refresh()
+ trans.sa_session.refresh( library_dataset )
ldda = library_dataset.library_dataset_dataset_association
if ldda:
- ldda.refresh()
+ trans.sa_session.refresh( ldda )
# We don't set ldda.dataset.deleted to True here because the cleanup_dataset script
# will eventually remove it from disk. The purge_library method below sets the dataset
# to deleted. This allows for the library to be undeleted ( before it is purged ),
@@ -152,7 +153,7 @@
library_dataset.flush()
library_folder.deleted = True
library_folder.flush()
- library.refresh()
+ trans.sa_session.refresh( library )
delete_folder( library.root_folder )
library.deleted = True
library.flush()
@@ -163,10 +164,10 @@
# The user clicked the Save button on the 'Associate With Roles' form
permissions = {}
for k, v in trans.app.model.Library.permitted_actions.items():
- in_roles = [ trans.app.model.Role.get( x ) for x in util.listify( kwd.get( k + '_in', [] ) ) ]
+ in_roles = [ trans.sa_session.query( trans.app.model.Role ).get( x ) for x in util.listify( kwd.get( k + '_in', [] ) ) ]
permissions[ trans.app.security_agent.get_action( v.action ) ] = in_roles
trans.app.security_agent.set_all_library_permissions( library, permissions )
- library.refresh()
+ trans.sa_session.refresh( library )
# Copy the permissions to the root folder
trans.app.security_agent.copy_library_permissions( library, library.root_folder )
msg = "Permissions updated for library '%s'" % library.name
@@ -186,9 +187,10 @@
params = util.Params( kwd )
msg = util.restore_text( params.get( 'msg', '' ) )
messagetype = params.get( 'messagetype', 'done' )
- libraries=trans.app.model.Library.filter( and_( trans.app.model.Library.table.c.deleted==True,
- trans.app.model.Library.table.c.purged==False ) ) \
- .order_by( trans.app.model.Library.table.c.name ).all()
+ libraries = trans.sa_session.query( trans.app.model.Library ) \
+ .filter( and_( trans.app.model.Library.table.c.deleted==True,
+ trans.app.model.Library.table.c.purged==False ) ) \
+ .order_by( trans.app.model.Library.table.c.name )
return trans.fill_template( '/admin/library/browse_libraries.mako',
libraries=libraries,
deleted=True,
@@ -199,18 +201,18 @@
@web.require_admin
def purge_library( self, trans, **kwd ):
params = util.Params( kwd )
- library = trans.app.model.Library.get( int( params.obj_id ) )
+ library = trans.sa_session.query( trans.app.model.Library ).get( int( params.obj_id ) )
def purge_folder( library_folder ):
for lf in library_folder.folders:
purge_folder( lf )
- library_folder.refresh()
+ trans.sa_session.refresh( library_folder )
for library_dataset in library_folder.datasets:
- library_dataset.refresh()
+ trans.sa_session.refresh( library_dataset )
ldda = library_dataset.library_dataset_dataset_association
if ldda:
- ldda.refresh()
+ trans.sa_session.refresh( ldda )
dataset = ldda.dataset
- dataset.refresh()
+ trans.sa_session.refresh( dataset )
# If the dataset is not associated with any additional undeleted folders, then we can delete it.
# We don't set dataset.purged to True here because the cleanup_datasets script will do that for
# us, as well as removing the file from disk.
@@ -254,7 +256,7 @@
else:
# 'information' will be the default
action = 'information'
- folder = trans.app.model.LibraryFolder.get( int( obj_id ) )
+ folder = trans.sa_session.query( trans.app.model.LibraryFolder ).get( int( obj_id ) )
if not folder:
msg = "Invalid folder specified, id: %s" % str( obj_id )
return trans.response.send_redirect( web.url_for( controller='library_admin',
@@ -331,10 +333,10 @@
# The user clicked the Save button on the 'Associate With Roles' form
permissions = {}
for k, v in trans.app.model.Library.permitted_actions.items():
- in_roles = [ trans.app.model.Role.get( int( x ) ) for x in util.listify( params.get( k + '_in', [] ) ) ]
+ in_roles = [ trans.sa_session.query( trans.app.model.Role ).get( int( x ) ) for x in util.listify( params.get( k + '_in', [] ) ) ]
permissions[ trans.app.security_agent.get_action( v.action ) ] = in_roles
trans.app.security_agent.set_all_library_permissions( folder, permissions )
- folder.refresh()
+ trans.sa_session.refresh( folder )
msg = "Permissions updated for folder '%s'" % folder.name
return trans.response.send_redirect( web.url_for( controller='library_admin',
action='folder',
@@ -358,7 +360,7 @@
action = 'permissions'
else:
action = 'information'
- library_dataset = trans.app.model.LibraryDataset.get( obj_id )
+ library_dataset = trans.sa_session.query( trans.app.model.LibraryDataset ).get( obj_id )
if not library_dataset:
msg = "Invalid library dataset specified, id: %s" %str( obj_id )
return trans.response.send_redirect( web.url_for( controller='library_admin',
@@ -390,15 +392,15 @@
# The user clicked the Save button on the 'Edit permissions and role associations' form
permissions = {}
for k, v in trans.app.model.Library.permitted_actions.items():
- in_roles = [ trans.app.model.Role.get( x ) for x in util.listify( kwd.get( k + '_in', [] ) ) ]
+ in_roles = [ trans.sa_session.query( trans.app.model.Role ).get( x ) for x in util.listify( kwd.get( k + '_in', [] ) ) ]
permissions[ trans.app.security_agent.get_action( v.action ) ] = in_roles
# Set the LIBRARY permissions on the LibraryDataset
# NOTE: the LibraryDataset and LibraryDatasetDatasetAssociation will be set with the same permissions
trans.app.security_agent.set_all_library_permissions( library_dataset, permissions )
- library_dataset.refresh()
+ trans.sa_session.refresh( library_dataset )
# Set the LIBRARY permissions on the LibraryDatasetDatasetAssociation
trans.app.security_agent.set_all_library_permissions( library_dataset.library_dataset_dataset_association, permissions )
- library_dataset.library_dataset_dataset_association.refresh()
+ trans.sa_session.refresh( library_dataset.library_dataset_dataset_association )
msg = 'Permissions and roles have been updated for library dataset %s' % library_dataset.name
return trans.fill_template( '/admin/library/library_dataset_permissions.mako',
library_dataset=library_dataset,
@@ -411,7 +413,7 @@
params = util.Params( kwd )
msg = util.restore_text( params.get( 'msg', '' ) )
messagetype = params.get( 'messagetype', 'done' )
- ldda = trans.app.model.LibraryDatasetDatasetAssociation.get( obj_id )
+ ldda = trans.sa_session.query( trans.app.model.LibraryDatasetDatasetAssociation ).get( obj_id )
if not ldda:
msg = "Invalid LibraryDatasetDatasetAssociation specified, obj_id: %s" % str( obj_id )
return trans.response.send_redirect( web.url_for( controller='library_admin',
@@ -528,7 +530,7 @@
msg = util.restore_text( params.get( 'msg', '' ) )
messagetype = params.get( 'messagetype', 'done' )
show_deleted = util.string_as_bool( params.get( 'show_deleted', False ) )
- ldda = trans.app.model.LibraryDatasetDatasetAssociation.get( obj_id )
+ ldda = trans.sa_session.query( trans.app.model.LibraryDatasetDatasetAssociation ).get( obj_id )
if not ldda:
msg = "Invalid LibraryDatasetDatasetAssociation specified, obj_id: %s" % str( obj_id )
return trans.response.send_redirect( web.url_for( controller='library_admin',
@@ -555,7 +557,7 @@
# Display permission form, permissions will be updated for all lddas simultaneously.
lddas = []
for obj_id in [ int( obj_id ) for obj_id in obj_ids ]:
- ldda = trans.app.model.LibraryDatasetDatasetAssociation.get( obj_id )
+ ldda = trans.sa_session.query( trans.app.model.LibraryDatasetDatasetAssociation ).get( obj_id )
if ldda is None:
msg = 'You specified an invalid LibraryDatasetDatasetAssociation obj_id: %s' % str( obj_id )
trans.response.send_redirect( web.url_for( controller='library_admin',
@@ -568,7 +570,7 @@
permissions = {}
accessible = False
for k, v in trans.app.model.Dataset.permitted_actions.items():
- in_roles = [ trans.app.model.Role.get( x ) for x in util.listify( params.get( k + '_in', [] ) ) ]
+ in_roles = [ trans.sa_session.query( trans.app.model.Role ).get( x ) for x in util.listify( params.get( k + '_in', [] ) ) ]
# At least 1 user must have every role associated with this dataset, or the dataset is inaccessible
if v == trans.app.security_agent.permitted_actions.DATASET_ACCESS:
if len( in_roles ) > 1:
@@ -599,19 +601,19 @@
for ldda in lddas:
# Set the DATASET permissions on the Dataset
trans.app.security_agent.set_all_dataset_permissions( ldda.dataset, permissions )
- ldda.dataset.refresh()
+ trans.sa_session.refresh( ldda.dataset )
permissions = {}
for k, v in trans.app.model.Library.permitted_actions.items():
- in_roles = [ trans.app.model.Role.get( x ) for x in util.listify( kwd.get( k + '_in', [] ) ) ]
+ in_roles = [ trans.sa_session.query( trans.app.model.Role ).get( x ) for x in util.listify( kwd.get( k + '_in', [] ) ) ]
permissions[ trans.app.security_agent.get_action( v.action ) ] = in_roles
for ldda in lddas:
# Set the LIBRARY permissions on the LibraryDataset
# NOTE: the LibraryDataset and LibraryDatasetDatasetAssociation will be set with the same permissions
trans.app.security_agent.set_all_library_permissions( ldda.library_dataset, permissions )
- ldda.library_dataset.refresh()
+ trans.sa_session.refresh( ldda.library_dataset )
# Set the LIBRARY permissions on the LibraryDatasetDatasetAssociation
trans.app.security_agent.set_all_library_permissions( ldda, permissions )
- ldda.refresh()
+ trans.sa_session.refresh( ldda )
if not accessible:
msg = "At least 1 user must have every role associated with accessing these %d datasets. " % len( lddas )
msg += "The roles you attempted to associate for access would make these datasets inaccessible by everyone, "
@@ -666,12 +668,12 @@
last_used_build = dbkey[0]
else:
last_used_build = dbkey
- folder = trans.app.model.LibraryFolder.get( folder_id )
+ folder = trans.sa_session.query( trans.app.model.LibraryFolder ).get( folder_id )
if folder and last_used_build in [ 'None', None, '?' ]:
last_used_build = folder.genome_build
replace_id = params.get( 'replace_id', None )
if replace_id not in [ None, 'None' ]:
- replace_dataset = trans.app.model.LibraryDataset.get( int( replace_id ) )
+ replace_dataset = trans.sa_session.query( trans.app.model.LibraryDataset ).get( int( replace_id ) )
if not last_used_build:
last_used_build = replace_dataset.library_dataset_dataset_association.dbkey
# Don't allow multiple datasets to be uploaded when replacing a dataset with a new version
@@ -729,10 +731,12 @@
yield build_name, dbkey, ( dbkey==last_used_build )
dbkeys = get_dbkey_options( last_used_build )
# Send list of roles to the form so the dataset can be associated with 1 or more of them.
- roles = trans.app.model.Role.filter( trans.app.model.Role.table.c.deleted==False ).order_by( trans.app.model.Role.c.name ).all()
+ roles = trans.sa_session.query( trans.app.model.Role ) \
+ .filter( trans.app.model.Role.table.c.deleted==False ) \
+ .order_by( trans.app.model.Role.table.c.name )
# Send the current history to the form to enable importing datasets from history to library
history = trans.get_history()
- history.refresh()
+ trans.sa_session.refresh( history )
# If we're using nginx upload, override the form action
action = web.url_for( controller='library_admin', action='upload_library_dataset' )
if upload_option == 'upload_file' and trans.app.config.nginx_upload_path:
@@ -758,7 +762,7 @@
msg = util.restore_text( params.get( 'msg', '' ) )
messagetype = params.get( 'messagetype', 'done' )
try:
- folder = trans.app.model.LibraryFolder.get( int( folder_id ) )
+ folder = trans.sa_session.query( trans.app.model.LibraryFolder ).get( int( folder_id ) )
except:
msg = "Invalid folder id: %s" % str( folder_id )
return trans.response.send_redirect( web.url_for( controller='library_admin',
@@ -768,12 +772,12 @@
messagetype='error' ) )
replace_id = params.get( 'replace_id', None )
if replace_id:
- replace_dataset = trans.app.model.LibraryDataset.get( replace_id )
+ replace_dataset = trans.sa_session.query( trans.app.model.LibraryDataset ).get( replace_id )
else:
replace_dataset = None
# See if the current history is empty
history = trans.get_history()
- history.refresh()
+ trans.sa_session.refresh( history )
if not history.active_datasets:
msg = 'Your current history is empty'
return trans.response.send_redirect( web.url_for( controller='library_admin',
@@ -787,7 +791,7 @@
dataset_names = []
created_ldda_ids = ''
for hda_id in hda_ids:
- hda = trans.app.model.HistoryDatasetAssociation.get( hda_id )
+ hda = trans.sa_session.query( trans.app.model.HistoryDatasetAssociation ).get( hda_id )
if hda:
ldda = hda.to_library_dataset_dataset_association( target_folder=folder, replace_dataset=replace_dataset )
created_ldda_ids = '%s,%s' % ( created_ldda_ids, str( ldda.id ) )
@@ -838,7 +842,9 @@
yield build_name, dbkey, ( dbkey==last_used_build )
dbkeys = get_dbkey_options( last_used_build )
# Send list of roles to the form so the dataset can be associated with 1 or more of them.
- roles = trans.app.model.Role.filter( trans.app.model.Role.table.c.deleted==False ).order_by( trans.app.model.Role.c.name ).all()
+ roles = trans.sa_session.query( trans.app.model.Role ) \
+ .filter( trans.app.model.Role.table.c.deleted==False ) \
+ .order_by( trans.app.model.Role.table.c.name )
return trans.fill_template( "/admin/library/upload.mako",
upload_option=upload_option,
library_id=library_id,
@@ -871,7 +877,7 @@
messagetype='error' ) )
if params.action == 'manage_permissions':
# We need the folder containing the LibraryDatasetDatasetAssociation(s)
- ldda = trans.app.model.LibraryDatasetDatasetAssociation.get( int( ldda_ids[0] ) )
+ ldda = trans.sa_session.query( trans.app.model.LibraryDatasetDatasetAssociation ).get( int( ldda_ids[0] ) )
trans.response.send_redirect( web.url_for( controller='library_admin',
action='ldda_manage_permissions',
library_id=library_id,
@@ -881,7 +887,7 @@
messagetype=messagetype ) )
elif params.action == 'delete':
for ldda_id in ldda_ids:
- ldda = trans.app.model.LibraryDatasetDatasetAssociation.get( ldda_id )
+ ldda = trans.sa_session.query( trans.app.model.LibraryDatasetDatasetAssociation ).get( ldda_id )
ldda.deleted = True
ldda.flush()
msg = "The selected datasets have been removed from this data library"
@@ -918,7 +924,7 @@
library_item_desc = 'Dataset'
else:
library_item_desc = library_item_type.capitalize()
- library_item = library_item_types[ library_item_type ].get( int( library_item_id ) )
+ library_item = trans.sa_session.query( library_item_types[ library_item_type ] ).get( int( library_item_id ) )
library_item.deleted = True
library_item.flush()
msg = util.sanitize_text( "%s '%s' has been marked deleted" % ( library_item_desc, library_item.name ) )
@@ -945,7 +951,7 @@
library_item_desc = 'Dataset'
else:
library_item_desc = library_item_type.capitalize()
- library_item = library_item_types[ library_item_type ].get( int( library_item_id ) )
+ library_item = trans.sa_session.query( library_item_types[ library_item_type ] ).get( int( library_item_id ) )
if library_item.purged:
msg = '%s %s has been purged, so it cannot be undeleted' % ( library_item_desc, library_item.name )
messagetype = 'error'
diff -r 20b319780138 -r 2c4ed83f76ef lib/galaxy/web/controllers/library_common.py
--- a/lib/galaxy/web/controllers/library_common.py Wed Oct 21 23:15:22 2009 -0400
+++ b/lib/galaxy/web/controllers/library_common.py Thu Oct 22 23:02:28 2009 -0400
@@ -22,7 +22,7 @@
ids = map( int, ids.split( "," ) )
states = states.split( "," )
for id, state in zip( ids, states ):
- data = self.app.model.LibraryDatasetDatasetAssociation.get( id )
+ data = trans.sa_session.query( self.app.model.LibraryDatasetDatasetAssociation ).get( id )
if data.state != state:
job_ldda = data
while job_ldda.copied_from_library_dataset_dataset_association:
@@ -184,7 +184,7 @@
def download_dataset_from_folder( self, trans, cntrller, obj_id, library_id=None, **kwd ):
"""Catches the dataset id and displays file contents as directed"""
# id must refer to a LibraryDatasetDatasetAssociation object
- ldda = trans.app.model.LibraryDatasetDatasetAssociation.get( obj_id )
+ ldda = trans.sa_session.query( trans.app.model.LibraryDatasetDatasetAssociation ).get( obj_id )
if not ldda.dataset:
msg = 'Invalid LibraryDatasetDatasetAssociation id %s received for file download' % str( obj_id )
return trans.response.send_redirect( web.url_for( controller=cntrller,
@@ -218,19 +218,19 @@
msg = util.restore_text( params.get( 'msg', '' ) )
messagetype = params.get( 'messagetype', 'done' )
if obj_id:
- library_item = trans.app.model.FormDefinition.get( int( obj_id ) )
+ library_item = trans.sa_session.query( trans.app.model.FormDefinition ).get( int( obj_id ) )
library_item_desc = 'information template'
response_id = obj_id
elif folder_id:
- library_item = trans.app.model.LibraryFolder.get( int( folder_id ) )
+ library_item = trans.sa_session.query( trans.app.model.LibraryFolder ).get( int( folder_id ) )
library_item_desc = 'folder'
response_id = folder_id
elif ldda_id:
- library_item = trans.app.model.LibraryDatasetDatasetAssociation.get( int( ldda_id ) )
+ library_item = trans.sa_session.query( trans.app.model.LibraryDatasetDatasetAssociation ).get( int( ldda_id ) )
library_item_desc = 'library dataset'
response_id = ldda_id
else:
- library_item = trans.app.model.Library.get( int( library_id ) )
+ library_item = trans.sa_session.query( trans.app.model.Library ).get( int( library_id ) )
library_item_desc = 'library'
response_id = library_id
forms = get_all_forms( trans,
@@ -244,7 +244,7 @@
msg=msg,
messagetype='done' ) )
if params.get( 'add_info_template_button', False ):
- form = trans.app.model.FormDefinition.get( int( kwd[ 'form_id' ] ) )
+ form = trans.sa_session.query( trans.app.model.FormDefinition ).get( int( kwd[ 'form_id' ] ) )
#fields = list( copy.deepcopy( form.fields ) )
form_values = trans.app.model.FormValues( form, [] )
form_values.flush()
@@ -280,13 +280,13 @@
messagetype = params.get( 'messagetype', 'done' )
folder_id = None
if library_item_type == 'library':
- library_item = trans.app.model.Library.get( library_item_id )
+ library_item = trans.sa_session.query( trans.app.model.Library ).get( library_item_id )
elif library_item_type == 'library_dataset':
- library_item = trans.app.model.LibraryDataset.get( library_item_id )
+ library_item = trans.sa_session.query( trans.app.model.LibraryDataset ).get( library_item_id )
elif library_item_type == 'folder':
- library_item = trans.app.model.LibraryFolder.get( library_item_id )
+ library_item = trans.sa_session.query( trans.app.model.LibraryFolder ).get( library_item_id )
elif library_item_type == 'library_dataset_dataset_association':
- library_item = trans.app.model.LibraryDatasetDatasetAssociation.get( library_item_id )
+ library_item = trans.sa_session.query( trans.app.model.LibraryDatasetDatasetAssociation ).get( library_item_id )
# This response_action method requires a folder_id
folder_id = library_item.library_dataset.folder.id
else:
@@ -310,7 +310,7 @@
if info_association:
template = info_association.template
info = info_association.info
- form_values = trans.app.model.FormValues.get( info.id )
+ form_values = trans.sa_session.query( trans.app.model.FormValues ).get( info.id )
# Update existing content only if it has changed
if form_values.content != field_contents:
form_values.content = field_contents
diff -r 20b319780138 -r 2c4ed83f76ef lib/galaxy/web/controllers/mobile.py
--- a/lib/galaxy/web/controllers/mobile.py Wed Oct 21 23:15:22 2009 -0400
+++ b/lib/galaxy/web/controllers/mobile.py Thu Oct 22 23:02:28 2009 -0400
@@ -11,19 +11,19 @@
@web.expose
def history_detail( self, trans, id ):
- history = trans.app.model.History.get( id )
+ history = trans.sa_session.query( trans.app.model.History ).get( id )
assert history.user == trans.user
return trans.fill_template( "mobile/history/detail.mako", history=history )
@web.expose
def dataset_detail( self, trans, id ):
- dataset = trans.app.model.HistoryDatasetAssociation.get( id )
+ dataset = trans.sa_session.query( trans.app.model.HistoryDatasetAssociation ).get( id )
assert dataset.history.user == trans.user
return trans.fill_template( "mobile/dataset/detail.mako", dataset=dataset )
@web.expose
def dataset_peek( self, trans, id ):
- dataset = trans.app.model.HistoryDatasetAssociation.get( id )
+ dataset = trans.sa_session.query( trans.app.model.HistoryDatasetAssociation ).get( id )
assert dataset.history.user == trans.user
return trans.fill_template( "mobile/dataset/peek.mako", dataset=dataset )
@@ -45,7 +45,7 @@
def __login( self, trans, email="", password="" ):
error = password_error = None
- user = model.User.filter( model.User.table.c.email==email ).first()
+ user = trans.sa_session.query( model.User ).filter_by( email = email ).first()
if not user:
error = "No such user"
elif user.deleted:
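The __login change also trades a hand-built column expression for filter_by(), which takes keyword arguments against mapped attribute names. Both forms exist in SQLAlchemy 0.5; reusing the illustrative sa_session and Item from the sketch above:

    user = sa_session.query( Item ).filter( Item.name == 'someone@example.org' ).first()
    user = sa_session.query( Item ).filter_by( name='someone@example.org' ).first()
    # .first() returns the first matching row or None, so "if not user:" works as a guard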
diff -r 20b319780138 -r 2c4ed83f76ef lib/galaxy/web/controllers/page.py
--- a/lib/galaxy/web/controllers/page.py Wed Oct 21 23:15:22 2009 -0400
+++ b/lib/galaxy/web/controllers/page.py Thu Oct 22 23:02:28 2009 -0400
@@ -65,7 +65,7 @@
page_slug_err = "Page id is required"
elif not VALID_SLUG_RE.match( page_slug ):
page_slug_err = "Page identifier must consist of only lowercase letters, numbers, and the '-' character"
- elif model.Page.filter_by( user=user, slug=page_slug ).first():
+ elif trans.sa_session.query( model.Page ).filter_by( user=user, slug=page_slug ).first():
page_slug_err = "Page id must be unique"
else:
# Create the new stored workflow
diff -r 20b319780138 -r 2c4ed83f76ef lib/galaxy/web/controllers/requests.py
--- a/lib/galaxy/web/controllers/requests.py Wed Oct 21 23:15:22 2009 -0400
+++ b/lib/galaxy/web/controllers/requests.py Thu Oct 22 23:02:28 2009 -0400
@@ -108,7 +108,7 @@
def __show_request(self, trans, id, add_sample=False):
try:
- request = trans.app.model.Request.get(id)
+ request = trans.sa_session.query( trans.app.model.Request ).get( id )
except:
return trans.response.send_redirect( web.url_for( controller='requests',
action='list',
@@ -130,7 +130,7 @@
'''
Shows the request details
'''
- request = trans.app.model.Request.get(id)
+ request = trans.sa_session.query( trans.app.model.Request ).get( id )
# list of widgets to be rendered on the request form
request_details = []
# main details
@@ -171,7 +171,7 @@
if field['type'] == 'AddressField':
if request.values.content[index]:
request_details.append(dict(label=field['label'],
- value=trans.app.model.UserAddress.get(int(request.values.content[index])).get_html(),
+ value=trans.sa_session.query( trans.app.model.UserAddress ).get( int( request.values.content[index] ) ).get_html(),
helptext=field['helptext']+' ('+req+')'))
else:
request_details.append(dict(label=field['label'],
@@ -220,7 +220,7 @@
msg = util.restore_text( params.get( 'msg', '' ) )
messagetype = params.get( 'messagetype', 'done' )
try:
- request = trans.app.model.Request.get(int(params.get('request_id', None)))
+ request = trans.sa_session.query( trans.app.model.Request ).get( int( params.get( 'request_id', None ) ) )
except:
return trans.response.send_redirect( web.url_for( controller='requests',
action='list',
@@ -313,7 +313,7 @@
sample_values.append(util.restore_text( params.get( 'sample_%i_field_%i' % (sample_index, field_index), '' ) ))
sample = request.has_sample(sample_name)
if sample:
- form_values = trans.app.model.FormValues.get(sample.values.id)
+ form_values = trans.sa_session.query( trans.app.model.FormValues ).get( sample.values.id )
form_values.content = sample_values
form_values.flush()
sample.name = new_sample_name
@@ -344,13 +344,13 @@
params = util.Params( kwd )
msg = util.restore_text( params.get( 'msg', '' ) )
messagetype = params.get( 'messagetype', 'done' )
- request = trans.app.model.Request.get(int(params.get('request_id', 0)))
+ request = trans.sa_session.query( trans.app.model.Request ).get( int( params.get( 'request_id', 0 ) ) )
current_samples, details, edit_mode = self.__update_samples( request, **kwd )
sample_index = int(params.get('sample_id', 0))
sample_name = current_samples[sample_index][0]
s = request.has_sample(sample_name)
if s:
- s.delete()
+ trans.sa_session.delete( s )
s.flush()
request.flush()
del current_samples[sample_index]
@@ -368,7 +368,8 @@
params = util.Params( kwd )
msg = util.restore_text( params.get( 'msg', '' ) )
messagetype = params.get( 'messagetype', 'done' )
- request = trans.app.model.Request.get(int(params.get('request_id', 0)))
+ # TODO: Fix the following - can we get a Request.id == 0???
+ request = trans.sa_session.query( trans.app.model.Request ).get(int(params.get('request_id', 0)))
current_samples, details, edit_mode = self.__update_samples( request, **kwd )
return trans.fill_template( '/requests/show_request.mako',
request=request,
@@ -379,7 +380,7 @@
edit_mode=edit_mode)
def __select_request_type(self, trans, rtid):
rt_ids = ['none']
- for rt in trans.app.model.RequestType.query().all():
+ for rt in trans.sa_session.query( trans.app.model.RequestType ):
if not rt.deleted:
rt_ids.append(str(rt.id))
select_reqtype = SelectField('select_request_type',
@@ -389,7 +390,7 @@
select_reqtype.add_option('Select one', 'none', selected=True)
else:
select_reqtype.add_option('Select one', 'none')
- for rt in trans.app.model.RequestType.query().all():
+ for rt in trans.sa_session.query( trans.app.model.RequestType ):
if not rt.deleted:
if rtid == rt.id:
select_reqtype.add_option(rt.name, rt.id, selected=True)
@@ -411,7 +412,7 @@
elif params.get('create', False) == 'True':
if params.get('create_request_button', False) == 'Save' \
or params.get('create_request_samples_button', False) == 'Add samples':
- request_type = trans.app.model.RequestType.get(int(params.select_request_type))
+ request_type = trans.sa_session.query( trans.app.model.RequestType ).get( int( params.select_request_type ) )
if not util.restore_text(params.get('name', '')):
msg = 'Please enter the <b>Name</b> of the request'
kwd['create'] = 'True'
@@ -448,7 +449,7 @@
msg = util.restore_text( params.get( 'msg', '' ) )
messagetype = params.get( 'messagetype', 'done' )
try:
- request_type = trans.app.model.RequestType.get(int(params.select_request_type))
+ request_type = trans.sa_session.query( trans.app.model.RequestType ).get( int( params.select_request_type ) )
except:
return trans.fill_template( '/requests/new_request.mako',
select_request_type=self.__select_request_type(trans, 'none'),
@@ -496,8 +497,9 @@
lib_id = str(request.library.id)
selected_lib = request.library
# get all permitted libraries for this user
- all_libraries = trans.app.model.Library.filter( trans.app.model.Library.table.c.deleted == False ) \
- .order_by( trans.app.model.Library.name ).all()
+ all_libraries = trans.sa_session.query( trans.app.model.Library ) \
+ .filter( trans.app.model.Library.table.c.deleted == False ) \
+ .order_by( trans.app.model.Library.name )
user, roles = trans.get_user_and_roles()
actions_to_check = [ trans.app.security_agent.permitted_actions.LIBRARY_ADD ]
libraries = odict()
@@ -600,16 +602,16 @@
This method saves a new request if request_id is None.
'''
params = util.Params( kwd )
- request_type = trans.app.model.RequestType.get(int(params.select_request_type))
+ request_type = trans.sa_session.query( trans.app.model.RequestType ).get( int( params.select_request_type ) )
name = util.restore_text(params.get('name', ''))
desc = util.restore_text(params.get('desc', ''))
# library
try:
- library = trans.app.model.Library.get(int(params.get('library_id', None)))
+ library = trans.sa_session.query( trans.app.model.Library ).get( int( params.get( 'library_id', None ) ) )
except:
library = None
try:
- folder = trans.app.model.LibraryFolder.get(int(params.get('folder_id', None)))
+ folder = trans.sa_session.query( trans.app.model.LibraryFolder ).get( int( params.get( 'folder_id', None ) ) )
except:
if library:
folder = library.root_folder
@@ -633,7 +635,7 @@
user_address.country = util.restore_text(params.get('field_%i_country' % index, ''))
user_address.phone = util.restore_text(params.get('field_%i_phone' % index, ''))
user_address.flush()
- trans.user.refresh()
+ trans.sa_session.refresh( trans.user )
values.append(int(user_address.id))
elif value == unicode('none'):
values.append('')
@@ -667,7 +669,7 @@
msg = util.restore_text( params.get( 'msg', '' ) )
messagetype = params.get( 'messagetype', 'done' )
try:
- request = trans.app.model.Request.get(int(params.get('request_id', None)))
+ request = trans.sa_session.query( trans.app.model.Request ).get( int( params.get( 'request_id', None ) ) )
except:
return trans.response.send_redirect( web.url_for( controller='requests',
action='list',
@@ -678,7 +680,7 @@
return self.__edit_request(trans, request.id, **kwd)
elif params.get('save_changes_request_button', False) == 'Save changes' \
or params.get('edit_samples_button', False) == 'Edit samples':
- request_type = trans.app.model.RequestType.get(int(params.select_request_type))
+ request_type = trans.sa_session.query( trans.app.model.RequestType ).get( int( params.select_request_type ) )
if not util.restore_text(params.get('name', '')):
msg = 'Please enter the <b>Name</b> of the request'
kwd['messagetype'] = 'error'
@@ -708,7 +710,7 @@
def __edit_request(self, trans, id, **kwd):
try:
- request = trans.app.model.Request.get(id)
+ request = trans.sa_session.query( trans.app.model.Request ).get( id )
except:
msg = "Invalid request ID"
log.warn( msg )
@@ -750,7 +752,7 @@
return self.__show_request_form(trans)
def __delete_request(self, trans, id):
try:
- request = trans.app.model.Request.get(id)
+ request = trans.sa_session.query( trans.app.model.Request ).get( id )
except:
msg = "Invalid request ID"
log.warn( msg )
@@ -777,7 +779,7 @@
**kwd) )
def __undelete_request(self, trans, id):
try:
- request = trans.app.model.Request.get(id)
+ request = trans.sa_session.query( trans.app.model.Request ).get( id )
except:
msg = "Invalid request ID"
log.warn( msg )
@@ -798,7 +800,7 @@
**kwd) )
def __submit(self, trans, id):
try:
- request = trans.app.model.Request.get(id)
+ request = trans.sa_session.query( trans.app.model.Request ).get( id )
except:
msg = "Invalid request ID"
log.warn( msg )
@@ -837,7 +839,7 @@
params = util.Params( kwd )
try:
id = int(params.get('id', False))
- request = trans.app.model.Request.get(id)
+ request = trans.sa_session.query( trans.app.model.Request ).get( id )
except:
msg = "Invalid request ID"
log.warn( msg )
@@ -876,7 +878,7 @@
params = util.Params( kwd )
try:
sample_id = int(params.get('sample_id', False))
- sample = trans.app.model.Sample.get(sample_id)
+ sample = trans.sa_session.query( trans.app.model.Sample ).get( sample_id )
except:
msg = "Invalid sample ID"
return trans.response.send_redirect( web.url_for( controller='requests',
@@ -897,7 +899,3 @@
events_list=events_list,
sample_name=sample.name,
request=sample.request)
-
-
-
-
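Two more pieces of the same migration appear in this file: per-object refresh() and delete() become calls on the session. A sketch, again with the illustrative Item model rather than real Galaxy classes:

    item = sa_session.query( Item ).get( 1 )
    if item:
        sa_session.refresh( item )   # re-read current column values from the database
        sa_session.delete( item )    # mark the instance for deletion
        sa_session.flush()           # emit the DELETE; the diffs above still reach this
                                     # step through the per-object .flush() monkeypatch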
diff -r 20b319780138 -r 2c4ed83f76ef lib/galaxy/web/controllers/requests_admin.py
--- a/lib/galaxy/web/controllers/requests_admin.py Wed Oct 21 23:15:22 2009 -0400
+++ b/lib/galaxy/web/controllers/requests_admin.py Thu Oct 22 23:02:28 2009 -0400
@@ -50,11 +50,11 @@
grids.GridColumnFilter( "All", args=dict( deleted=False ) )
]
def get_user(self, trans, request):
- return trans.app.model.User.get(request.user_id).email
+ return trans.sa_session.query( trans.app.model.User ).get( request.user_id ).email
def get_current_item( self, trans ):
return None
def get_request_type(self, trans, request):
- request_type = trans.app.model.RequestType.get(request.request_type_id)
+ request_type = trans.sa_session.query( trans.app.model.RequestType ).get( request.request_type_id )
return request_type.name
def number_of_samples(self, trans, request):
return str(len(request.samples))
@@ -116,7 +116,7 @@
return self.request_grid( trans, **kwargs )
def __show_request(self, trans, id, messagetype, msg):
try:
- request = trans.app.model.Request.get(id)
+ request = trans.sa_session.query( trans.app.model.Request ).get( id )
except:
return trans.response.send_redirect( web.url_for( controller='requests_admin',
action='list',
@@ -136,7 +136,7 @@
def __edit_request(self, trans, id, **kwd):
try:
- request = trans.app.model.Request.get(id)
+ request = trans.sa_session.query( trans.app.model.Request ).get( id )
except:
msg = "Invalid request ID"
log.warn( msg )
@@ -178,7 +178,7 @@
return self.__show_request_form(trans)
def __delete_request(self, trans, id):
try:
- request = trans.app.model.Request.get(id)
+ request = trans.sa_session.query( trans.app.model.Request ).get( id )
except:
msg = "Invalid request ID"
log.warn( msg )
@@ -206,7 +206,7 @@
**kwd) )
def __undelete_request(self, trans, id):
try:
- request = trans.app.model.Request.get(id)
+ request = trans.sa_session.query( trans.app.model.Request ).get( id )
except:
msg = "Invalid request ID"
log.warn( msg )
@@ -228,7 +228,7 @@
**kwd) )
def __submit(self, trans, id):
try:
- request = trans.app.model.Request.get(id)
+ request = trans.sa_session.query( trans.app.model.Request ).get( id )
except:
msg = "Invalid request ID"
log.warn( msg )
@@ -267,7 +267,7 @@
#
def __select_request_type(self, trans, rtid):
rt_ids = ['none']
- for rt in trans.app.model.RequestType.query().all():
+ for rt in trans.sa_session.query( trans.app.model.RequestType ):
if not rt.deleted:
rt_ids.append(str(rt.id))
select_reqtype = SelectField('select_request_type',
@@ -277,7 +277,7 @@
select_reqtype.add_option('Select one', 'none', selected=True)
else:
select_reqtype.add_option('Select one', 'none')
- for rt in trans.app.model.RequestType.query().all():
+ for rt in trans.sa_session.query( trans.app.model.RequestType ):
if not rt.deleted:
if rtid == rt.id:
select_reqtype.add_option(rt.name, rt.id, selected=True)
@@ -299,7 +299,7 @@
elif params.get('create', False) == 'True':
if params.get('create_request_button', False) == 'Save' \
or params.get('create_request_samples_button', False) == 'Add samples':
- request_type = trans.app.model.RequestType.get(int(params.select_request_type))
+ request_type = trans.sa_session.query( trans.app.model.RequestType ).get( int( params.select_request_type ) )
if not util.restore_text(params.get('name', '')) \
or util.restore_text(params.get('select_user', '')) == unicode('none'):
msg = 'Please enter the <b>Name</b> of the request and the <b>user</b> on behalf of whom this request will be submitted before saving this request'
@@ -338,7 +338,7 @@
msg = util.restore_text( params.get( 'msg', '' ) )
messagetype = params.get( 'messagetype', 'done' )
try:
- request_type = trans.app.model.RequestType.get(int(params.select_request_type))
+ request_type = trans.sa_session.query( trans.app.model.RequestType ).get( int( params.select_request_type ) )
except:
return trans.fill_template( '/admin/requests/new_request.mako',
select_request_type=self.__select_request_type(trans, 'none'),
@@ -350,7 +350,7 @@
# user
user_id = params.get( 'select_user', 'none' )
try:
- user = trans.app.model.User.get(int(user_id))
+ user = trans.sa_session.query( trans.app.model.User ).get( int( user_id ) )
except:
user = None
# list of widgets to be rendered on the request form
@@ -378,7 +378,7 @@
messagetype=messagetype)
def __select_user(self, trans, userid):
user_ids = ['none']
- for user in trans.app.model.User.query().all():
+ for user in trans.sa_session.query( trans.app.model.User ):
if not user.deleted:
user_ids.append(str(user.id))
select_user = SelectField('select_user',
@@ -390,7 +390,7 @@
select_user.add_option('Select one', 'none')
def __get_email(user):
return user.email
- user_list = trans.app.model.User.query().all()
+ user_list = trans.sa_session.query( trans.app.model.User ).all()
user_list.sort(key=__get_email)
for user in user_list:
if not user.deleted:
@@ -423,8 +423,9 @@
libraries = {}
else:
# get all permitted libraries for this user
- all_libraries = trans.app.model.Library.filter( trans.app.model.Library.table.c.deleted == False ) \
- .order_by( trans.app.model.Library.name ).all()
+ all_libraries = trans.sa_session.query( trans.app.model.Library ) \
+ .filter( trans.app.model.Library.table.c.deleted == False ) \
+ .order_by( trans.app.model.Library.name )
roles = user.all_roles()
actions_to_check = [ trans.app.security_agent.permitted_actions.LIBRARY_ADD ]
# The libraries dictionary looks like: { library : '1,2' }, library : '3' }
@@ -539,20 +540,20 @@
This method saves a new request if request_id is None.
'''
params = util.Params( kwd )
- request_type = trans.app.model.RequestType.get(int(params.select_request_type))
+ request_type = trans.sa_session.query( trans.app.model.RequestType ).get( int( params.select_request_type ) )
if request:
user = request.user
else:
- user = trans.app.model.User.get(int(params.get('select_user', '')))
+ user = trans.sa_session.query( trans.app.model.User ).get( int( params.get( 'select_user', '' ) ) )
name = util.restore_text(params.get('name', ''))
desc = util.restore_text(params.get('desc', ''))
# library
try:
- library = trans.app.model.Library.get(int(params.get('library_id', None)))
+ library = trans.sa_session.query( trans.app.model.Library ).get( int( params.get( 'library_id', None ) ) )
except:
library = None
try:
- folder = trans.app.model.LibraryFolder.get(int(params.get('folder_id', None)))
+ folder = trans.sa_session.query( trans.app.model.LibraryFolder ).get( int( params.get( 'folder_id', None ) ) )
except:
if library:
folder = library.root_folder
@@ -576,7 +577,7 @@
user_address.country = util.restore_text(params.get('field_%i_country' % index, ''))
user_address.phone = util.restore_text(params.get('field_%i_phone' % index, ''))
user_address.flush()
- trans.user.refresh()
+ trans.sa_session.refresh( trans.user )
values.append(int(user_address.id))
elif value == unicode('none'):
values.append('')
@@ -614,7 +615,7 @@
msg = util.restore_text( params.get( 'msg', '' ) )
messagetype = params.get( 'messagetype', 'done' )
try:
- request = trans.app.model.Request.get(int(params.get('request_id', None)))
+ request = trans.sa_session.query( trans.app.model.Request ).get( int( params.get( 'request_id', None ) ) )
except:
return trans.response.send_redirect( web.url_for( controller='requests_admin',
action='list',
@@ -625,7 +626,7 @@
return self.__edit_request(trans, request.id, **kwd)
elif params.get('save_changes_request_button', False) == 'Save changes' \
or params.get('edit_samples_button', False) == 'Edit samples':
- request_type = trans.app.model.RequestType.get(int(params.select_request_type))
+ request_type = trans.sa_session.query( trans.app.model.RequestType ).get( int( params.select_request_type ) )
if not util.restore_text(params.get('name', '')):
msg = 'Please enter the <b>Name</b> of the request'
kwd['messagetype'] = 'error'
@@ -658,7 +659,7 @@
params = util.Params( kwd )
try:
id = int(params.get('id', False))
- request = trans.app.model.Request.get(id)
+ request = trans.sa_session.query( trans.app.model.Request ).get( id )
except:
msg = "Invalid request ID"
log.warn( msg )
@@ -727,7 +728,7 @@
msg = util.restore_text( params.get( 'msg', '' ) )
messagetype = params.get( 'messagetype', 'done' )
try:
- request = trans.app.model.Request.get(int(params.get('request_id', None)))
+ request = trans.sa_session.query( trans.app.model.Request ).get( int( params.get( 'request_id', None ) ) )
except:
return trans.response.send_redirect( web.url_for( controller='requests_admin',
action='list',
@@ -820,7 +821,7 @@
sample_values.append(util.restore_text( params.get( 'sample_%i_field_%i' % (sample_index, field_index), '' ) ))
sample = request.has_sample(sample_name)
if sample:
- form_values = trans.app.model.FormValues.get(sample.values.id)
+ form_values = trans.sa_session.query( trans.app.model.FormValues ).get( sample.values.id )
form_values.content = sample_values
form_values.flush()
sample.name = new_sample_name
@@ -851,13 +852,13 @@
params = util.Params( kwd )
msg = util.restore_text( params.get( 'msg', '' ) )
messagetype = params.get( 'messagetype', 'done' )
- request = trans.app.model.Request.get(int(params.get('request_id', 0)))
+ request = trans.sa_session.query( trans.app.model.Request ).get( int( params.get( 'request_id', 0 ) ) )
current_samples, details, edit_mode = self.__update_samples( request, **kwd )
sample_index = int(params.get('sample_id', 0))
sample_name = current_samples[sample_index][0]
s = request.has_sample(sample_name)
if s:
- s.delete()
+ trans.sa_session.delete( s )
s.flush()
request.flush()
del current_samples[sample_index]
@@ -875,7 +876,7 @@
params = util.Params( kwd )
msg = util.restore_text( params.get( 'msg', '' ) )
messagetype = params.get( 'messagetype', 'done' )
- request = trans.app.model.Request.get(int(params.get('request_id', 0)))
+ request = trans.sa_session.query( trans.app.model.Request ).get( int( params.get( 'request_id', 0 ) ) )
current_samples, details, edit_mode = self.__update_samples( request, **kwd )
return trans.fill_template( '/admin/requests/show_request.mako',
request=request,
@@ -890,7 +891,7 @@
'''
Shows the request details
'''
- request = trans.app.model.Request.get(id)
+ request = trans.sa_session.query( trans.app.model.Request ).get( id )
# list of widgets to be rendered on the request form
request_details = []
# main details
@@ -935,7 +936,7 @@
if field['type'] == 'AddressField':
if request.values.content[index]:
request_details.append(dict(label=field['label'],
- value=trans.app.model.UserAddress.get(int(request.values.content[index])).get_html(),
+ value=trans.sa_session.query( trans.app.model.UserAddress ).get( int( request.values.content[index] ) ).get_html(),
helptext=field['helptext']+' ('+req+')'))
else:
request_details.append(dict(label=field['label'],
@@ -954,7 +955,7 @@
messagetype = params.get( 'messagetype', 'done' )
request_id = params.get( 'request_id', None )
if request_id:
- request = trans.app.model.Request.get( int( request_id ))
+ request = trans.sa_session.query( trans.app.model.Request ).get( int( request_id ))
if not request:
return trans.response.send_redirect( web.url_for( controller='requests_admin',
action='list',
@@ -980,7 +981,7 @@
def save_bar_codes(self, trans, **kwd):
params = util.Params( kwd )
try:
- request = trans.app.model.Request.get(int(params.get('request_id', None)))
+ request = trans.sa_session.query( trans.app.model.Request ).get( int( params.get( 'request_id', None ) ) )
except:
return trans.response.send_redirect( web.url_for( controller='requests_admin',
action='list',
@@ -1008,7 +1009,7 @@
(bar_code, request.samples[index].name)
break
# check all the saved bar codes
- all_samples = trans.app.model.Sample.query.all()
+ all_samples = trans.sa_session.query( trans.app.model.Sample )
for sample in all_samples:
if bar_code == sample.bar_code:
msg = '''The bar code <b>%s</b> of sample <b>%s</b> already
@@ -1077,7 +1078,7 @@
params = util.Params( kwd )
try:
sample_id = int(params.get('sample_id', False))
- sample = trans.app.model.Sample.get(sample_id)
+ sample = trans.sa_session.query( trans.app.model.Sample ).get( sample_id )
except:
msg = "Invalid sample ID"
return trans.response.send_redirect( web.url_for( controller='requests_admin',
@@ -1087,8 +1088,10 @@
**kwd) )
comments = util.restore_text( params.comment )
selected_state = int( params.select_state )
- new_state = trans.app.model.SampleState.filter(trans.app.model.SampleState.table.c.request_type_id == sample.request.type.id
- and trans.app.model.SampleState.table.c.id == selected_state)[0]
+ new_state = trans.sa_session.query( trans.app.model.SampleState ) \
+ .filter( and_( trans.app.model.SampleState.table.c.request_type_id == sample.request.type.id,
+ trans.app.model.SampleState.table.c.id == selected_state ) ) \
+ .first()
event = trans.app.model.SampleEvent(sample, new_state, comments)
event.flush()
self.__set_request_state(sample.request)
@@ -1101,7 +1104,7 @@
params = util.Params( kwd )
try:
sample_id = int(params.get('sample_id', False))
- sample = trans.app.model.Sample.get(sample_id)
+ sample = trans.sa_session.query( trans.app.model.Sample ).get( sample_id )
except:
msg = "Invalid sample ID"
return trans.response.send_redirect( web.url_for( controller='requests_admin',
@@ -1134,7 +1137,7 @@
messagetype = params.get( 'messagetype', 'done' )
show_filter = util.restore_text( params.get( 'show_filter', 'Active' ) )
forms = get_all_forms(trans, all_versions=True)
- request_types_list = trans.app.model.RequestType.query().all()
+ request_types_list = trans.sa_session.query( trans.app.model.RequestType )
if show_filter == 'All':
request_types = request_types_list
elif show_filter == 'Deleted':
@@ -1184,8 +1187,8 @@
msg='Request type <b>%s</b> has been created' % st.name,
messagetype='done') )
elif params.get('view', False):
- rt = trans.app.model.RequestType.get(int(util.restore_text( params.id )))
- ss_list = trans.app.model.SampleState.filter(trans.app.model.SampleState.table.c.request_type_id == rt.id).all()
+ rt = trans.sa_session.query( trans.app.model.RequestType ).get( int( util.restore_text( params.id ) ) )
+ ss_list = trans.sa_session.query( trans.app.model.SampleState ).filter( trans.app.model.SampleState.table.c.request_type_id == rt.id )
return trans.fill_template( '/admin/requests/view_request_type.mako',
request_type=rt,
forms=get_all_forms( trans ),
@@ -1208,13 +1211,13 @@
rt = trans.app.model.RequestType()
rt.name = util.restore_text( params.name )
rt.desc = util.restore_text( params.description ) or ""
- rt.request_form = trans.app.model.FormDefinition.get(int( params.request_form_id ))
- rt.sample_form = trans.app.model.FormDefinition.get(int( params.sample_form_id ))
+ rt.request_form = trans.sa_session.query( trans.app.model.FormDefinition ).get( int( params.request_form_id ) )
+ rt.sample_form = trans.sa_session.query( trans.app.model.FormDefinition ).get( int( params.sample_form_id ) )
rt.flush()
# set sample states
- ss_list = trans.app.model.SampleState.filter(trans.app.model.SampleState.table.c.request_type_id == rt.id).all()
+ ss_list = trans.sa_session.query( trans.app.model.SampleState ).filter( trans.app.model.SampleState.table.c.request_type_id == rt.id )
for ss in ss_list:
- ss.delete()
+ trans.sa_session.delete( ss )
ss.flush()
for i in range( num_states ):
name = util.restore_text( params.get( 'state_name_%i' % i, None ))
@@ -1229,7 +1232,7 @@
params = util.Params( kwd )
msg = util.restore_text( params.get( 'msg', '' ) )
messagetype = params.get( 'messagetype', 'done' )
- rt = trans.app.model.RequestType.get(int(util.restore_text( params.request_type_id )))
+ rt = trans.sa_session.query( trans.app.model.RequestType ).get( int( util.restore_text( params.request_type_id ) ) )
rt.deleted = True
rt.flush()
return trans.response.send_redirect( web.url_for( controller='requests_admin',
@@ -1242,7 +1245,7 @@
params = util.Params( kwd )
msg = util.restore_text( params.get( 'msg', '' ) )
messagetype = params.get( 'messagetype', 'done' )
- rt = trans.app.model.RequestType.get(int(util.restore_text( params.request_type_id )))
+ rt = trans.sa_session.query( trans.app.model.RequestType ).get( int( util.restore_text( params.request_type_id ) ) )
rt.deleted = False
rt.flush()
return trans.response.send_redirect( web.url_for( controller='requests_admin',
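One fix in this file is more than cosmetic: the old SampleState filter combined two clauses with Python's "and", which does not build a SQL conjunction; it evaluates the clause objects for truthiness and typically yields only the last one, silently dropping a condition. and_() (or chained .filter() calls) builds the real conjunction. With the illustrative model from earlier:

    from sqlalchemy import and_

    # broken: Python "and" returns a single clause, not a conjunction
    #     query.filter( Item.id == 1 and Item.name == 'x' )
    rows = sa_session.query( Item ).filter( and_( Item.id == 1, Item.name == 'x' ) )
    rows = sa_session.query( Item ).filter( Item.id == 1 ).filter( Item.name == 'x' )  # equivalent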
diff -r 20b319780138 -r 2c4ed83f76ef lib/galaxy/web/controllers/root.py
--- a/lib/galaxy/web/controllers/root.py Wed Oct 21 23:15:22 2009 -0400
+++ b/lib/galaxy/web/controllers/root.py Thu Oct 22 23:02:28 2009 -0400
@@ -85,7 +85,7 @@
def dataset_state ( self, trans, id=None, stamp=None ):
if id is not None:
try:
- data = self.app.model.HistoryDatasetAssociation.get( id )
+ data = trans.sa_session.query( self.app.model.HistoryDatasetAssociation ).get( id )
except:
return trans.show_error_message( "Unable to check dataset %s." %str( id ) )
trans.response.headers['X-Dataset-State'] = data.state
@@ -99,7 +99,7 @@
def dataset_code( self, trans, id=None, hid=None, stamp=None ):
if id is not None:
try:
- data = self.app.model.HistoryDatasetAssociation.get( id )
+ data = trans.sa_session.query( self.app.model.HistoryDatasetAssociation ).get( id )
except:
return trans.show_error_message( "Unable to check dataset %s." %str( id ) )
trans.response.headers['Pragma'] = 'no-cache'
@@ -119,7 +119,7 @@
ids = map( int, ids.split( "," ) )
states = states.split( "," )
for id, state in zip( ids, states ):
- data = self.app.model.HistoryDatasetAssociation.get( id )
+ data = trans.sa_session.query( self.app.model.HistoryDatasetAssociation ).get( id )
if data.state != state:
job_hda = data
while job_hda.copied_from_history_dataset_association:
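A recurring detail in these diffs is the dropped .all(): a Query is already iterable, so materializing a list is only needed when list semantics matter (len(), sort(), indexing, or multiple passes). With the illustrative model:

    for item in sa_session.query( Item ).order_by( Item.name ):
        print item.name                          # iterate lazily; no list is built
    users = sa_session.query( Item ).all()       # materialize when a real list is needed
    users.sort( key=lambda u: u.name )           # e.g. for Python-side sorting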
27 Oct '09
details: http://www.bx.psu.edu/hg/galaxy/rev/6dcf496ad316
changeset: 2912:6dcf496ad316
user: Greg Von Kuster <greg(a)bx.psu.edu>
date: Fri Oct 23 13:31:11 2009 -0400
description:
Some missed fixes for the sqlalchemy 0.5 form of querying.
6 file(s) affected in this change:
lib/galaxy/datatypes/metadata.py
lib/galaxy/jobs/__init__.py
lib/galaxy/jobs/runners/local.py
lib/galaxy/jobs/runners/pbs.py
lib/galaxy/tools/actions/upload_common.py
lib/galaxy/tools/parameters/basic.py
diffs (112 lines):
diff -r 4448bc8ba587 -r 6dcf496ad316 lib/galaxy/datatypes/metadata.py
--- a/lib/galaxy/datatypes/metadata.py Fri Oct 23 12:53:25 2009 -0400
+++ b/lib/galaxy/datatypes/metadata.py Fri Oct 23 13:31:11 2009 -0400
@@ -391,7 +391,9 @@
return value
if DATABASE_CONNECTION_AVAILABLE:
try:
- return galaxy.model.MetadataFile.get( value )
+ # FIXME: GVK ( 10/23/09 ) Can we get a valid db session without this import?
+ from galaxy.model.mapping import context as sa_session
+ return sa_session.query( galaxy.model.MetadataFile ).get( value )
except:
#value was not a valid id
return None
@@ -569,9 +571,9 @@
log.debug( 'setting metadata externally failed for %s %s: %s' % ( dataset.__class__.__name__, dataset.id, rstring ) )
return rval
- def cleanup_external_metadata( self ):
+ def cleanup_external_metadata( self, sa_session ):
log.debug( 'Cleaning up external metadata files' )
- for metadata_files in galaxy.model.Job.get( self.job_id ).external_output_metadata:
+ for metadata_files in sa_session.query( galaxy.model.Job ).get( self.job_id ).external_output_metadata:
#we need to confirm that any MetadataTempFile files were removed, if not we need to remove them
#can occur if the job was stopped before completion, but a MetadataTempFile is used in the set_meta
MetadataTempFile.cleanup_from_JSON_dict_filename( metadata_files.filename_out )
@@ -581,7 +583,7 @@
os.remove( fname )
except Exception, e:
log.debug( 'Failed to cleanup external metadata file (%s) for %s: %s' % ( key, dataset_key, e ) )
- def set_job_runner_external_pid( self, pid ):
- for metadata_files in galaxy.model.Job.get( self.job_id ).external_output_metadata:
+ def set_job_runner_external_pid( self, pid, sa_session ):
+ for metadata_files in sa_session.query( galaxy.model.Job ).get( self.job_id ).external_output_metadata:
metadata_files.job_runner_external_pid = pid
metadata_files.flush()
diff -r 4448bc8ba587 -r 6dcf496ad316 lib/galaxy/jobs/__init__.py
--- a/lib/galaxy/jobs/__init__.py Fri Oct 23 12:53:25 2009 -0400
+++ b/lib/galaxy/jobs/__init__.py Fri Oct 23 13:31:11 2009 -0400
@@ -609,7 +609,7 @@
if self.working_directory is not None:
shutil.rmtree( self.working_directory )
if self.app.config.set_metadata_externally:
- self.external_output_metadata.cleanup_external_metadata()
+ self.external_output_metadata.cleanup_external_metadata( self.sa_session )
except:
log.exception( "Unable to cleanup job %d" % self.job_id )
diff -r 4448bc8ba587 -r 6dcf496ad316 lib/galaxy/jobs/runners/local.py
--- a/lib/galaxy/jobs/runners/local.py Fri Oct 23 12:53:25 2009 -0400
+++ b/lib/galaxy/jobs/runners/local.py Fri Oct 23 13:31:11 2009 -0400
@@ -19,6 +19,7 @@
def __init__( self, app ):
"""Start the job runner with 'nworkers' worker threads"""
self.app = app
+ self.sa_session = app.model.context
self.queue = Queue()
self.threads = []
nworkers = app.config.local_job_queue_workers
@@ -111,7 +112,7 @@
shell = True,
env = env,
preexec_fn = os.setpgrp )
- job_wrapper.external_output_metadata.set_job_runner_external_pid( external_metadata_proc.pid )
+ job_wrapper.external_output_metadata.set_job_runner_external_pid( external_metadata_proc.pid, self.sa_session )
external_metadata_proc.wait()
log.debug( 'execution of external set_meta finished for job %d' % job_wrapper.job_id )
diff -r 4448bc8ba587 -r 6dcf496ad316 lib/galaxy/jobs/runners/pbs.py
--- a/lib/galaxy/jobs/runners/pbs.py Fri Oct 23 12:53:25 2009 -0400
+++ b/lib/galaxy/jobs/runners/pbs.py Fri Oct 23 13:31:11 2009 -0400
@@ -80,6 +80,7 @@
if app.config.pbs_application_server and app.config.outputs_to_working_directory:
raise Exception( "pbs_application_server (file staging) and outputs_to_working_directory options are mutually exclusive" )
self.app = app
+ self.sa_session = app.model.context
# 'watched' and 'queue' are both used to keep track of jobs to watch.
# 'queue' is used to add new watched jobs, and can be called from
# any thread (usually by the 'queue_job' method). 'watched' must only
@@ -457,7 +458,7 @@
"""
Separated out so we can use the worker threads for it.
"""
- self.stop_job( self.app.model.Job.get( pbs_job_state.job_wrapper.job_id ) )
+ self.stop_job( self.sa_session.query( self.app.model.Job ).get( pbs_job_state.job_wrapper.job_id ) )
pbs_job_state.job_wrapper.fail( pbs_job_state.fail_message )
self.cleanup( ( pbs_job_state.ofile, pbs_job_state.efile, pbs_job_state.job_file ) )
diff -r 4448bc8ba587 -r 6dcf496ad316 lib/galaxy/tools/actions/upload_common.py
--- a/lib/galaxy/tools/actions/upload_common.py Fri Oct 23 12:53:25 2009 -0400
+++ b/lib/galaxy/tools/actions/upload_common.py Fri Oct 23 13:31:11 2009 -0400
@@ -71,7 +71,7 @@
user, roles = trans.get_user_and_roles()
for id in async_datasets:
try:
- data = data_obj.get( int( id ) )
+ data = trans.sa_session.query( data_obj ).get( int( id ) )
except:
log.exception( 'Unable to load precreated dataset (%s) sent in upload form' % id )
continue
diff -r 4448bc8ba587 -r 6dcf496ad316 lib/galaxy/tools/parameters/basic.py
--- a/lib/galaxy/tools/parameters/basic.py Fri Oct 23 12:53:25 2009 -0400
+++ b/lib/galaxy/tools/parameters/basic.py Fri Oct 23 13:31:11 2009 -0400
@@ -1235,7 +1235,7 @@
if value in [None, "None"]:
return None
if isinstance( value, list ):
- return [ trans.app.model.HistoryDatasetAssociation.get( v ) for v in value ]
+ return [ trans.sa_session.query( trans.app.model.HistoryDatasetAssociation ).get( v ) for v in value ]
elif isinstance( value, trans.app.model.HistoryDatasetAssociation ):
return value
else:
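Code that runs outside a web request has no trans to hang a session on, which is why the runners above capture app.model.context at construction time and thread it into helpers. A hypothetical skeleton of that pattern (ExampleRunner and its method name are made up; app.model.context is the handle visible in the diffs):

    class ExampleRunner( object ):
        def __init__( self, app ):
            self.app = app
            self.sa_session = app.model.context   # keep a session handle for later use
        def load_job( self, job_id ):
            return self.sa_session.query( self.app.model.Job ).get( job_id )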
27 Oct '09
details: http://www.bx.psu.edu/hg/galaxy/rev/0dd49ce8eb3e
changeset: 2907:0dd49ce8eb3e
user: Kelly Vincent <kpvincent(a)bx.psu.edu>
date: Wed Oct 21 16:15:27 2009 -0400
description:
Added a paper citation to the Bowtie wrapper tool conf file
1 file(s) affected in this change:
tools/sr_mapping/bowtie_wrapper.xml
diffs (12 lines):
diff -r e6d01a8256bc -r 0dd49ce8eb3e tools/sr_mapping/bowtie_wrapper.xml
--- a/tools/sr_mapping/bowtie_wrapper.xml Wed Oct 21 16:04:24 2009 -0400
+++ b/tools/sr_mapping/bowtie_wrapper.xml Wed Oct 21 16:15:27 2009 -0400
@@ -409,7 +409,7 @@
**What it does**
-Bowtie_ is a short read aligner designed to be ultrafast and memory-efficient. It is developed by Ben Langmead and Cole Trapnell.
+Bowtie_ is a short read aligner designed to be ultrafast and memory-efficient. It is developed by Ben Langmead and Cole Trapnell. Please cite: Langmead B, Trapnell C, Pop M, Salzberg SL. Ultrafast and memory-efficient alignment of short DNA sequences to the human genome. Genome Biology 10:R25.
.. _Bowtie: http://bowtie-bio.sourceforge.net/index.shtml
details: http://www.bx.psu.edu/hg/galaxy/rev/20b319780138
changeset: 2908:20b319780138
user: Kanwei Li <kanwei(a)gmail.com>
date: Wed Oct 21 23:15:22 2009 -0400
description:
fix dataset link that broke ucsc links
4 file(s) affected in this change:
lib/galaxy/web/buildapp.py
lib/galaxy/web/controllers/dataset.py
templates/dataset/large_file.mako
templates/root/history_common.mako
diffs (68 lines):
diff -r 0dd49ce8eb3e -r 20b319780138 lib/galaxy/web/buildapp.py
--- a/lib/galaxy/web/buildapp.py Wed Oct 21 16:15:27 2009 -0400
+++ b/lib/galaxy/web/buildapp.py Wed Oct 21 23:15:22 2009 -0400
@@ -77,7 +77,7 @@
webapp.add_route( '/async/:tool_id/:data_id/:data_secret', controller='async', action='index', tool_id=None, data_id=None, data_secret=None )
webapp.add_route( '/:controller/:action', action='index' )
webapp.add_route( '/:action', controller='root', action='index' )
- webapp.add_route( '/datasets/:encoded_id/:action/:filename', controller='dataset', action='index', encoded_id=None, filename=None)
+ webapp.add_route( '/datasets/:dataset_id/:action/:filename', controller='dataset', action='index', dataset_id=None, filename=None)
webapp.add_route( '/u/:username/p/:slug', controller='page', action='display_by_username_and_slug' )
webapp.finalize_config()
# Wrap the webapp in some useful middleware
diff -r 0dd49ce8eb3e -r 20b319780138 lib/galaxy/web/controllers/dataset.py
--- a/lib/galaxy/web/controllers/dataset.py Wed Oct 21 16:15:27 2009 -0400
+++ b/lib/galaxy/web/controllers/dataset.py Wed Oct 21 23:15:22 2009 -0400
@@ -199,14 +199,14 @@
return 'This link may not be followed from within Galaxy.'
@web.expose
- def display(self, trans, encoded_id=None, preview=False, filename=None, to_ext=None, **kwd):
+ def display(self, trans, dataset_id=None, preview=False, filename=None, to_ext=None, **kwd):
"""Catches the dataset id and displays file contents as directed"""
# DEPRECATION: We still support unencoded ids for backward compatibility
try:
- dataset_id = int( encoded_id )
+ dataset_id = int( dataset_id )
except ValueError:
- dataset_id = trans.security.decode_id( encoded_id )
+ dataset_id = trans.security.decode_id( dataset_id )
data = trans.app.model.HistoryDatasetAssociation.get( dataset_id )
if not data:
raise paste.httpexceptions.HTTPRequestRangeNotSatisfiable( "Invalid reference dataset id: %s." % str( dataset_id ) )
diff -r 0dd49ce8eb3e -r 20b319780138 templates/dataset/large_file.mako
--- a/templates/dataset/large_file.mako Wed Oct 21 16:15:27 2009 -0400
+++ b/templates/dataset/large_file.mako Wed Oct 21 23:15:22 2009 -0400
@@ -2,8 +2,8 @@
<div class="warningmessagelarge">
This dataset is large and only the first megabyte is shown below.<br />
- <a href="${h.url_for( controller='dataset', action='display', encoded_id=trans.security.encode_id( data.id ), filename='' )}">Show all</a> |
- <a href="${h.url_for( controller='dataset', action='display', encoded_id=trans.security.encode_id( data.id ), to_ext=data.ext )}">Save</a>
+ <a href="${h.url_for( controller='dataset', action='display', dataset_id=trans.security.encode_id( data.id ), filename='' )}">Show all</a> |
+ <a href="${h.url_for( controller='dataset', action='display', dataset_id=trans.security.encode_id( data.id ), to_ext=data.ext )}">Save</a>
</div>
<pre>
diff -r 0dd49ce8eb3e -r 20b319780138 templates/root/history_common.mako
--- a/templates/root/history_common.mako Wed Oct 21 16:15:27 2009 -0400
+++ b/templates/root/history_common.mako Wed Oct 21 23:15:22 2009 -0400
@@ -33,7 +33,7 @@
<img src="${h.url_for('/static/images/pencil_icon_grey.png')}" width='16' height='16' alt='edit attributes' title='edit attributes' class='button edit' border='0'>
%endif
%else:
- <a class="icon-button display" title="display data" href="${h.url_for( controller='dataset', action='display', encoded_id=trans.security.encode_id( data.id ), preview=True, filename='' )}" target="galaxy_main"></a>
+ <a class="icon-button display" title="display data" href="${h.url_for( controller='dataset', action='display', dataset_id=trans.security.encode_id( data.id ), preview=True, filename='' )}" target="galaxy_main"></a>
%if user_owns_dataset:
<a class="icon-button edit" title="edit attributes" href="${h.url_for( controller='root', action='edit', id=data.id )}" target="galaxy_main"></a>
%endif
@@ -87,7 +87,7 @@
<div class="info">${_('Info: ')}${data.display_info()}</div>
<div>
%if data.has_data:
- <a href="${h.url_for( controller='dataset', action='display', encoded_id=trans.security.encode_id( data.id ), to_ext=data.ext )}">save</a>
+ <a href="${h.url_for( controller='dataset', action='display', dataset_id=trans.security.encode_id( data.id ), to_ext=data.ext )}">save</a>
%if user_owns_dataset:
| <a href="${h.url_for( controller='tool_runner', action='rerun', id=data.id )}" target="galaxy_main">rerun</a>
%endif
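The fix works because Routes ties three names together: the variable in the route template, the keyword passed to url_for, and the argument the controller method receives; all of them have to agree (here, dataset_id). A minimal sketch against the standalone routes package (the Mapper calls below assume its 1.x API):

    from routes import Mapper

    mapper = Mapper()
    mapper.connect( '/datasets/:dataset_id/:action', controller='dataset', action='index' )
    mapper.create_regs( [ 'dataset' ] )
    print mapper.match( '/datasets/abc123/display' )
    # {'controller': 'dataset', 'action': 'display', 'dataset_id': 'abc123'}
    print mapper.generate( controller='dataset', action='display', dataset_id='abc123' )
    # /datasets/abc123/display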
27 Oct '09
details: http://www.bx.psu.edu/hg/galaxy/rev/8a619f8e2fb6
changeset: 2905:8a619f8e2fb6
user: James Taylor <james(a)jamestaylor.org>
date: Wed Oct 21 15:45:01 2009 -0400
description:
Fix for tool tests. Comment out the get_peek test; it doesn't appear correct (the result is unicode and forced to multiple lines; when did this change?)
3 file(s) affected in this change:
lib/galaxy/datatypes/data.py
scripts/functional_tests.py
test/functional/test_toolbox.py
diffs (64 lines):
diff -r 597f813860c1 -r 8a619f8e2fb6 lib/galaxy/datatypes/data.py
--- a/lib/galaxy/datatypes/data.py Wed Oct 21 15:03:09 2009 -0400
+++ b/lib/galaxy/datatypes/data.py Wed Oct 21 15:45:01 2009 -0400
@@ -35,7 +35,7 @@
'test'
>>> DataTest.metadata_spec.test.desc
'test'
- >>> DataTest.metadata_spec.test.param
+ >>> type( DataTest.metadata_spec.test.param )
<class 'galaxy.datatypes.metadata.MetadataParameter'>
"""
@@ -411,9 +411,9 @@
"""
Returns the first LINE_COUNT lines wrapped to WIDTH
- >>> fname = get_test_fname('4.bed')
- >>> get_file_peek(fname)
- 'chr22 30128507 31828507 uc003bnx.1_cds_2_0_chr22_29227_f 0 +\n'
+ ## >>> fname = get_test_fname('4.bed')
+ ## >>> get_file_peek(fname)
+ ## 'chr22 30128507 31828507 uc003bnx.1_cds_2_0_chr22_29227_f 0 +\n'
"""
lines = []
count = 0
diff -r 597f813860c1 -r 8a619f8e2fb6 scripts/functional_tests.py
--- a/scripts/functional_tests.py Wed Oct 21 15:03:09 2009 -0400
+++ b/scripts/functional_tests.py Wed Oct 21 15:45:01 2009 -0400
@@ -144,6 +144,7 @@
if app:
# TODO: provisions for loading toolbox from file when using external server
functional.test_toolbox.toolbox = app.toolbox
+ functional.test_toolbox.build_tests()
else:
# FIXME: This doesn't work at all now that toolbox requires an 'app' instance
# (to get at datatypes, might just pass a datatype registry directly)
diff -r 597f813860c1 -r 8a619f8e2fb6 test/functional/test_toolbox.py
--- a/test/functional/test_toolbox.py Wed Oct 21 15:03:09 2009 -0400
+++ b/test/functional/test_toolbox.py Wed Oct 21 15:45:01 2009 -0400
@@ -100,7 +100,7 @@
expanded_inputs[value.name] = declared_inputs[value.name]
return expanded_inputs
-def get_testcase( testdef, name ):
+def get_case( testdef, name ):
"""Dynamically generate a `ToolTestCase` for `testdef`"""
n = "TestForTool_" + testdef.tool.id.replace( ' ', '_' )
s = ( ToolTestCase, )
@@ -109,7 +109,7 @@
d = dict( testdef=testdef, test_tool=test_tool, name=name )
return new.classobj( n, s, d )
-def setup():
+def build_tests():
"""
If the module level variable `toolbox` is set, generate `ToolTestCase`
classes for all of its tests and put them into this modules globals() so
@@ -124,5 +124,5 @@
if tool.tests:
for j, testdef in enumerate( tool.tests ):
name = "%s ( %s ) > %s" % ( tool.name, tool.id, testdef.name )
- testcase = get_testcase( testdef, name )
+ testcase = get_case( testdef, name )
G[ 'testcase_%d_%d' % ( i, j ) ] = testcase
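The doctest change works because repr() of a plain instance embeds a memory address, which can never match a doctest's expected output, while the repr of its class is stable. In miniature (the class below is a stand-in, not the real MetadataParameter):

    class MetadataParameterExample( object ):
        pass

    p = MetadataParameterExample()
    print p           # <__main__.MetadataParameterExample object at 0x...> -- unstable
    print type( p )   # <class '__main__.MetadataParameterExample'> -- doctest-friendly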
27 Oct '09
details: http://www.bx.psu.edu/hg/galaxy/rev/e6d01a8256bc
changeset: 2906:e6d01a8256bc
user: Kelly Vincent <kpvincent(a)bx.psu.edu>
date: Wed Oct 21 16:04:24 2009 -0400
description:
Updated solid_to_fastq so that -1 qualities are converted to 1 before conversion to FASTQ.
3 file(s) affected in this change:
tools/next_gen_conversion/bwa_solid2fastq_modified.pl
tools/next_gen_conversion/solid_to_fastq.py
tools/next_gen_conversion/solid_to_fastq.xml
diffs (78 lines):
diff -r 8a619f8e2fb6 -r e6d01a8256bc tools/next_gen_conversion/bwa_solid2fastq_modified.pl
--- a/tools/next_gen_conversion/bwa_solid2fastq_modified.pl Wed Oct 21 15:45:01 2009 -0400
+++ b/tools/next_gen_conversion/bwa_solid2fastq_modified.pl Wed Oct 21 16:04:24 2009 -0400
@@ -73,7 +73,7 @@
if (/^>(\d+)_(\d+)_(\d+)_[FR]3/) {
$key = sprintf("%.4d_%.4d_%.4d", $1, $2, $3); # this line could be improved on 64-bit machines
#print $key;
- die(qq/** unmatched read name: '$_' != '$_'\n/) unless ($_ eq $t);
+ die(qq/** unmatched read name: '$_' != '$t'\n/) unless ($_ eq $t);
my $name = "$1_$2_$3/$i";
$_ = substr(<$fhs>, 2);
tr/0123./ACGTN/;
diff -r 8a619f8e2fb6 -r e6d01a8256bc tools/next_gen_conversion/solid_to_fastq.py
--- a/tools/next_gen_conversion/solid_to_fastq.py Wed Oct 21 15:45:01 2009 -0400
+++ b/tools/next_gen_conversion/solid_to_fastq.py Wed Oct 21 16:04:24 2009 -0400
@@ -22,15 +22,29 @@
def stop_err( msg ):
sys.stderr.write( "%s\n" % msg )
sys.exit()
+
+def replaceNeg1(fin, fout):
+ line = fin.readline()
+ while line.strip():
+ fout.write(line.replace('-1', '1'))
+ line = fin.readline()
+ fout.seek(0)
+ return fout
def __main__():
#Parse Command Line
options, args = doc_optparse.parse( __doc__ )
+ # common temp file setup
+ tmpf = tempfile.NamedTemporaryFile() #forward reads
+ tmpqf = tempfile.NamedTemporaryFile()
+ tmpqf = replaceNeg1(file(options.input2,'r'), tmpqf)
# if paired-end data (have reverse input files)
- tmpf = tempfile.NamedTemporaryFile() #forward reads
if options.input3 != "None" and options.input4 != "None":
tmpr = tempfile.NamedTemporaryFile() #reverse reads
- cmd1 = "%s/bwa_solid2fastq_modified.pl 'yes' %s %s %s %s %s %s 2>&1" %(os.path.split(sys.argv[0])[0], tmpf.name,tmpr.name,options.input1,options.input2,options.input3,options.input4)
+ # replace the -1 in the qualities file
+ tmpqr = tempfile.NamedTemporaryFile()
+ tmpqr = replaceNeg1(file(options.input4,'r'), tmpqr)
+ cmd1 = "%s/bwa_solid2fastq_modified.pl 'yes' %s %s %s %s %s %s 2>&1" %(os.path.split(sys.argv[0])[0], tmpf.name, tmpr.name, options.input1, tmpqf.name, options.input3, tmpqr.name)
try:
os.system(cmd1)
os.system('gunzip -c %s >> %s' %(tmpf.name,options.output1))
@@ -38,14 +52,17 @@
except Exception, eq:
stop_err("Error converting data to fastq format.\n" + str(eq))
tmpr.close()
+ tmpqr.close()
# if single-end data
else:
- cmd1 = "%s/bwa_solid2fastq_modified.pl 'no' %s %s %s %s %s %s 2>&1" % (os.path.split(sys.argv[0])[0], tmpf.name, None, options.input1, options.input2, None, None)
+ cmd1 = "%s/bwa_solid2fastq_modified.pl 'no' %s %s %s %s %s %s 2>&1" % (os.path.split(sys.argv[0])[0], tmpf.name, None, options.input1, tmpqf.name, None, None)
try:
os.system(cmd1)
os.system('gunzip -c %s >> %s' % (tmpf.name, options.output1))
except Exception, eq:
stop_err("Error converting data to fastq format.\n" + str(eq))
+ tmpqf.close()
tmpf.close()
+ sys.stdout.write('converted SOLiD data')
if __name__=="__main__": __main__()
diff -r 8a619f8e2fb6 -r e6d01a8256bc tools/next_gen_conversion/solid_to_fastq.xml
--- a/tools/next_gen_conversion/solid_to_fastq.xml Wed Oct 21 15:45:01 2009 -0400
+++ b/tools/next_gen_conversion/solid_to_fastq.xml Wed Oct 21 16:04:24 2009 -0400
@@ -66,7 +66,7 @@
**What it does**
-This tool takes reads and quality files and converts them to FASTQ data ( Sanger variant ). Note that it also converts sequences to base pairs.
+This tool takes reads and quality files and converts them to FASTQ data ( Sanger variant ). Any -1 qualities are converted to 1 before being converted to FASTQ. Note that it also converts sequences to base pairs.
-----
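For the curious, the -1 rewrite is a plain substring replacement on each quality line, which is safe as long as quality values never legitimately contain the token '-1'. A behavior sketch on a made-up SOLiD quality line:

    qual = '27 -1 30 -1 22\n'
    print qual.replace( '-1', '1' ).strip()   # 27 1 30 1 22
    # replaceNeg1() above applies this per line until the first blank line,
    # then rewinds the temp file with seek(0) so the converter can read it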
details: http://www.bx.psu.edu/hg/galaxy/rev/597f813860c1
changeset: 2904:597f813860c1
user: Nate Coraor <nate(a)bx.psu.edu>
date: Wed Oct 21 15:03:09 2009 -0400
description:
Don't unpack dist-eggs
2 file(s) affected in this change:
lib/galaxy/eggs/__init__.py
scripts/dist-scramble.py
diffs (34 lines):
diff -r fd5943ff7b40 -r 597f813860c1 lib/galaxy/eggs/__init__.py
--- a/lib/galaxy/eggs/__init__.py Tue Oct 20 13:08:29 2009 -0400
+++ b/lib/galaxy/eggs/__init__.py Wed Oct 21 15:03:09 2009 -0400
@@ -112,7 +112,7 @@
unpack_zipfile( self.path, self.path + "-tmp" )
os.remove( self.path )
os.rename( self.path + "-tmp", self.path )
- def scramble( self ):
+ def scramble( self, dist=False ):
if self.path is None:
self.find()
if os.access( self.path, os.F_OK ):
@@ -148,7 +148,8 @@
shutil.copyfile( new_egg, self.path )
log.warning( "scramble(): Copied egg to:" )
log.warning( " %s" % self.path )
- self.unpack_if_needed()
+ if not dist:
+ self.unpack_if_needed()
for doppelganger in self.doppelgangers:
remove_file_or_path( doppelganger )
log.warning( "Removed conflicting egg: %s" % doppelganger )
diff -r fd5943ff7b40 -r 597f813860c1 scripts/dist-scramble.py
--- a/scripts/dist-scramble.py Tue Oct 20 13:08:29 2009 -0400
+++ b/scripts/dist-scramble.py Wed Oct 21 15:03:09 2009 -0400
@@ -31,7 +31,7 @@
sys.exit( 1 )
failed = []
for egg in egg_list:
- if not egg.scramble():
+ if not egg.scramble( dist=True ):
failed.append( egg.platform['galaxy'] )
if len( failed ):
print ""
27 Oct '09
details: http://www.bx.psu.edu/hg/galaxy/rev/52fadf5aabd9
changeset: 2896:52fadf5aabd9
user: Kanwei Li <kanwei(a)gmail.com>
date: Tue Oct 20 16:46:12 2009 -0400
description:
trackster: caching goes by zoom levels, greatly improving zooming performance. revamped UI
4 file(s) affected in this change:
static/scripts/packed/trackster.js
static/scripts/trackster.js
static/trackster.css
templates/tracks/browser.mako
diffs (439 lines):
diff -r 61684a4d6b9e -r 52fadf5aabd9 static/scripts/packed/trackster.js
--- a/static/scripts/packed/trackster.js Tue Oct 20 15:34:47 2009 -0400
+++ b/static/scripts/packed/trackster.js Tue Oct 20 16:46:12 2009 -0400
@@ -1,1 +1,1 @@
-var DENSITY=1000,DATA_ERROR="There was an error in indexing this dataset.",DATA_NONE="No data for this chrom/contig.",CACHED_TILES=200,CACHED_DATA=20;var View=function(a,b){this.chrom=a;this.tracks=[];this.max_low=0;this.max_high=b;this.low=this.max_low;this.high=this.max_high;this.length=this.max_high-this.max_low};$.extend(View.prototype,{add_track:function(a){a.view=this;this.tracks.push(a);if(a.init){a.init()}},redraw:function(){$("#overview-box").css({left:(this.low/this.length)*$("#overview-viewport").width(),width:Math.max(4,((this.high-this.low)/this.length)*$("#overview-viewport").width())}).show();$("#low").text(this.low);$("#high").text(this.high);for(var b=0,a=this.tracks.length;b<a;b++){this.tracks[b].draw()}$("#bottom-spacer").remove();$("#viewport").append('<div id="bottom-spacer" style="height: 200px;"></div>')},move:function(b,a){this.low=Math.max(this.max_low,Math.floor(b));this.high=Math.min(this.length,Math.ceil(a))},zoom_in:function(d,b){if(this.max_high
===0){return}var c=this.high-this.low;var e=c/d/2;var a;if(b===undefined){a=(this.low+this.high)/2}else{a=this.low+c*b/$(document).width()}this.low=Math.floor(a-e);this.high=Math.ceil(a+e);if(this.low<this.max_low){this.low=this.max_low;this.high=c/d}else{if(this.high>this.max_high){this.high=this.max_high;this.low=this.max_high-c/d}}if(this.high-this.low<1){this.high=this.low+1}},zoom_out:function(c){if(this.max_high===0){return}var a=(this.low+this.high)/2;var b=this.high-this.low;var d=b*c/2;this.low=Math.floor(Math.max(0,a-d));this.high=Math.ceil(Math.min(this.length,a+d))},left:function(b){var a=this.high-this.low;var c=Math.floor(a/b);if(this.low-c<0){this.low=0;this.high=this.low+a}else{this.low-=c;this.high-=c}},right:function(b){var a=this.high-this.low;var c=Math.floor(a/b);if(this.high+c>this.length){this.high=this.length;this.low=this.high-a}else{this.low+=c;this.high+=c}}});var Track=function(a,b){this.name=a;this.parent_element=b;this.make_container()};$.extend
(Track.prototype,{make_container:function(){this.header_div=$("<div class='track-header'>").text(this.name);this.content_div=$("<div class='track-content'>");this.container_div=$("<div class='track'></div>").append(this.header_div).append(this.content_div);this.parent_element.append(this.container_div)}});var TiledTrack=function(){this.tile_cache=new Cache(CACHED_TILES)};$.extend(TiledTrack.prototype,Track.prototype,{draw:function(){var i=this.view.low,c=this.view.high,e=c-i;var b=Math.pow(10,Math.ceil(Math.log(e/DENSITY)/Math.log(10)));b=Math.max(b,1);b=Math.min(b,100000);var l=$("<div style='position: relative;'></div>");this.content_div.children(":first").remove();this.content_div.append(l);var j=this.content_div.width(),d=this.content_div.height(),m=j/e;var g;var a=Math.floor(i/b/DENSITY);while((a*DENSITY*b)<c){var k=m+"_"+a;if(this.tile_cache[k]){g=this.tile_cache[k];var f=a*DENSITY*b;g.css({left:(f-this.view.low)*m});l.append(g)}else{g=this.draw_tile(b,a,l,m,d)}if(g){t
his.tile_cache[k]=g}a+=1}}});var LabelTrack=function(a){Track.call(this,null,a);this.container_div.addClass("label-track")};$.extend(LabelTrack.prototype,Track.prototype,{draw:function(){var c=this.view,d=c.high-c.low,g=Math.floor(Math.pow(10,Math.floor(Math.log(d)/Math.log(10)))),a=Math.floor(c.low/g)*g,e=this.content_div.width(),b=$("<div style='position: relative; height: 1.3em;'></div>");while(a<c.high){var f=(a-c.low)/d*e;b.append($("<div class='label'>"+a+"</div>").css({position:"absolute",left:f-1}));a+=g}this.content_div.children(":first").remove();this.content_div.append(b)}});var LineTrack=function(c,b,a){Track.call(this,c,$("#viewport"));TiledTrack.call(this);this.track_type="line";this.height_px=(a?a:100);this.container_div.addClass("line-track");this.dataset_id=b;this.cache=new Cache(CACHED_DATA)};$.extend(LineTrack.prototype,TiledTrack.prototype,{init:function(){var a=this;$.getJSON(data_url,{stats:true,track_type:a.track_type,chrom:a.view.chrom,low:null,high:n
ull,dataset_id:a.dataset_id},function(b){if(!b||b=="error"){a.content_div.addClass("error").text(DATA_ERROR)}else{if(b=="no data"){a.content_div.addClass("nodata").text(DATA_NONE)}else{a.content_div.css("height",a.height_px+"px");a.min_value=b.min;a.max_value=b.max;a.vertical_range=a.max_value-a.min_value;a.view.redraw()}}})},get_data:function(d,b){var c=this,a=b*DENSITY*d,f=(b+1)*DENSITY*d,e=d+"_"+b;$.getJSON(data_url,{track_type:this.track_type,chrom:this.view.chrom,low:a,high:f,dataset_id:this.dataset_id},function(g){c.cache[e]=g;$(document).trigger("redraw")})},draw_tile:function(d,a,m,p,n){if(!this.vertical_range){return}var h=a*DENSITY*d,b=DENSITY*d,c=$("<canvas class='tile'></canvas>"),l=d+"_"+a;if(!this.cache[l]){this.get_data(d,a);return}var g=this.cache[l];c.css({position:"absolute",top:0,left:(h-this.view.low)*p});c.get(0).width=Math.ceil(b*p);c.get(0).height=this.height_px;var o=c.get(0).getContext("2d");var e=false;o.beginPath();for(var f=0;f<g.length-1;f++){var
k=g[f][0]-h;var j=g[f][1];if(isNaN(j)){e=false}else{k=k*p;j=(j-this.min_value)/this.vertical_range*this.height_px;if(e){o.lineTo(k,j)}else{o.moveTo(k,j);e=true}}}o.stroke();m.append(c);return c}});var FeatureTrack=function(c,b,a){Track.call(this,c,$("#viewport"));TiledTrack.call(this);this.track_type="feature";this.height_px=(a?a:100);this.container_div.addClass("feature-track");this.dataset_id=b;this.zo_slots={};this.show_labels_scale=0.001;this.showing_labels=false;this.vertical_gap=10};$.extend(FeatureTrack.prototype,TiledTrack.prototype,{init:function(){var a=this;$.getJSON(data_url,{track_type:a.track_type,low:a.view.max_low,high:a.view.max_high,dataset_id:a.dataset_id,chrom:a.view.chrom},function(b){if(b=="error"){a.content_div.addClass("error").text(DATA_ERROR)}else{if(b.length===0||b=="no data"){a.content_div.addClass("nodata").text(DATA_NONE)}else{a.content_div.css("height",a.height_px+"px");a.values=b;a.calc_slots();a.slots=a.zo_slots;a.draw()}}})},calc_slots:func
tion(o){var c=[],b=this.container_div.width()/(this.view.high-this.view.low),g=this.show_labels_scale,a=this.view.max_high,e=this.view.max_low;if(o){this.zi_slots={}}var m=$("<canvas></canvas>").get(0).getContext("2d");for(var f=0,h=this.values.length;f<h;f++){var k,l,n=this.values[f];if(o){k=Math.floor(Math.max(e,(n.start-e)*g));k-=m.measureText(n.name).width;l=Math.ceil(Math.min(a,(n.end-e)*g))}else{k=Math.floor(Math.max(e,(n.start-e)*b));l=Math.ceil(Math.min(a,(n.end-e)*b))}var d=0;while(true){if(c[d]===undefined||c[d]<k){c[d]=l;if(o){this.zi_slots[n.name]=d}else{this.zo_slots[n.name]=d}break}d++}}this.height_px=c.length*this.vertical_gap+15;this.content_div.css("height",this.height_px+"px")},draw_tile:function(u,z,l,p,n){if(!this.values){return null}if(p>this.show_labels_scale&&!this.showing_labels){this.showing_labels=true;if(!this.zi_slots){this.calc_slots(true)}this.slots=this.zi_slots}else{if(p<=this.show_labels_scale&&this.showing_labels){this.showing_labels=false;t
his.slots=this.zo_slots}}var A=z*DENSITY*u,c=(z+1)*DENSITY*u,b=DENSITY*u;var s=Math.ceil(b*p),r=this.height_px,q=$("<canvas class='tile'></canvas>");q.css({position:"absolute",top:0,left:(A-this.view.low)*p});q.get(0).width=s;q.get(0).height=r;var t=q.get(0).getContext("2d");t.fillStyle="#000";t.font="10px monospace";t.textAlign="right";var w=0;for(var x=0,y=this.values.length;x<y;x++){var h=this.values[x];if(h.start<=c&&h.end>=A){var f=Math.floor(Math.max(0,(h.start-A)*p)),g=Math.ceil(Math.min(s,(h.end-A)*p)),e=this.slots[h.name]*this.vertical_gap;t.fillRect(f,e+5,g-f,1);if(this.showing_labels&&t.fillText){t.fillText(h.name,f,e+8)}var d,E;if(h.exon_start&&h.exon_end){d=Math.floor(Math.max(0,(h.exon_start-A)*p));E=Math.ceil(Math.min(s,(h.exon_end-A)*p));t.fillRect(d,e+4,E-d,3)}if(h.blocks){for(var v=0,C=h.blocks.length;v<C;v++){var o=h.blocks[v],m=Math.floor(Math.max(0,(o[0]-A)*p)),B=Math.ceil(Math.min(s,(o[1]-A)*p));var a=3,D=4;if(d&&m>=d&&B<=E){a=5;D=3}t.fillRect(m,e+D,B-m
,a)}}w++}}l.append(q);return q}});
\ No newline at end of file
+var DENSITY=1000,DATA_ERROR="There was an error in indexing this dataset.",DATA_NONE="No data for this chrom/contig.",CACHED_TILES=50,CACHED_DATA=20;function commatize(b){b+="";var a=/(\d+)(\d{3})/;while(a.test(b)){b=b.replace(a,"$1,$2")}return b}var View=function(b,a){this.chrom=b;this.tracks=[];this.max_low=0;this.max_high=a;this.center=(this.max_high-this.max_low)/2;this.span=this.max_high-this.max_low;this.zoom_factor=2;this.zoom_level=0};$.extend(View.prototype,{add_track:function(a){a.view=this;this.tracks.push(a);if(a.init){a.init()}},redraw:function(){var d=this.span/Math.pow(this.zoom_factor,this.zoom_level),b=this.center-(d/2),e=b+d;if(b<0){b=0;e=b+d}else{if(e>this.max_high){e=this.max_high;b=e-d}}this.low=Math.floor(b);this.high=Math.ceil(e);this.center=Math.round(this.low+(this.high-this.low)/2);$("#overview-box").css({left:(this.low/this.span)*$("#overview-viewport").width(),width:Math.max(12,((this.high-this.low)/this.span)*$("#overview-viewport").width())}).sh
ow();$("#low").text(commatize(this.low));$("#high").text(commatize(this.high));for(var c=0,a=this.tracks.length;c<a;c++){this.tracks[c].draw()}$("#bottom-spacer").remove();$("#viewport").append('<div id="bottom-spacer" style="height: 200px;"></div>')},zoom_in:function(a){if(this.max_high===0||this.high-this.low<5){return}if(a){this.center=a/$(document).width()*(this.high-this.low)+this.low}this.zoom_level+=1},zoom_out:function(){if(this.max_high===0){return}if(this.zoom_level<=0){this.zoom_level=0;return}this.zoom_level-=1}});var Track=function(a,b){this.name=a;this.parent_element=b;this.make_container()};$.extend(Track.prototype,{make_container:function(){this.header_div=$("<div class='track-header'>").text(this.name);this.content_div=$("<div class='track-content'>");this.container_div=$("<div class='track'></div>").append(this.header_div).append(this.content_div);this.parent_element.append(this.container_div)}});var TiledTrack=function(){this.tile_cache=new Cache(CACHED_TI
LES)};$.extend(TiledTrack.prototype,Track.prototype,{draw:function(){var i=this.view.low,c=this.view.high,e=c-i;var b=Math.pow(10,Math.ceil(Math.log(e/DENSITY)/Math.log(10)));b=Math.max(b,1);b=Math.min(b,100000);var l=$("<div style='position: relative;'></div>");this.content_div.children(":first").remove();this.content_div.append(l);var j=this.content_div.width(),d=this.content_div.height(),m=j/e;var g;var a=Math.floor(i/b/DENSITY);while((a*DENSITY*b)<c){var k=this.view.zoom_level+"_"+a;if(this.tile_cache[k]){g=this.tile_cache[k];var f=a*DENSITY*b;g.css({left:(f-this.view.low)*m});l.append(g)}else{g=this.draw_tile(b,a,l,m,d)}if(g){this.tile_cache[k]=g}a+=1}}});var LabelTrack=function(a){Track.call(this,null,a);this.container_div.addClass("label-track")};$.extend(LabelTrack.prototype,Track.prototype,{draw:function(){var c=this.view,d=c.high-c.low,g=Math.floor(Math.pow(10,Math.floor(Math.log(d)/Math.log(10)))),a=Math.floor(c.low/g)*g,e=this.content_div.width(),b=$("<div style=
'position: relative; height: 1.3em;'></div>");while(a<c.high){var f=(a-c.low)/d*e;b.append($("<div class='label'>"+commatize(a)+"</div>").css({position:"absolute",left:f-1}));a+=g}this.content_div.children(":first").remove();this.content_div.append(b)}});var LineTrack=function(c,b,a){Track.call(this,c,$("#viewport"));TiledTrack.call(this);this.track_type="line";this.height_px=(a?a:100);this.container_div.addClass("line-track");this.dataset_id=b;this.cache=new Cache(CACHED_DATA)};$.extend(LineTrack.prototype,TiledTrack.prototype,{init:function(){var a=this;$.getJSON(data_url,{stats:true,track_type:a.track_type,chrom:a.view.chrom,low:null,high:null,dataset_id:a.dataset_id},function(b){if(!b||b=="error"){a.content_div.addClass("error").text(DATA_ERROR)}else{if(b=="no data"){a.content_div.addClass("nodata").text(DATA_NONE)}else{a.content_div.css("height",a.height_px+"px");a.min_value=b.min;a.max_value=b.max;a.vertical_range=a.max_value-a.min_value;a.view.redraw()}}})},get_data:f
unction(d,b){var c=this,a=b*DENSITY*d,f=(b+1)*DENSITY*d,e=d+"_"+b;$.getJSON(data_url,{track_type:this.track_type,chrom:this.view.chrom,low:a,high:f,dataset_id:this.dataset_id},function(g){c.cache[e]=g;$(document).trigger("redraw")})},draw_tile:function(d,a,m,o){if(!this.vertical_range){return}var h=a*DENSITY*d,b=DENSITY*d,c=$("<canvas class='tile'></canvas>"),l=d+"_"+a;if(!this.cache[l]){this.get_data(d,a);return}var g=this.cache[l];c.css({position:"absolute",top:0,left:(h-this.view.low)*o});c.get(0).width=Math.ceil(b*o);c.get(0).height=this.height_px;var n=c.get(0).getContext("2d");var e=false;n.beginPath();for(var f=0;f<g.length-1;f++){var k=g[f][0]-h;var j=g[f][1];if(isNaN(j)){e=false}else{k=k*o;j=(j-this.min_value)/this.vertical_range*this.height_px;if(e){n.lineTo(k,j)}else{n.moveTo(k,j);e=true}}}n.stroke();m.append(c);return c}});var FeatureTrack=function(c,b,a){Track.call(this,c,$("#viewport"));TiledTrack.call(this);this.track_type="feature";this.height_px=(a?a:100);th
is.container_div.addClass("feature-track");this.dataset_id=b;this.zo_slots={};this.show_labels_scale=0.001;this.showing_labels=false;this.vertical_gap=10};$.extend(FeatureTrack.prototype,TiledTrack.prototype,{init:function(){var a=this;$.getJSON(data_url,{track_type:a.track_type,low:a.view.max_low,high:a.view.max_high,dataset_id:a.dataset_id,chrom:a.view.chrom},function(b){if(b=="error"){a.content_div.addClass("error").text(DATA_ERROR)}else{if(b.length===0||b=="no data"){a.content_div.addClass("nodata").text(DATA_NONE)}else{a.content_div.css("height",a.height_px+"px");a.values=b;a.calc_slots();a.slots=a.zo_slots;a.draw()}}})},calc_slots:function(o){var c=[],b=this.container_div.width()/(this.view.high-this.view.low),g=this.show_labels_scale,a=this.view.max_high,e=this.view.max_low;if(o){this.zi_slots={}}var m=$("<canvas></canvas>").get(0).getContext("2d");for(var f=0,h=this.values.length;f<h;f++){var k,l,n=this.values[f];if(o){k=Math.floor(Math.max(e,(n.start-e)*g));k-=m.mea
sureText(n.name).width;l=Math.ceil(Math.min(a,(n.end-e)*g))}else{k=Math.floor(Math.max(e,(n.start-e)*b));l=Math.ceil(Math.min(a,(n.end-e)*b))}var d=0;while(true){if(c[d]===undefined||c[d]<k){c[d]=l;if(o){this.zi_slots[n.name]=d}else{this.zo_slots[n.name]=d}break}d++}}this.height_px=c.length*this.vertical_gap+15;this.content_div.css("height",this.height_px+"px")},draw_tile:function(t,y,g,n){if(!this.values){return null}if(n>this.show_labels_scale&&!this.showing_labels){this.showing_labels=true;if(!this.zi_slots){this.calc_slots(true)}this.slots=this.zi_slots}else{if(n<=this.show_labels_scale&&this.showing_labels){this.showing_labels=false;this.slots=this.zo_slots}}var z=y*DENSITY*t,b=(y+1)*DENSITY*t,o=DENSITY*t;var r=Math.ceil(o*n),q=this.height_px,p=$("<canvas class='tile'></canvas>");p.css({position:"absolute",top:0,left:(z-this.view.low)*n});p.get(0).width=r;p.get(0).height=q;var s=p.get(0).getContext("2d");s.fillStyle="#000";s.font="10px monospace";s.textAlign="right";var
v=0;for(var w=0,x=this.values.length;w<x;w++){var f=this.values[w];if(f.start<=b&&f.end>=z){var e=Math.floor(Math.max(0,(f.start-z)*n)),h=Math.ceil(Math.min(r,(f.end-z)*n)),d=this.slots[f.name]*this.vertical_gap;s.fillRect(e,d+5,h-e,1);if(this.showing_labels&&s.fillText){s.fillText(f.name,e,d+8)}var c,D;if(f.exon_start&&f.exon_end){c=Math.floor(Math.max(0,(f.exon_start-z)*n));D=Math.ceil(Math.min(r,(f.exon_end-z)*n));s.fillRect(c,d+4,D-c,3)}if(f.blocks){for(var u=0,B=f.blocks.length;u<B;u++){var m=f.blocks[u],l=Math.floor(Math.max(0,(m[0]-z)*n)),A=Math.ceil(Math.min(r,(m[1]-z)*n));var a=3,C=4;if(c&&l>=c&&A<=D){a=5;C=3}s.fillRect(l,d+C,A-l,a)}}v++}}g.append(p);return p}});
\ No newline at end of file
diff -r 61684a4d6b9e -r 52fadf5aabd9 static/scripts/trackster.js
--- a/static/scripts/trackster.js Tue Oct 20 15:34:47 2009 -0400
+++ b/static/scripts/trackster.js Tue Oct 20 16:46:12 2009 -0400
@@ -5,17 +5,26 @@
var DENSITY = 1000,
DATA_ERROR = "There was an error in indexing this dataset.",
DATA_NONE = "No data for this chrom/contig.",
- CACHED_TILES = 200,
+ CACHED_TILES = 50,
CACHED_DATA = 20;
-var View = function( chrom, max_length ) {
+function commatize( number ) {
+ number += ''; // Convert to string
+ var rgx = /(\d+)(\d{3})/;
+ while (rgx.test(number)) {
+ number = number.replace(rgx, '$1' + ',' + '$2');
+ }
+ return number;
+}
+var View = function( chrom, max_high ) {
this.chrom = chrom;
this.tracks = [];
this.max_low = 0;
- this.max_high = max_length;
- this.low = this.max_low;
- this.high = this.max_high;
- this.length = this.max_high - this.max_low;
+ this.max_high = max_high;
+ this.center = (this.max_high - this.max_low) / 2;
+ this.span = this.max_high - this.max_low;
+ this.zoom_factor = 2;
+ this.zoom_level = 0;
};
$.extend( View.prototype, {
add_track: function ( track ) {
@@ -24,82 +33,55 @@
if (track.init) { track.init(); }
},
redraw: function () {
+ var span = this.span / Math.pow(this.zoom_factor, this.zoom_level),
+ low = this.center - (span / 2),
+ high = low + span;
+
+ if (low < 0) {
+ low = 0;
+ high = low + span;
+
+ } else if (high > this.max_high) {
+ high = this.max_high;
+ low = high - span;
+ }
+ this.low = Math.floor(low);
+ this.high = Math.ceil(high);
+ this.center = Math.round( this.low + (this.high - this.low) / 2 );
+
// Overview
$("#overview-box").css( {
- left: ( this.low / this.length ) * $("#overview-viewport").width(),
- width: Math.max( 4, ( ( this.high - this.low ) / this.length ) * $("#overview-viewport").width() )
+ left: ( this.low / this.span ) * $("#overview-viewport").width(),
+ // Minimum width for usability
+ width: Math.max( 12, ( ( this.high - this.low ) / this.span ) * $("#overview-viewport").width() )
}).show();
- $("#low").text( this.low );
- $("#high").text( this.high );
+ $("#low").text( commatize(this.low) );
+ $("#high").text( commatize(this.high) );
for ( var i = 0, len = this.tracks.length; i < len; i++ ) {
this.tracks[i].draw();
}
$("#bottom-spacer").remove();
$("#viewport").append('<div id="bottom-spacer" style="height: 200px;"></div>');
},
- move: function ( new_low, new_high ) {
- this.low = Math.max( this.max_low, Math.floor( new_low ) );
- this.high = Math.min( this.length, Math.ceil( new_high ) );
+ zoom_in: function ( point ) {
+ if (this.max_high === 0 || this.high - this.low < 5) {
+ return;
+ }
+
+ if ( point ) {
+ this.center = point / $(document).width() * (this.high - this.low) + this.low;
+ }
+ this.zoom_level += 1;
},
- zoom_in: function ( factor, point ) {
+ zoom_out: function () {
if (this.max_high === 0) {
return;
}
-
- var range = this.high - this.low;
- var diff = range / factor / 2;
- var center;
-
- if (point === undefined) {
- center = ( this.low + this.high ) / 2;
- } else {
- center = this.low + range * point / $(document).width();
+ if (this.zoom_level <= 0) {
+ this.zoom_level = 0;
+ return;
}
- this.low = Math.floor( center - diff );
- this.high = Math.ceil( center + diff );
- if (this.low < this.max_low) {
- this.low = this.max_low;
- this.high = range / factor;
- } else if (this.high > this.max_high) {
- this.high = this.max_high;
- this.low = this.max_high - range / factor;
- // console.log(this.high, this.low);
- }
- if (this.high - this.low < 1 ) {
- this.high = this.low + 1;
- }
- },
- zoom_out: function ( factor ) {
- if (this.max_high === 0) {
- return;
- }
- var center = ( this.low + this.high ) / 2;
- var range = this.high - this.low;
- var diff = range * factor / 2;
- this.low = Math.floor( Math.max( 0, center - diff ) );
- this.high = Math.ceil( Math.min( this.length, center + diff ) );
- },
- left: function( factor ) {
- var range = this.high - this.low;
- var diff = Math.floor( range / factor );
- if ( this.low - diff < 0 ) {
- this.low = 0;
- this.high = this.low + range;
- } else {
- this.low -= diff;
- this.high -= diff;
- }
- },
- right: function ( factor ) {
- var range = this.high - this.low;
- var diff = Math.floor( range / factor );
- if ( this.high + diff > this.length ) {
- this.high = this.length;
- this.low = this.high - range;
- } else {
- this.low += diff;
- this.high += diff;
- }
+ this.zoom_level -= 1;
}
});
@@ -143,7 +125,7 @@
var tile_index = Math.floor( low / resolution / DENSITY );
while ( ( tile_index * DENSITY * resolution ) < high ) {
// Check in cache
- var key = w_scale + "_" + tile_index;
+ var key = this.view.zoom_level + "_" + tile_index;
if ( this.tile_cache[key] ) {
// console.log("cached tile");
tile_element = this.tile_cache[key];
@@ -178,7 +160,7 @@
new_div = $("<div style='position: relative; height: 1.3em;'></div>");
while ( position < view.high ) {
var screenPosition = ( position - view.low ) / range * width;
- new_div.append( $("<div class='label'>" + position + "</div>").css( {
+ new_div.append( $("<div class='label'>" + commatize( position ) + "</div>").css( {
position: "absolute",
// Reduce by one to account for border
left: screenPosition - 1
@@ -231,7 +213,7 @@
$(document).trigger( "redraw" );
});
},
- draw_tile: function( resolution, tile_index, parent_element, w_scale, h_scale ) {
+ draw_tile: function( resolution, tile_index, parent_element, w_scale ) {
if (!this.vertical_range) { // We don't have the necessary information yet
return;
}
@@ -354,11 +336,11 @@
this.height_px = end_ary.length * this.vertical_gap + 15;
this.content_div.css( "height", this.height_px + "px" );
},
- draw_tile: function( resolution, tile_index, parent_element, w_scale, h_scale ) {
+ draw_tile: function( resolution, tile_index, parent_element, w_scale ) {
if (!this.values) { // Still loading
return null;
}
- // Once we zoom in enough, show name labels
+ // Once we zoom in enough, show name labels
if (w_scale > this.show_labels_scale && !this.showing_labels) {
this.showing_labels = true;
if (!this.zi_slots) {
@@ -373,9 +355,9 @@
var tile_low = tile_index * DENSITY * resolution,
tile_high = ( tile_index + 1 ) * DENSITY * resolution,
- tile_length = DENSITY * resolution;
+ tile_span = DENSITY * resolution;
// console.log(tile_low, tile_high, tile_length, w_scale);
- var width = Math.ceil( tile_length * w_scale ),
+ var width = Math.ceil( tile_span * w_scale ),
height = this.height_px,
new_canvas = $("<canvas class='tile'></canvas>");
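One small helper worth calling out above: commatize inserts thousands separators by reapplying a regex until no run of four or more digits remains. Copied from the diff, with a couple of illustrative calls:

    function commatize( number ) {
        number += ''; // Convert to string
        var rgx = /(\d+)(\d{3})/;
        while (rgx.test(number)) {
            number = number.replace(rgx, '$1' + ',' + '$2');
        }
        return number;
    }
    commatize( 1234567 );  // "1,234,567"
    commatize( 987 );      // "987"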
diff -r 61684a4d6b9e -r 52fadf5aabd9 static/trackster.css
--- a/static/trackster.css Tue Oct 20 15:34:47 2009 -0400
+++ b/static/trackster.css Tue Oct 20 16:46:12 2009 -0400
@@ -2,16 +2,17 @@
margin: 0 0;
padding: 0;
font-family: verdana;
- font-size: 75%;
+ font-size: 12px;
overflow-y: hidden;
}
#content {
width: 100%;
+ height: 100%;
}
-#center {
- overflow: auto;
+#center {
+ overflow: auto;
}
#nav {
@@ -36,26 +37,35 @@
-webkit-border-radius: 1em;
-moz-border-radius: 1em;
}
+#low {
+ margin-left: 0.1em;
+ float: left;
+}
+#high {
+ margin-right: 0.1em;
+ float: right;
+}
#overview {
width: 100%;
- padding: 10px 0 0 0;
margin: 0px;
background: #333;
color: white;
- font-weight: bold;
}
#overview-viewport {
- height: 10px;
- border-top: solid #666 1px;
- border-bottom: solid #666 1px;
+ height: 20px;
+/* border-top: solid #666 1px;*/
+/* border-bottom: solid #aaa 1px;*/
background: #888;
}
#overview-box {
position: absolute;
- height: 10px;
- background: #ddd;
+ margin-top: 1px;
+ height: 16px;
+ background: #ddd url(/static/images/draggable_horizontal.png) center center no-repeat;
+ border-style: outset;
+ border-width: 1px;
}
#viewport {
@@ -100,6 +110,12 @@
min-height: 100px;
}
+.label-track {
+ font-weight: bold;
+ font-size: 10px;
+ background-color: #333;
+ color: white;
+}
.label-track .label {
border-left: solid gray 1px;
padding-left: 2px;
diff -r 61684a4d6b9e -r 52fadf5aabd9 templates/tracks/browser.mako
--- a/templates/tracks/browser.mako Tue Oct 20 15:34:47 2009 -0400
+++ b/templates/tracks/browser.mako Tue Oct 20 16:46:12 2009 -0400
@@ -16,7 +16,7 @@
$(function() {
- view.add_track( new LabelTrack( $("#overview" ) ) );
+ view.add_track( new LabelTrack( $("#viewport" ) ) );
view.add_track( new LabelTrack( $("#nav-labeltrack" ) ) );
%for track in tracks:
@@ -27,41 +27,35 @@
view.redraw();
});
- $(document).bind("mousewheel", function(e, delta) {
+ $(document).bind("mousewheel", function( e, delta ) {
if (delta > 0) {
- view.zoom_in(2, e.pageX);
+ view.zoom_in(e.pageX);
view.redraw();
} else {
- view.zoom_out(2);
+ view.zoom_out();
view.redraw();
}
});
- $(document).bind("dblclick", function(e) {
- view.zoom_in(2, e.pageX);
+ $(document).bind("dblclick", function( e ) {
+ view.zoom_in(e.pageX);
view.redraw();
});
- $("#overview-box").bind("dragstart", function(e) {
+ // To let the overview box be draggable
+ $("#overview-box").bind("dragstart", function( e ) {
this.current_x = e.offsetX;
- }).bind("drag", function(e) {
+ }).bind("drag", function( e ) {
var delta = e.offsetX - this.current_x;
this.current_x = e.offsetX;
- var delta_chrom = Math.round(delta / $(document).width() * (view.max_high - view.max_low));
- var view_range = view.high - view.low;
-
- var new_low = view.low += delta_chrom;
- var new_high = view.high += delta_chrom;
- if (new_low < view.max_low) {
- new_low = 0;
- new_high = view_range;
- } else if (new_high > view.max_high) {
- new_high = view.max_high;
- new_low = view.max_high - view_range;
+ var delta_chrom = Math.round(delta / $(document).width() * view.span);
+ view.center += delta_chrom;
+ if (view.center < 0) {
+ view.center = 0;
+ } else if (view.center > view.max_high) {
+ view.center = view.max_high;
}
- view.low = new_low;
- view.high = new_high;
view.redraw();
});
@@ -70,11 +64,12 @@
view.redraw();
});
- $("#viewport").bind( "dragstart", function ( e ) {
+ $("#viewport").bind( "dragstart", function( e ) {
this.original_low = view.low;
this.current_height = e.clientY;
+ this.current_x = e.offsetX;
}).bind( "drag", function( e ) {
- var move_amount = ( e.offsetX - this.offsetLeft ) / this.offsetWidth;
+ var delta = e.offsetX - this.current_x;
var new_scroll = $(this).scrollTop() - (e.clientY - this.current_height);
if ( new_scroll < $(this).get(0).scrollHeight - $(this).height() - 200) {
@@ -82,19 +77,10 @@
}
this.current_height = e.clientY;
- var range = view.high - view.low;
- var move_bases = Math.round( range * move_amount );
- var new_low = this.original_low - move_bases;
- if ( new_low < 0 ) {
- new_low = 0;
- }
- var new_high = new_low + range;
- if ( new_high > view.length ) {
- new_high = view.length;
- new_low = new_high - range;
- }
- view.low = new_low;
- view.high = new_high;
+ this.current_x = e.offsetX;
+
+ var delta_chrom = Math.round(delta / $(document).width() * (view.max_high - view.max_low));
+ view.center += delta_chrom;
view.redraw();
});
(function () {
@@ -122,28 +108,25 @@
</%def>
<div id="content">
+ <div id="viewport"></div>
+</div>
+<div id="nav">
+ <div id="nav-labeltrack"></div>
<div id="overview">
<div id="overview-viewport">
<div id="overview-box"></div>
</div>
</div>
- <div id="viewport"></div>
-</div>
-<div id="nav">
- <div id="nav-labeltrack"></div>
+ <span id="low"></span>
+ <span id="high"></span>
<div id="nav-controls">
<form name="chr" id="chr" method="get">
<select id="chrom" name="chrom">
<option value="">Loading</option>
</select>
<input type="hidden" name="dataset_ids" value="${dataset_ids}" />
- <a href="#" onclick="javascript:view.left(5);view.redraw();"><<</a>
- <a href="#" onclick="javascript:view.left(2);view.redraw();"><</a>
- <span id="low">0</span>—<span id="high">${LEN}</span>
- <a href="#" onclick="javascript:view.zoom_in(2);view.redraw();">+</a>
- <a href="#" onclick="javascript:view.zoom_out(2);view.redraw();">-</a>
- <a href="#" onclick="javascript:view.right(2);view.redraw();">></a>
- <a href="#" onclick="javascript:view.right(5);view.redraw();">>></a>
+ <a href="#" onclick="javascript:view.zoom_in();view.redraw();">+</a>
+ <a href="#" onclick="javascript:view.zoom_out();view.redraw();">-</a>
</form>
</div>
</div>
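Taken together, the browser.mako changes reduce panning to one rule: convert the pixel delta of a drag into a base-pair delta, shift the view's center, and let redraw() re-derive low and high (clamping at the chromosome ends). A condensed sketch of that pattern, assuming the view object from trackster.js and the drag/dragstart events the template already binds (they come from a jQuery drag plugin, not core jQuery):

    $("#viewport").bind( "dragstart", function( e ) {
        this.current_x = e.offsetX;              // remember where the drag started
    }).bind( "drag", function( e ) {
        var delta = e.offsetX - this.current_x;  // pixels moved since the last event
        this.current_x = e.offsetX;
        // Scale pixels to base pairs against the full chromosome span
        var delta_chrom = Math.round( delta / $(document).width() * view.span );
        view.center += delta_chrom;
        view.redraw();                           // redraw() clamps and re-derives low/high
    });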
27 Oct '09
details: http://www.bx.psu.edu/hg/galaxy/rev/ed8ea4318e9a
changeset: 2897:ed8ea4318e9a
user: Kanwei Li <kanwei(a)gmail.com>
date: Tue Oct 20 16:53:18 2009 -0400
description:
add image for trackster; use a relative image URL in the CSS
2 file(s) affected in this change:
static/images/draggable_horizontal.png
static/trackster.css
diffs (14 lines):
diff -r 52fadf5aabd9 -r ed8ea4318e9a static/images/draggable_horizontal.png
Binary file static/images/draggable_horizontal.png has changed
diff -r 52fadf5aabd9 -r ed8ea4318e9a static/trackster.css
--- a/static/trackster.css Tue Oct 20 16:46:12 2009 -0400
+++ b/static/trackster.css Tue Oct 20 16:53:18 2009 -0400
@@ -63,7 +63,7 @@
position: absolute;
margin-top: 1px;
height: 16px;
- background: #ddd url(/static/images/draggable_horizontal.png) center center no-repeat;
+ background: #ddd url(images/draggable_horizontal.png) center center no-repeat;
border-style: outset;
border-width: 1px;
}