Problems with Galaxy on a mapped drive
by Peter Cock
Hi all,
In my recent email I mentioned problems with our setup and mapped drives. I
am running a test Galaxy on a server under a CIFS mapped drive. If I mount the
drive with the noperms option, things seem to work (e.g. submitting jobs to the
cluster), but that doesn't seem secure at all. Mounting with strict permissions,
however, seems to cause various network-latency-related problems in Galaxy.
Specifically, while loading the converters and the history export tool, Galaxy
creates a temporary XML file which it then tries to parse. I was able to resolve
this by switching from tempfile.TemporaryFile to tempfile.mkstemp and adding a
1-second sleep, but that isn't very elegant. Couldn't you use a StringIO handle
instead?
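For illustration, the kind of in-memory approach I mean (a minimal
sketch, not the actual Galaxy code):

from StringIO import StringIO          # io.StringIO on Python 3
from xml.etree import ElementTree

# Build the generated tool XML in memory and parse it directly, rather
# than writing it to a temporary file and re-reading it -- that way no
# file ever has to appear on the (laggy) CIFS mount.
xml_text = '<tool id="example" name="Example"><command>echo hi</command></tool>'
tree = ElementTree.parse(StringIO(xml_text))
print(tree.getroot().get("id"))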
Later during startup there were two errors with a similar cause: Galaxy creates
a temporary folder and then immediately tries to write a tarball or zip file
into it. Again, adding a 1-second sleep after creating the directory, before
using it, seems to work. See lib/galaxy/web/controllers/dataset.py
After that Galaxy started, but it still gives problems, like the issue reported
here, which Galaxy handled badly (see patch):
http://lists.bx.psu.edu/pipermail/galaxy-dev/2011-July/006213.html
Here again, inserting a one-second sleep between writing the cluster script
file and setting its permissions made it work.
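Rather than these fixed sleeps, a small polling helper might be a more
robust workaround for the lag (a sketch of what I have in mind, not a
tested patch):

import os
import time

def wait_for_path(path, timeout=5.0, interval=0.1):
    # Poll until `path` becomes visible on the mount, instead of a fixed 1s sleep.
    deadline = time.time() + timeout
    while not os.path.exists(path):
        if time.time() > deadline:
            raise IOError("Timed out waiting for %s to appear" % path)
        time.sleep(interval)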
If those are the only issues, they can be dealt with. But are there likely to be
many more problems of this nature later on? That is my worry.
How are most people setting up mapped drives for Galaxy with a cluster?
Thanks,
Peter
job runner issue
by shashi shekhar
Hi,
In my Galaxy instance, every job I submit goes into the queued state and
stays there.
If I restart the server, the previously submitted jobs change to the running
state, but newly submitted jobs again go into the queued state.
I am at a loss to understand this behaviour of Galaxy and unable to debug
it. The job submission uses a customized runner.
Why do jobs go into the queued state when all the worker threads are free?
Does the is_valid attribute of the galaxy_session table affect the job state?
And where else in the tables is the queued state stored? I can only see the
state attribute of the job table storing it.
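For reference, the job state can also be inspected directly in the
database; a minimal sketch (it assumes the default SQLite database at
database/universe.sqlite, so adjust the path for your setup):

import sqlite3

# List the jobs that are stuck in the queued state, straight from the job table.
conn = sqlite3.connect("database/universe.sqlite")
for job_id, state, update_time in conn.execute(
        "SELECT id, state, update_time FROM job WHERE state = 'queued' ORDER BY id"):
    print(job_id, state, update_time)
conn.close()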
The server log points to the error here:
galaxy.jobs ERROR 2011-07-29 11:01:28,098 failure running job 2243
Traceback (most recent call last):
File "/home/gwadmin/galaxy-central/lib/galaxy/jobs/__init__.py", line 202,
in __monitor_step
self.dispatcher.put( JobWrapper( job, self ) )
File "/home/gwadmin/galaxy-central/lib/galaxy/jobs/__init__.py", line 856,
in put
self.job_runners[runner_name].put( job_wrapper )
File "/home/gwadmin/galaxy-central/lib/galaxy/jobs/runners/gw.py", line
375, in put
job_wrapper.change_state( model.Job.states.QUEUED )
File "/home/gwadmin/galaxy-central/lib/galaxy/jobs/__init__.py", line 437,
in change_state
self.sa_session.flush()
Regards
Karuna
Could anyone tell me why the result file in directory /galaxy-dist/database/files/000 does not return to the web page
by liyanhui607
Dear Sir,
We have a program written in Perl. It runs normally in a Linux environment and saves its result to a file.
After adding it to the Galaxy system, we found that it produces the correct result file in the directory "/var/we/galaxy-dist/database/files/000", but the result file is not returned to the web page, where the state stays at "Job is currently running".
Could anyone tell me what is wrong and why it does not work?
The bagging-SVM.xml and bagging-SVM.pl are attached.
Thank you very much!
Best Wishes!
Yan-Hui Li
Galaxy: self-cleaning on file_path
by Sanka, Ravi
Hello,
We have recently put a local installation of Galaxy on our network and connected it to a MySQL database. To make more space available for uploaded and created files, we also changed the values of file_path and new_file_path in the config file. file_path was changed to a larger, communal location, and new_file_path to a directory called "tmp" in that same communal location.
This location, however, undergoes self-cleaning: any file that reaches 7 days old is deleted, as is any directory containing no files younger than 7 days. This would include any datasets uploaded to or created by Galaxy.
Can the Galaxy framework and schema handle this, or will it cause errors?
-----------------------------------------------
Ravi Sanka
ICS - Bioinformatics Engineer
J. Craig Venter Institute
301-795-7743
-----------------------------------------------
database problem
by shashi shekhar
Hi all,
I want to delete some data from history_dataset_association. Will it
affect any other tables? I want to delete the records from the last 3 days
manually with a SQLite query. How many tables will be affected by this,
and which ones? And how can I delete the records?
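Before deleting anything, it may help to see which tables declare a
foreign key pointing at history_dataset_association. A minimal sketch,
assuming the default SQLite database at database/universe.sqlite
(adjust the path, and back the file up first):

import sqlite3

conn = sqlite3.connect("database/universe.sqlite")
cur = conn.cursor()

cur.execute("SELECT name FROM sqlite_master WHERE type = 'table'")
tables = [row[0] for row in cur.fetchall()]

for table in tables:
    # Each row is (id, seq, referenced_table, from_column, to_column, ...)
    for fk in cur.execute("PRAGMA foreign_key_list(%s)" % table).fetchall():
        if fk[2] == "history_dataset_association":
            print("%s.%s references history_dataset_association.%s"
                  % (table, fk[3], fk[4]))

conn.close()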
Regards
shashi shekhar
Cluster setup - shared temporary directory
by Peter Cock
Hi all,
I'm reading http://wiki.g2.bx.psu.edu/Admin/Config/Performance/Cluster
Could someone expand a little on this section please:
> Create a shared temporary directory
>
> Some tools make use of temporary files created on the server,
> but accessed on the nodes. For this, you'll need to make a
> directory (galaxy_dist/database/tmp by default) ...
I presume this is talking about the universe_wsgi.ini setting
new_file_path = database/tmp (if so, could the wiki say so explicitly)?
I would like to know more about this from a tool author's point
of view. Could you give at least one example of a tool that uses
this temporary folder? As a tool author I am unclear what its
purpose is (and it would be a shock if I accidentally used this
mapped folder instead of the local temp drive of a node).
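For concreteness, this is the kind of distinction I mean in a tool
wrapper; the names below are purely illustrative, not an actual
Galaxy API:

import os
import tempfile

# Node-local scratch: fast and private to the node; fine for data that
# only this process needs (honours TMPDIR on the node).
local_scratch = tempfile.mkdtemp(prefix="mytool_")

# Shared scratch: needed only if a file is written on the server but
# read on a node (or vice versa). SHARED_TMP here is a hypothetical
# stand-in for whatever Galaxy's new_file_path points at.
shared_tmp = os.environ.get("SHARED_TMP", "database/tmp")
fd, shared_file = tempfile.mkstemp(dir=shared_tmp, prefix="mytool_")
os.close(fd)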
Thanks,
Peter
Re: [galaxy-dev] galaxy-dev Digest, Vol 61, Issue 10 - Handling runner-specific (SLURM) DRMAA settings (Roman Valls)
by Mariusz Mamoński
> The general config file allows us to set a fixed project:
>
> default_cluster_job_runner = drmaa://-A a2010002 -p core
>
> And even set per-tool job settings. But we would like each user to have
> the ability to change those settings.
>
>
> What is the least intrusive way to set per-user native (site-specific)
> job manager settings ?
You may try using a user's local DRMAA configuration file:
~/.slurm_drmaa.conf
Quoting the slurm-drmaa documentation: "If multiple configuration
sources are present then all configurations are merged with values
from user-defined files taking precedence (in following order:
$SLURM_DRMAA_CONF, ~/.slurm_drmaa.conf, /etc/slurm_drmaa.conf)"
There you can put any user-specific settings, e.g.:
job_categories: {
default: "-A a2010002 -p core",
}
Cheers,
--
Mariusz
Database schema migration
by michael burrell (TSL)
Good afternoon
I am having some trouble updating our local Galaxy; in summary, the database migration seems to fail. I have included the errors below and would really appreciate any assistance that can be offered.
galaxy@jic55119:~/software/galaxy-ceneral$ sh manage_db.sh upgrade
57 -> 58...
Migration script to create table for exporting histories to archives.
(ProgrammingError) there is no unique constraint matching given keys for referenced table "job"
'\nCREATE TABLE job_export_history_archive (\n\tid SERIAL NOT NULL, \n\tjob_id INTEGER, \n\thistory_id INTEGER, \n\tdataset_id INTEGER, \n\tcompressed BOOLEAN, \n\thistory_attrs_filename TEXT, \n\tdatasets_attrs_filename TEXT, \n\tjobs_attrs_filename TEXT, \n\tPRIMARY KEY (id), \n\t FOREIGN KEY(job_id) REFERENCES job (id), \n\t FOREIGN KEY(history_id) REFERENCES history (id), \n\t FOREIGN KEY(dataset_id) REFERENCES dataset (id)\n)\n\n' {}
done
58 -> 59...
[... intermediate steps removed, no errors reported ...]
78 -> 79...
Migration script to add the job_to_input_library_dataset table.
Creating job_to_input_library_dataset table failed: (ProgrammingError) there is no unique constraint matching given keys for referenced table "job"
'\nCREATE TABLE job_to_input_library_dataset (\n\tid SERIAL NOT NULL, \n\tjob_id INTEGER, \n\tldda_id INTEGER, \n\tname VARCHAR(255), \n\tPRIMARY KEY (id), \n\t FOREIGN KEY(job_id) REFERENCES job (id), \n\t FOREIGN KEY(ldda_id) REFERENCES library_dataset_dataset_association (id)\n)\n\n' {}
Done
Further information
galaxy@jic55119:~/software/galaxy-ceneral$ hg tip
changeset: 5793:f2638528e904
tag: tip
user: jeremy goecks <jeremy.goecks(a)emory.edu>
date: Thu Jul 14 22:54:43 2011 +0200
summary: Fix bug in trackster filtering of GFF datasets.
galaxy@jic55119:~/software/galaxy-ceneral$ python -V
Python 2.6.6
galaxy@jic55119:~/software/galaxy-ceneral$ psql --version
psql (PostgreSQL) 8.4.8
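The errors suggest the existing job table in this database is missing
its primary key (or an equivalent unique constraint), so the new tables
cannot add foreign keys that reference it. A quick way to check (a
sketch, assuming the database_connection value from universe_wsgi.ini
and a reasonably recent SQLAlchemy):

from sqlalchemy import create_engine, inspect

# The connection string is illustrative -- substitute the
# database_connection setting from universe_wsgi.ini.
engine = create_engine("postgresql://galaxy:password@localhost/galaxy")
insp = inspect(engine)

# An empty 'constrained_columns' list would mean the job table has no
# primary key, which matches the "no unique constraint matching given
# keys" errors above.
print(insp.get_pk_constraint("job"))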
Python bug?
by David Hoover
We've run into an error with the SAM to BAM converter.
An error occurred running this job: Samtools Version: 0.1.15 (r949:203)
Error extracting alignments from (/gs1/users/galaxy/pro/database/files/000/dataset_285.dat), requested number of bytes is more than a Python string can hold
We are running 32-bit Python. Would this be the problem, or do we need a newer version of samtools?
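For what it is worth, that message typically appears when a large file
is read into a single string, which a 32-bit Python cannot allocate; a
chunked copy avoids it. A sketch of the general idea (purely
illustrative, not Galaxy's actual converter code):

import shutil

def copy_in_chunks(src_path, dst_path, chunk_size=1024 * 1024):
    # Copy a large file in fixed-size chunks instead of slurping it into
    # one string, which is what overflows on 32-bit Python.
    with open(src_path, "rb") as src:
        with open(dst_path, "wb") as dst:
            shutil.copyfileobj(src, dst, chunk_size)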
David Hoover
Helix Systems Staff
http://helix.nih.gov
Error after migrating to latest changeset
by michael burrell (TSL)
Good Morning,
I am hitting a big problem with our production server after upgrading it to the latest changeset (it was quite out of date). I have attached quite a long paster output to try to show what's going on.
Our Galaxy instance no longer seems to run jobs at all; they are not allocated a job runner, and the following appears in the paster log.
I look forward to your assistance.
Thanks
Michael.
======
galaxy.datatypes.registry DEBUG 2011-07-12 11:46:50,433 Loaded indexer: INDEXER_Interval_0
galaxy.jobs.runners.local INFO 2011-07-12 11:46:50,436 starting workers
galaxy.jobs.runners.local DEBUG 2011-07-12 11:46:50,437 5 workers ready
galaxy.jobs DEBUG 2011-07-12 11:46:50,438 Loaded job runner: galaxy.jobs.runners.local:LocalJobRunner
galaxy.jobs.runners.drmaa DEBUG 2011-07-12 11:46:50,483 10 workers ready
galaxy.jobs DEBUG 2011-07-12 11:46:50,483 Loaded job runner: galaxy.jobs.runners.drmaa:DRMAAJobRunner
galaxy.jobs INFO 2011-07-12 11:46:50,484 job manager started
galaxy.jobs DEBUG 2011-07-12 11:46:50,861 no runner: 4400 is still in new state, adding to the jobs queue
galaxy.jobs DEBUG 2011-07-12 11:46:50,862 no runner: 4401 is still in new state, adding to the jobs queue
galaxy.jobs DEBUG 2011-07-12 11:46:50,862 no runner: 4403 is still in new state, adding to the jobs queue
galaxy.jobs DEBUG 2011-07-12 11:46:50,862 no runner: 4404 is still in new state, adding to the jobs queue
galaxy.jobs INFO 2011-07-12 11:46:50,869 job stopper started
galaxy.sample_tracking.external_service_types DEBUG 2011-07-12 11:46:50,872 Loaded external_service_type: Simple unknown sequencer 1.0.0
galaxy.sample_tracking.external_service_types DEBUG 2011-07-12 11:46:50,875 Loaded external_service_type: Applied Biosystems SOLiD 1.0.0
galaxy.web.framework.base DEBUG 2011-07-12 11:46:50,895 Enabling 'admin' controller, class: AdminGalaxy
galaxy.web.framework.base DEBUG 2011-07-12 11:46:50,898 Enabling 'async' controller, class: ASync
galaxy.web.framework.base DEBUG 2011-07-12 11:46:50,915 Enabling 'dataset' controller, class: DatasetInterface
galaxy.web.framework.base DEBUG 2011-07-12 11:46:50,917 Enabling 'error' controller, class: Error
galaxy.web.framework.base DEBUG 2011-07-12 11:46:50,920 Enabling 'forms' controller, class: Forms
galaxy.web.framework.base DEBUG 2011-07-12 11:46:50,924 Enabling 'history' controller, class: HistoryController
galaxy.web.framework.base DEBUG 2011-07-12 11:46:50,942 Enabling 'library' controller, class: Library
galaxy.web.framework.base DEBUG 2011-07-12 11:46:50,946 Enabling 'library_admin' controller, class: LibraryAdmin
galaxy.web.framework.base DEBUG 2011-07-12 11:46:50,947 Enabling 'library_common' controller, class: LibraryCommon
galaxy.web.framework.base DEBUG 2011-07-12 11:46:50,949 Enabling 'mobile' controller, class: Mobile
galaxy.web.framework.base DEBUG 2011-07-12 11:46:50,952 Enabling 'page' controller, class: PageController
galaxy.web.framework.base DEBUG 2011-07-12 11:46:50,958 Enabling 'requests' controller, class: Requests
galaxy.web.framework.base DEBUG 2011-07-12 11:46:50,996 Enabling 'requests_admin' controller, class: RequestsAdmin
galaxy.web.framework.base DEBUG 2011-07-12 11:46:50,997 Enabling 'requests_common' controller, class: RequestsCommon
galaxy.web.framework.base DEBUG 2011-07-12 11:46:51,000 Enabling 'root' controller, class: RootController
galaxy.web.framework.base DEBUG 2011-07-12 11:46:51,002 Enabling 'tag' controller, class: TagsController
galaxy.web.framework.base DEBUG 2011-07-12 11:46:51,004 Enabling 'tool_runner' controller, class: ToolRunner
galaxy.web.framework.base DEBUG 2011-07-12 11:46:51,013 Enabling 'tracks' controller, class: TracksController
galaxy.web.framework.base DEBUG 2011-07-12 11:46:51,016 Enabling 'ucsc_proxy' controller, class: UCSCProxy
galaxy.web.framework.base DEBUG 2011-07-12 11:46:51,019 Enabling 'user' controller, class: User
galaxy.web.framework.base DEBUG 2011-07-12 11:46:51,022 Enabling 'visualization' controller, class: VisualizationController
galaxy.web.framework.base DEBUG 2011-07-12 11:46:51,107 Enabling 'workflow' controller, class: WorkflowController
galaxy.web.framework.base DEBUG 2011-07-12 11:46:51,110 Enabling 'external_service' controller, class: ExternalService
galaxy.web.framework.base DEBUG 2011-07-12 11:46:51,112 Enabling 'external_services' controller, class: ExternalServiceController
galaxy.web.framework.base DEBUG 2011-07-12 11:46:51,115 Enabling 'request_type' controller, class: RequestType
galaxy.web.buildapp DEBUG 2011-07-12 11:46:51,146 Enabling 'httpexceptions' middleware
galaxy.web.buildapp DEBUG 2011-07-12 11:46:51,148 Enabling 'recursive' middleware
galaxy.web.buildapp DEBUG 2011-07-12 11:46:51,169 Enabling 'error' middleware
galaxy.web.buildapp DEBUG 2011-07-12 11:46:51,173 Enabling 'trans logger' middleware
galaxy.web.buildapp DEBUG 2011-07-12 11:46:51,173 Enabling 'config' middleware
galaxy.web.buildapp DEBUG 2011-07-12 11:46:51,174 Enabling 'x-forwarded-host' middleware
Starting server in PID 3738.
serving on http://127.0.0.1:8080
sqlalchemy.pool.QueuePool.0x...b590 WARNING 2011-07-12 11:47:00,942 Error closing cursor: current transaction is aborted, commands ignored until end of transaction block
galaxy.jobs ERROR 2011-07-12 11:47:00,942 failure running job 4400
Traceback (most recent call last):
File "/home/home/galaxy/software/galaxy-ceneral/lib/galaxy/jobs/__init__.py", line 187, in __monitor_step
job_state = self.__check_if_ready_to_run( job )
File "/home/home/galaxy/software/galaxy-ceneral/lib/galaxy/jobs/__init__.py", line 232, in __check_if_ready_to_run
for dataset_assoc in job.input_datasets + job.input_library_datasets:
File "/home/home/galaxy/software/galaxy-ceneral/eggs/SQLAlchemy-0.5.6_dev_r6498-py2.6.egg/sqlalchemy/orm/attributes.py", line 158, in __get__
return self.impl.get(instance_state(instance), instance_dict(instance))
File "/home/home/galaxy/software/galaxy-ceneral/eggs/SQLAlchemy-0.5.6_dev_r6498-py2.6.egg/sqlalchemy/orm/attributes.py", line 377, in get
value = callable_()
File "/home/home/galaxy/software/galaxy-ceneral/eggs/SQLAlchemy-0.5.6_dev_r6498-py2.6.egg/sqlalchemy/orm/strategies.py", line 587, in __call__
result = q.all()
File "/home/home/galaxy/software/galaxy-ceneral/eggs/SQLAlchemy-0.5.6_dev_r6498-py2.6.egg/sqlalchemy/orm/query.py", line 1267, in all
return list(self)
File "/home/home/galaxy/software/galaxy-ceneral/eggs/SQLAlchemy-0.5.6_dev_r6498-py2.6.egg/sqlalchemy/orm/query.py", line 1361, in __iter__
return self._execute_and_instances(context)
File "/home/home/galaxy/software/galaxy-ceneral/eggs/SQLAlchemy-0.5.6_dev_r6498-py2.6.egg/sqlalchemy/orm/query.py", line 1364, in _execute_and_instances
result = self.session.execute(querycontext.statement, params=self._params, mapper=self._mapper_zero_or_none())
File "/home/home/galaxy/software/galaxy-ceneral/eggs/SQLAlchemy-0.5.6_dev_r6498-py2.6.egg/sqlalchemy/orm/session.py", line 755, in execute
clause, params or {})
File "/home/home/galaxy/software/galaxy-ceneral/eggs/SQLAlchemy-0.5.6_dev_r6498-py2.6.egg/sqlalchemy/engine/base.py", line 824, in execute
return Connection.executors[c](self, object, multiparams, params)
File "/home/home/galaxy/software/galaxy-ceneral/eggs/SQLAlchemy-0.5.6_dev_r6498-py2.6.egg/sqlalchemy/engine/base.py", line 874, in _execute_clauseelement
return self.__execute_context(context)
File "/home/home/galaxy/software/galaxy-ceneral/eggs/SQLAlchemy-0.5.6_dev_r6498-py2.6.egg/sqlalchemy/engine/base.py", line 896, in __execute_context
self._cursor_execute(context.cursor, context.statement, context.parameters[0], context=context)
File "/home/home/galaxy/software/galaxy-ceneral/eggs/SQLAlchemy-0.5.6_dev_r6498-py2.6.egg/sqlalchemy/engine/base.py", line 950, in _cursor_execute
self._handle_dbapi_exception(e, statement, parameters, cursor, context)
File "/home/home/galaxy/software/galaxy-ceneral/eggs/SQLAlchemy-0.5.6_dev_r6498-py2.6.egg/sqlalchemy/engine/base.py", line 931, in _handle_dbapi_exception
raise exc.DBAPIError.instance(statement, parameters, e, connection_invalidated=is_disconnect)
ProgrammingError: (ProgrammingError) relation "job_to_input_library_dataset" does not exist
'SELECT job_to_input_library_dataset.id AS job_to_input_library_dataset_id, job_to_input_library_dataset.job_id AS job_to_input_library_dataset_job_id, job_to_input_library_dataset.ldda_id AS job_to_input_library_dataset_ldda_id, job_to_input_library_dataset.name AS job_to_input_library_dataset_name, library_dataset_dataset_association_1.id AS library_dataset_dataset_association_1_id, library_dataset_dataset_association_1.library_dataset_id AS library_dataset_dataset_association_1_library_dataset_id, library_dataset_dataset_association_1.dataset_id AS library_dataset_dataset_association_1_dataset_id, library_dataset_dataset_association_1.create_time AS library_dataset_dataset_association_1_create_time, library_dataset_dataset_association_1.update_time AS library_dataset_dataset_association_1_update_time, library_dataset_dataset_association_1.state AS library_dataset_dataset_association_1_state, library_dataset_dataset_association_1.copied_from_history_dataset_association_id AS library_dataset_dataset_association_1_copied_from_history_1, library_dataset_dataset_association_1.copied_from_library_dataset_dataset_association_id AS library_dataset_dataset_association_1_copied_from_library_2, library_dataset_dataset_association_1.name AS library_dataset_dataset_association_1_name, library_dataset_dataset_association_1.info AS library_dataset_dataset_association_1_info, library_dataset_dataset_association_1.blurb AS library_dataset_dataset_association_1_blurb, library_dataset_dataset_association_1.peek AS library_dataset_dataset_association_1_peek, library_dataset_dataset_association_1.extension AS library_dataset_dataset_association_1_extension, library_dataset_dataset_association_1.metadata AS library_dataset_dataset_association_1_metadata, library_dataset_dataset_association_1.parent_id AS library_dataset_dataset_association_1_parent_id, library_dataset_dataset_association_1.designation AS library_dataset_dataset_association_1_designation, library_dataset_dataset_association_1.deleted AS library_dataset_dataset_association_1_deleted, library_dataset_dataset_association_1.visible AS library_dataset_dataset_association_1_visible, library_dataset_dataset_association_1.user_id AS library_dataset_dataset_association_1_user_id, library_dataset_dataset_association_1.message AS library_dataset_dataset_association_1_message \nFROM job_to_input_library_dataset LEFT OUTER JOIN library_dataset_dataset_association AS library_dataset_dataset_association_1 ON library_dataset_dataset_association_1.id = job_to_input_library_dataset.ldda_id \nWHERE %(param_1)s = job_to_input_library_dataset.job_id' {'param_1': 4400}
[identical 'relation "job_to_input_library_dataset" does not exist' tracebacks and SQL for jobs 4401, 4403 and 4404 omitted]
149.155.218.28 - - [12/Jul/2011:11:47:31 +0100] "POST /root/history_item_updates HTTP/1.1" 200 - "http://galaxy.tsl.ac.uk/history" "Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_8; en-us) AppleWebKit/533.21.1 (KHTML, like Gecko) Version/5.0.5 Safari/533.21.1"
149.155.221.116 - - [12/Jul/2011:11:47:33 +0100] "POST /root/history_item_updates HTTP/1.1" 200 - "http://galaxy.tsl.ac.uk/history" "Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.6; en-GB; rv:1.9.2.19) Gecko/20110707 Firefox/3.6.19"
149.155.221.194 - - [12/Jul/2011:11:47:33 +0100] "GET / HTTP/1.1" 200 - "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.6; rv:5.0) Gecko/20100101 Firefox/5.0"
149.155.221.194 - - [12/Jul/2011:11:47:33 +0100] "GET /history HTTP/1.1" 200 - "http://galaxy.tsl.ac.uk/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.6; rv:5.0) Gecko/20100101 Firefox/5.0"
149.155.221.194 - - [12/Jul/2011:11:47:33 +0100] "GET /root/tool_menu HTTP/1.1" 200 - "http://galaxy.tsl.ac.uk/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.6; rv:5.0) Gecko/20100101 Firefox/5.0"
149.155.218.28 - - [12/Jul/2011:11:47:35 +0100] "POST /root/history_item_updates HTTP/1.1" 200 - "http://galaxy.tsl.ac.uk/history" "Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_8; en-us) AppleWebKit/533.21.1 (KHTML, like Gecko) Version/5.0.5 Safari/533.21.1"
149.155.221.116 - - [12/Jul/2011:11:47:37 +0100] "POST /root/history_item_updates HTTP/1.1" 200 - "http://galaxy.tsl.ac.uk/history" "Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.6; en-GB; rv:1.9.2.19) Gecko/20110707 Firefox/3.6.19"
149.155.221.194 - - [12/Jul/2011:11:47:36 +0100] "GET /tool_runner/rerun?id=6014 HTTP/1.1" 200 - "http://galaxy.tsl.ac.uk/history" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.6; rv:5.0) Gecko/20100101 Firefox/5.0"
149.155.221.194 - - [12/Jul/2011:11:47:38 +0100] "POST /root/history_item_updates HTTP/1.1" 200 - "http://galaxy.tsl.ac.uk/history" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.6; rv:5.0) Gecko/20100101 Firefox/5.0"
149.155.218.28 - - [12/Jul/2011:11:47:39 +0100] "POST /root/history_item_updates HTTP/1.1" 200 - "http://galaxy.tsl.ac.uk/history" "Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_8; en-us) AppleWebKit/533.21.1 (KHTML, like Gecko) Version/5.0.5 Safari/533.21.1"
149.155.221.194 - - [12/Jul/2011:11:47:39 +0100] "POST /tool_runner/index HTTP/1.1" 200 - "http://galaxy.tsl.ac.uk/tool_runner/rerun?id=6014" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.6; rv:5.0) Gecko/20100101 Firefox/5.0"
[identical traceback and SQL for job 4405 omitted]
149.155.221.194 - - [12/Jul/2011:11:47:40 +0100] "GET /user/get_most_recently_used_tool_async HTTP/1.1" 200 - "http://galaxy.tsl.ac.uk/root/tool_menu" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.6; rv:5.0) Gecko/20100101 Firefox/5.0"
149.155.221.194 - - [12/Jul/2011:11:47:40 +0100] "GET /history HTTP/1.1" 200 - "http://galaxy.tsl.ac.uk/tool_runner/index" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.6; rv:5.0) Gecko/20100101 Firefox/5.0"
149.155.221.116 - - [12/Jul/2011:11:47:41 +0100] "POST /root/history_item_updates HTTP/1.1" 200 - "http://galaxy.tsl.ac.uk/history" "Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.6; en-GB; rv:1.9.2.19) Gecko/20110707 Firefox/3.6.19"
149.155.221.194 - - [12/Jul/2011:11:47:42 +0100] "GET /root/delete_async?id=6014 HTTP/1.1" 200 - "http://galaxy.tsl.ac.uk/history" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.6; rv:5.0) Gecko/20100101 Firefox/5.0"
149.155.218.28 - - [12/Jul/2011:11:47:43 +0100] "POST /root/history_item_updates HTTP/1.1" 200 - "http://galaxy.tsl.ac.uk/history" "Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_8; en-us) AppleWebKit/533.21.1 (KHTML, like Gecko) Version/5.0.5 Safari/533.21.1"
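All of these failures point at the job_to_input_library_dataset table
simply not existing, i.e. the 78 -> 79 migration did not actually
create it. A quick way to confirm (a sketch, assuming the
database_connection value from universe_wsgi.ini and a reasonably
recent SQLAlchemy):

from sqlalchemy import create_engine, inspect

# The connection string is illustrative -- substitute the
# database_connection setting from universe_wsgi.ini.
engine = create_engine("postgresql://galaxy:password@localhost/galaxy")
insp = inspect(engine)

# If this prints False, the migration that should have created the
# table failed and will need to be rerun once the underlying "job"
# table constraint problem is fixed.
print("job_to_input_library_dataset" in insp.get_table_names())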