I'm confused about whether files are uploaded through the Galaxy server or through Apache.
I already set up the Apache proxy explained here: http://wiki.galaxyproject.org/Admin/Config/Apache%20Proxy (Does that affect uploads?)
My goal is to let users upload files as large as possible through HTTP.
So far I have just tested an upload of a 140 MB file, and I got this error after a couple of minutes:
1: dsim-all-chromosome-r1.3.fasta.msg.updated.fasta 0 bytes
An error occurred running this job:
Traceback (most recent call last):
  File "/misc/local/galaxy/galaxy-dist/tools/data_source/upload.py", line 384, in <module>
    __main__()
  File "/misc/local/galaxy/galaxy-dist/tools/data_source/upload.py", line 359, in __main__
    registry.load_datat
I see this information in the bug report:
Traceback (most recent call last):
  File "/misc/local/galaxy/galaxy-dist/tools/data_source/upload.py", line 384, in <module>
    __main__()
  File "/misc/local/galaxy/galaxy-dist/tools/data_source/upload.py", line 359, in __main__
    registry.load_datatypes( root_dir=sys.argv[1], config=sys.argv[2] )
  File "/misc/local/galaxy/galaxy-dist/lib/galaxy/datatypes/registry.py", line 65, in load_datatypes
    tree = galaxy.util.parse_xml( config )
  File "/misc/local/galaxy/galaxy-dist/lib/galaxy/util/__init__.py", line 143, in parse_xml
    tree = ElementTree.parse(fname)
  File "build/bdist.linux-x86_64/egg/elementtree/ElementTree.py", line 859, in parse
  File "build/bdist.linux-x86_64/egg/elementtree/ElementTree.py", line 576, in parse
IOError: [Errno 2] No such file or directory: '/scratch/galaxy/tmptzuSYJ'

Traceback (most recent call last):
  File "./scripts/set_metadata.py", line 118, in <module>
    __main__()
  File "./scripts/set_metadata.py", line 68, in __main__
    datatypes_registry.load_datatypes( root_dir=config_root, config=datatypes_config )
  File "/misc/local/galaxy/galaxy-dist/lib/galaxy/datatypes/registry.py", line 65, in load_datatypes
    tree = galaxy.util.parse_xml( config )
  File "/misc/local/galaxy/galaxy-dist/lib/galaxy/util/__init__.py", line 143, in parse_xml
    tree = ElementTree.parse(fname)
  File "build/bdist.linux-x86_64/egg/elementtree/ElementTree.py", line 859, in parse
  File "build/bdist.linux-x86_64/egg/elementtree/ElementTree.py", line 576, in parse
IOError: [Errno 2] No such file or directory: '/scratch/galaxy/tmptzuSYJ'
Also here is what /scratch/galaxy looks like. It seems like the file it mentions does exist??
ls -lh /scratch/galaxy/
total 134M
-rw-r--r-- 1 galaxy scicomp  20K Nov 13 13:35 tmpfCaiGX
-rw-r--r-- 1 galaxy scicomp  20K Nov 14 13:36 tmpM2Wclq
-rw-r--r-- 1 galaxy scicomp  20K Nov 14 13:35 tmpsIyPur
-rw-r--r-- 1 galaxy scicomp  20K Nov 16 08:48 tmptY5RaM
-rw-r--r-- 1 galaxy scicomp  20K Nov 16 08:54 tmptzuSYJ
-rw-r--r-- 1 galaxy scicomp  20K Nov 15 15:47 tmpu77PEK
-rw------- 1 galaxy scicomp  286 Nov 16 08:58 tmpykoBIq
-rw------- 1 galaxy scicomp 134M Nov 16 08:58 upload_file_data_G3eZEg
Thanks,
Greg
On Nov 16, 2012, at 9:14 AM, greg wrote:
I'm confused about whether files are uploaded through the Galaxy server or through Apache.
I already set up the Apache proxy explained here: http://wiki.galaxyproject.org/Admin/Config/Apache%20Proxy (Does that affect uploads?)
My goal is to let users upload files as large as possible through HTTP.
The file is uploaded "through" Apache, but Apache doesn't do anything special with it other than stream the bits through to Galaxy. With nginx it's possible to have the proxy server actually handle the upload itself.
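If the goal is very large uploads, it is also worth making sure nothing on the Apache side caps request bodies. For example, something like this (the config locations below are only a guess for a typical layout; adjust for your Apache setup):

# LimitRequestBody 0 means no limit; any non-zero value caps the upload size
grep -ri 'LimitRequestBody' /etc/httpd/ /etc/apache2/ 2>/dev/null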
Are you running the upload tool on a cluster?
--nate
On Fri, Nov 16, 2012 at 10:00 AM, Nate Coraor nate@bx.psu.edu wrote:
Are you running the upload tool on a cluster?
--nate
Well, I have Galaxy set up to send its jobs to SGE/qsub, but I haven't verified that it's doing that yet. I wanted to test with a file upload first.
Here are the changes I made to universe_wsgi.ini after copying the sample:
port = 8080
host = 0.0.0.0
database_connection = mysql://user:password@devel10-db/galaxy
database_engine_option_pool_size = 10
database_engine_option_max_overflow = 15
database_engine_option_pool_recycle = 7200
database_engine_option_strategy = threadlocal
file_path = /home/galaxy
new_file_path = /scratch/galaxy
job_working_directory = /home/galaxy/job_working_directory
cluster_files_directory = /home/galaxy/pbs
log_level = INFO
log_events = True
log_actions = True
use_remote_user = True
remote_user_maildomain = mydomain.org
require_login = True
allow_user_creation = True
allow_user_deletion = False
allow_user_impersonation = True
allow_user_dataset_purge = True
new_user_dataset_access_role_default_private = True
job_manager = main
job_handlers = main
default_job_handlers = main
track_jobs_in_database = True
enable_job_recovery = True
set_metadata_externally = True
cleanup_job = always
start_job_runners = drmaa
environment_setup_file = /usr/local/galaxy/job_environment_setup_file
default_cluster_job_runner = drmaa://-V/
cluster_job_queue_workers = 3
#biomart = local:///
#encode_db1 = local:///
#hbvar = local:///
#microbial_import1 = local:///
#ucsc_table_direct1 = local:///
#ucsc_table_direct_archaea1 = local:///
#ucsc_table_direct_test1 = local:///
#upload1 = local:///
-Greg
On Nov 16, 2012, at 10:41 AM, greg wrote:
Well, I have Galaxy set up to send its jobs to SGE/qsub, but I haven't verified that it's doing that yet. I wanted to test with a file upload first.
I'm guessing that it's working; you can verify by having a look in your Galaxy server output/log file. new_file_path will need to be set to something that is accessible on the cluster.
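For example, one quick way to check that new_file_path is visible from the execution hosts is to submit a trivial job that lists it. This is just a sketch; the qsub options and output file name are illustrative and may need adjusting for your SGE setup:

# submit a one-line job that lists the directory from a compute node
echo 'ls -ld /scratch/galaxy' | qsub -cwd -N check_scratch -j y -o check_scratch.out
# once the job finishes, check the output
cat check_scratch.out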
--nate
I guess I'm still confused about new_file_path.
The upload tool uses that directory to store files temporarily until the upload completes, and then moves them? (Will it clean up anything placed in the new_file_path directory?)
Thanks,
Greg
Also, what would I look for in the log file to know whether it's issuing jobs?
I started a new upload, but I don't see anything from the galaxy user when I run qstat.
Thanks again.
Greg
On Nov 16, 2012, at 12:24 PM, greg wrote:
Also, what would I look for in the log file to know whether it's issuing jobs?
I started a new upload, but I don't see anything from the galaxy user when I run qstat.
If you follow the log, you should see messages from galaxy.jobs, and specifically, galaxy.jobs.runners.drmaa.
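For example (the log file name and location depend on how you start Galaxy; run.sh writes paster.log in the galaxy-dist directory when started with --daemon, so treat the path below as a guess):

tail -f /misc/local/galaxy/galaxy-dist/paster.log | grep -E 'galaxy\.jobs|drmaa'
# in another terminal, check the queue as the galaxy user
qstat -u galaxy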
--nate
On Fri, Nov 16, 2012 at 12:28 PM, Nate Coraor nate@bx.psu.edu wrote:
If you follow the log, you should see messages from galaxy.jobs, and specifically, galaxy.jobs.runners.drmaa.
No, I don't see anything like that in the log after I started the upload. Any ideas on how to figure out what might be going wrong there?
-Greg
On Nov 16, 2012, at 12:22 PM, greg wrote:
I guess I'm still confused about new_file_path.
The upload tool uses that directory to store files temporarily until the upload completes, and then moves them? (Will it clean up anything placed in the new_file_path directory?)
Yes, the parameters of the upload tool are written to a temp file in this directory. It is cleaned up upon completion.
A lot of other things use new_file_path, and not all of them clean themselves up, so it's a good idea to use an automated process to clean the directory.
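Something along these lines, run periodically (e.g. from the galaxy user's crontab), would work; it is only a sketch, and the tmp* pattern and seven-day cutoff are assumptions you should tune so that files still in use are never removed:

# remove stale temp files from new_file_path
find /scratch/galaxy -maxdepth 1 -name 'tmp*' -mtime +7 -delete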
--nate