Hi Dannon,

I am still just trying to put together a proof of concept for the workflow_execute script and am running into a little trouble. I have traced my issue to workflows.py in the API. From debugging, I am confident that the datasets are correctly specified, since specifying a bad dataset makes the HTTP workflow interface return an error. However, the input dataset that I specify is not being forwarded to the workflow step.

When I execute the workflow from the web UI, I get a command like this for the first step:

    executing: python /mnt/galaxy/galaxy_dist/tools/fastq/fastq_groomer.py '/mnt/galaxy/galaxy_dist/database/files/000/dataset_48.dat' 'illumina' '/mnt/galaxy/galaxy_dist/database/job_working_directory/169/galaxy_dataset_169.dat' 'sanger' 'ascii' 'summarize_input'

However, when I run the workflow from the command line:

    python /mnt/galaxy/galaxy_dist/scripts/api/workflow_execute.py <api-key> <url>/api/workflows 38247d270c7cb1bb 'hist_id=38247d270c7cb1bb' '1=hda=30fc17ce78176bfb'

this is the command I get for the first step:

    executing: python /mnt/galaxy/galaxy_dist/tools/fastq/fastq_groomer.py 'None' 'illumina' '/mnt/galaxy/galaxy_dist/database/job_working_directory/164/galaxy_dataset_164.dat' 'sanger' 'ascii' 'summarize_input'

So the dataset passes all of the asserts, but it is not passed to the first step of the workflow; the input dataset argument arrives as 'None'. The API key and workflow ID must be working, because the request does not fail right away, and the history ID must be working, because the failed steps are populated in that history. The dataset itself does not seem to be throwing any errors, so I am finding myself very confused! Is there something obvious that you see?

Take care,
Darren
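P.S. For reference, here is a minimal sketch of the request I believe workflow_execute.py is building (the run_workflow helper and the exact field names are just my reading of scripts/api/workflow_execute.py, not anything verified against the API):

    import json
    import urllib2

    def run_workflow(api_key, base_url, workflow_id, history, step_id, src, dataset_id):
        # Build the POST body. 'ds_map' maps a workflow input step id to a
        # dataset source ('hda' for a history dataset) and its encoded id.
        payload = {
            'workflow_id': workflow_id,
            'history': history,  # e.g. 'hist_id=38247d270c7cb1bb'
            'ds_map': {step_id: {'src': src, 'id': dataset_id}},
        }
        url = '%s/api/workflows?key=%s' % (base_url, api_key)
        request = urllib2.Request(url, json.dumps(payload),
                                  {'Content-Type': 'application/json'})
        return json.loads(urllib2.urlopen(request).read())

If that reading is right, one thing I plan to double-check is whether the '1' in '1=hda=30fc17ce78176bfb' actually matches the id of the workflow's input step, since a mismatched ds_map key seems like it could fail silently.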
On Tue, Mar 15, 2011 at 7:50 PM, Darren Brown <brown@centerspace.net> wrote:
Hi Dannon,
Thanks very much for answering these questions. I think this is all I need to get it going. I'll report back soon.
Darren
On Mar 15, 2011 6:06 PM, "Dannon Baker" <dannonbaker@me.com> wrote: