Recently I tried to upload some annotation/reference files into Galaxy. After a couple of failed attempts with the upload tool, I was able to load them by giving the full file path. However, after these tries and retries, I believe the Galaxy database got confused and is now corrupted: existing workflows, run with the same references against datasets that previously processed without any problems, now produce result sets with odd names, and the parameters passed to the tool wrappers reference the wrong paths.

Here is an example of the error I'm encountering. Note the -known:dbsnp argument, which points at a FASTQ file from an unrelated run instead of the dbSNP VCF:

    python /home/svcbioinfo/galaxy/galaxy-reproduce/tools/gatkV2/gatk_wrapper.py \
        --max_jvm_heap "16g" --max_jvm_heap_fraction "1" \
        --stdout "/mnt/ngs/analysis/svcbioinfo/DATA_reproduce/files/024/dataset_24218.dat" \
        -d "-I" "/mnt/ngs/analysis/svcbioinfo/DATA_reproduce/files/024/dataset_24216.dat" "bam" "gatk_input_first" \
        -d "" "/mnt/ngs/analysis/svcbioinfo/DATA_reproduce/files/_metadata_files/001/metadata_1491.dat" "bam_index" "gatk_input_first" \
        -p 'java -jar "/home/svcbioinfo/galaxy/galaxy-reproduce/tool-data/shared/jars/gatk/GenomeAnalysisTKv2.jar" -T "RealignerTargetCreator" --num_threads 4 -o "/mnt/ngs/analysis/svcbioinfo/DATA_reproduce/files/024/dataset_24217.dat" -et "NO_ET" -K "/home/svcbioinfo/galaxy/galaxy-reproduce/tool-data/shared/jars/genomichealth.com.key" -R "/mnt/ngs/analysis/refData/processed/bwa/hg19_decoy_rcrs8001sp.fasta" ' \
        -d "-known:dbsnp,%(file_type)s" "/mnt/ngs/analysis/tchu/Workflow_RWC/SEQUENCER99/140201_SNTEST_0021_TESTFC21/analysis_31/primary/fastq/99_21_3_ACAGTG_R2.fastq" "fastq" "input_dbsnp_0" \
        -d "-known:indels,%(file_type)s" "/mnt/ngs/analysis/svcbioinfo/DATA_reproduce/files/016/dataset_16596.dat" "vcf" "input_indels_0" \
        -p '--pedigreeValidationType "STRICT"' \
        -d "--intervals" "/mnt/ngs/analysis/svcbioinfo/DATA_reproduce/files/022/dataset_22821.dat" "bed" "input_intervals_0" \
        -p '--interval_padding "400"' -p '--interval_set_rule "UNION"' -p '--downsampling_type "NONE"' \
        -p ' --validation_strictness "SILENT" --interval_merging "ALL" ' \
        -p ' --unsafe "LENIENT_VCF_PROCESSING" '

That argument should reference the dbSNP file, which normally has a Galaxy-managed dataset name and a type of "vcf", as in this correct invocation:

    python /home/svcbioinfo/galaxy/galaxy-reproduce/tools/gatkV2/gatk_wrapper.py \
        --max_jvm_heap "16g" --max_jvm_heap_fraction "1" \
        --stdout "/mnt/ngs/analysis/svcbioinfo/DATA_reproduce/files/024/dataset_24314.dat" \
        -d "-I" "/mnt/ngs/analysis/svcbioinfo/DATA_reproduce/files/024/dataset_24312.dat" "bam" "gatk_input_first" \
        -d "" "/mnt/ngs/analysis/svcbioinfo/DATA_reproduce/files/_metadata_files/001/metadata_1492.dat" "bam_index" "gatk_input_first" \
        -p 'java -jar "/home/svcbioinfo/galaxy/galaxy-reproduce/tool-data/shared/jars/gatk/GenomeAnalysisTKv2.jar" -T "RealignerTargetCreator" --num_threads 4 -o "/mnt/ngs/analysis/svcbioinfo/DATA_reproduce/files/024/dataset_24313.dat" -et "NO_ET" -K "/home/svcbioinfo/galaxy/galaxy-reproduce/tool-data/shared/jars/genomichealth.com.key" -R "/mnt/ngs/analysis/refData/processed/bwa/hg19_decoy_rcrs.fasta" ' \
        -d "-known:dbsnp,%(file_type)s" "/mnt/ngs/analysis/svcbioinfo/DATA_reproduce/files/016/dataset_16598.dat" "vcf" "input_dbsnp_0" \
        -d "-known:indels,%(file_type)s" "/mnt/ngs/analysis/svcbioinfo/DATA_reproduce/files/016/dataset_16596.dat" "vcf" "input_indels_0" \
        -p '--pedigreeValidationType "STRICT"' \
        -d "--intervals" "/mnt/ngs/analysis/svcbioinfo/DATA_reproduce/files/016/dataset_16617.dat" "bed" "input_intervals_0" \
        -p '--interval_padding "400"' -p '--interval_set_rule "UNION"' -p '--downsampling_type "NONE"' \
        -p ' --validation_strictness "SILENT" --interval_merging "ALL" ' \
        -p ' --unsafe "LENIENT_VCF_PROCESSING" '

Is there a way to recover from this, or will I have to reinstall the database?

Thanks for your help.

Thuy-Linh Chu