sv-callers installation on a local machine and input files #47
Hi,
First, have you tried to run it locally?
The workflow takes care of the dependencies including SV callers etc. via (bio)conda.
In principle, yes (see here) but the unit/CI tests run with the aforementioned (older) software versions (see #35).
Currently, there is no support for CRAM (sorry, we've been working with BAMs only).
You can configure these and other things here: sv-callers/snakemake/analysis.yaml (line 18 in 32dca9e).
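For reference, a minimal sketch of what such a setting can look like. Only the `echo_run` key is confirmed by this thread (it is passed via `-C echo_run=0` later on); the file content below is a made-up stand-in, not the real `analysis.yaml`:

```shell
# Write a stand-in config (hypothetical content; only echo_run is a key
# confirmed elsewhere in this thread) and look the setting up.
cat > analysis.yaml <<'EOF'
echo_run: 1   # 1 = only print the callers' commands, 0 = really execute them
EOF
grep -n 'echo_run' analysis.yaml
```

Settings like this can also be overridden on the Snakemake command line with `-C key=value` instead of editing the file.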
Cheers,
Thank you for the reply. I will try to install sv-callers on my local machine.
But I noticed the following lines in the output VCF files. Also, how do we execute the SV tools such as Manta, Delly, LUMPY and GRIDSS via sv-callers in order to get structural variant genotype results (germline.vcf) on our local machine? Could you please point me to the command documentation? `all.vcf` `delly.vcf` Thanks.
As per your suggestion we tried the command `snakemake -C echo_run=0`. I have some doubts:
On this system we had already installed Delly, so I think sv-callers is able to invoke only Delly; the other tools are not executing because they are not installed.
Please read the README carefully. You must not install the callers and processing tools yourself; it's taken care of by the workflow if you add the missing …
Right now we are working on a local CentOS 7 machine, so we tried the below command and it has been running for more than 45 minutes but still shows the same message. Could you please confirm whether this command is correct and, if so, how long it takes to run.
Thanks for your support.
The command looks fine. Yeah, `conda install` used to take a few minutes but these days it's very slow indeed - something to consider for the next release (#49) - but it needs to be done just once before the actual workflow run(s). What's your conda version?
I am running as the root user. The conda version is …
And the conda package install is still at the same stage.
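For reporting the conda version, a guarded one-liner can help (the `command -v` check is just defensive shell so it degrades gracefully on machines without conda; it is not an sv-callers requirement):

```shell
# Print the conda version if conda is on PATH; otherwise say so instead of
# failing with "command not found".
if command -v conda >/dev/null 2>&1; then
  conda --version
else
  echo "conda not found"
fi
```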
Yes, that's clear, but it's not necessary (and could be dangerous).
Update to the latest version via …
Sorry, I can't help you with that (e.g. waiting, Internet bandwidth, etc.)
The command ran, but I wonder why I can't find any SV call information in the result data. Pasting the `all.vcf` information. (Sorry, I could not drop the all.vcf file.)
Thank you so much.
That's correct. The sample data are meant for CI testing only (the T3/N3.bam files are identical and refer to a small part of the genome). The …
You could use the VCF files of each caller in the corresponding directory or the aforementioned (merged) VCF.
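To verify that a VCF really contains no calls, counting the non-header lines is enough. The file created below is a minimal hypothetical stand-in for the near-empty CI test output, not actual sv-callers output:

```shell
# Build a header-only VCF (hypothetical minimal content) and count its
# variant records; 0 means no SVs were called.
cat > all.vcf <<'EOF'
##fileformat=VCFv4.2
##source=sv-callers-test
#CHROM	POS	ID	REF	ALT	QUAL	FILTER	INFO
EOF
grep -vc '^#' all.vcf || true
```

`grep -v '^#'` skips the `##` meta lines and the `#CHROM` column header, so only real variant records are counted.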
In principle, there is no limit on the number of samples in …
It depends on your samples and machine. See our paper for example runs (germline and somatic).
The workflow takes care of the parallelization so there is no need to split/merge jobs yourself.
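Assuming a per-sample, per-caller output layout (the directory and file names below are illustrative guesses, not the workflow's exact paths), the individual and merged VCFs can be located like this:

```shell
# Mock a plausible output tree (names are assumptions, not sv-callers' real paths)
mkdir -p out/sample1/manta out/sample1/delly out/sample1/lumpy out/sample1/gridss
for c in manta delly lumpy gridss; do : > "out/sample1/$c/$c.vcf"; done
: > out/sample1/all.vcf   # merged calls
# List every caller's VCF plus the merged one
find out -name '*.vcf' | sort
```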
See …
Yesterday when I tried to fire off the job, the conda packages were installed within a minute. Thanks!
Hi,
I would like to use sv-callers for calling germline SVs from WGS data. I have the following doubts:
Can this tool be installed on a local CentOS 7 machine? And do the SV callers like Manta, Delly, LUMPY and GRIDSS, and the other tools (bcftools, SURVIVOR), have to be installed separately, or are they part of this repository (sub-modules already built into sv-callers)?
The version of Manta is 1.1.0; does the current version of sv-callers support updated versions of Manta or GRIDSS?
Second, can CRAM files be used as input?
The samples are aligned to the GRCh38 reference; does sv-callers provide the excluded regions (.bed file) for this reference genome?
Sorry for all the questions, let me know if there is a better place to ask them, person to email, etc.
Thanks in advance!
Nitha