Estimate RAM usage based on input filesize #44
Hi,
I'm trying to find a way to get a rough estimate of how much RAM I'll need to run elprep filter based on the size of the input BAM.
Do you have any way of calculating this, e.g. for when submitting a job to some cloud provider?
Thanks
Matthias
Comments
Hi Matthias,
For elPrep 4, we made a predictor for peak RAM use based on a set of benchmark runs. More specifically, we made such a predictor for WGS data for the elPrep filter mode. This gave us the following equation for predicting RAM use based on input BAM size: Y = 15X + 32. This means elPrep 4 requires about 32 GB base memory plus 15 times the input BAM size (in GB) for the elPrep filter mode, in the case of WGS data. For estimating the memory use for the sfm mode, you would need to look at the BAM size of the largest split file, which can vary between data sets. The numbers would look a bit different for WES data. We would also need to update the predictor for elPrep 5.
Does this help? Would it be useful to create a specific predictor for your use case?
Thanks!
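A minimal shell sketch of that estimate, e.g. for sizing a cloud job. The coefficients are the elPrep 4 WGS filter-mode fit above; the script itself, including the rounding, is illustrative and not part of elPrep:

```
#!/bin/bash
# Sketch: predict peak RAM for elPrep 4 filter mode on WGS data from the
# input BAM size, using the fit above: Y = 15X + 32 (X and Y in GB).
input_bam="$1"
bytes=$(stat -c%s "$input_bam")                   # GNU stat; use `stat -f%z` on macOS/BSD
size_gb=$(( (bytes + 1073741823) / 1073741824 ))  # round up to whole GB
ram_gb=$(( 15 * size_gb + 32 ))
echo "input ~${size_gb} GB -> estimated peak RAM ~${ram_gb} GB"
```

For an 8 GB WGS BAM this prints an estimate of 15 × 8 + 32 = 152 GB, which could then be passed to a scheduler's or cloud provider's memory request.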
Hi Charlotte,
Thanks! I ran the numbers and we're getting rather different results. For an exome of about 8 GB we see RAM usage of about 300 GB on average (3 tests, with 20, 40, and 80 threads). Anecdotally, the more threads we used, the lower the RAM usage was (about a 30 GB difference between 20 and 80 threads). An updated predictor would be most welcome!
Cheers
NB: the command used was

```
elprep filter \
    $1 \
    ${1%-sort.bam}.bam \
    --nr-of-threads 20 \
    --mark-duplicates \
    --mark-optical-duplicates ${1%-sort.bam}_duplicate_metrics.txt \
    --optical-duplicates-pixel-distance 2500 \
    --sorting-order coordinate \
    --haplotypecaller ${1%-sort.bam}.vcf.gz \
    --reference /references/Hsapiens/hg38/seq/hg38.elfasta \
    --target-regions /references/Hsapiens/hg38/coverage/capture_regions/CMGG_WES_analysis_ROI_v2.bed \
    --log-path $PWD --timed
```
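(For context: `$1` is the sorted input BAM passed to the wrapper script as its first argument, and `${1%-sort.bam}` is shell parameter expansion that strips the trailing `-sort.bam` from that name to derive the output file names.)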
Hi Matthias, I have made a preliminary predictor for elPrep 5 based on benchmarks for data samples we have at our lab: Y = 24X + 3. This, however, is quite far from the numbers you saw in your runs. I have a couple of questions:
Thanks a lot! Best,
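(For reference: plugging the ~8 GB exome from above into this predictor gives 24 × 8 + 3 = 195 GB, still roughly 100 GB below the ~300 GB observed.)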
Hi Charlotte,
I'll keep you posted! Matthias
A quick test with BQSR on 80 threads reduces RAM usage by about 20 GB (270 GB total), so you were right about it having an effect on the requirements!
edit: removed off-topic remark
@matthdsm I opened new issues for your two side notes. I hope you have been notified of my answers there. Thanks,
Duly noted!
Hi Charlotte, Are there any updates wrt the RAM usage estimate? Thanks
Hi Matthias, Our last e-mail exchange, about getting access to a data file, was on March 17th. As far as I know, there was never a reply? Thanks!
Right, I lost sight of what had already been done. Let me get back to you! M
Hi Charlotte,
To get back to this: which compression level do you use for your test input data? That might be the reason your formula doesn't work for our data. Since the input BAM is intermediate data, we only use fast compression (e.g. …).
On a related note, what compression level do you use for the output BAMs? I noticed the output BAM is larger than the input, which usually isn't the case when the data is sorted.
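One way to gauge how much compression level alone shifts the X in the predictors above. This sketch uses samtools rather than elPrep (the relevant elPrep option isn't shown in this thread), and the file names are illustrative:

```
# Rewrite the same BAM at fast vs. default compression and compare sizes;
# the on-disk size is the X that feeds the RAM predictors above.
samtools view -b -1 input-sort.bam -o input-sort.fast.bam   # -1 = fast/low compression
samtools view -b input-sort.bam -o input-sort.default.bam   # default compression level
ls -lh input-sort.bam input-sort.fast.bam input-sort.default.bam
```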
I ran some tests on our infrastructure and came up with …