When running launch_CRAM2VCF_C++.pl I noticed that running 10 processes in parallel (`while(scalar(keys %still_running) >= 10)`) uses a lot of memory; the job got killed on a node with 1024 GB of RAM.

So I had to change it to `while(scalar(keys %still_running) >= 2)` in the code.

IMHO it would be good to make this an adjustable parameter.

It might also be worth looking at why CRAM2VCF.cpp needs so much memory, and whether that can be changed.
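The adjustable-parameter suggestion could be sketched like this. This is a minimal example, not the script's actual code; the option name `--maxParallel` is my invention for illustration:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Getopt::Long;

# Hypothetical command-line option; the name --maxParallel is an
# assumption for illustration, not an existing flag of the script.
my $maxParallel = 2;   # conservative default that avoided the OOM kill
GetOptions('maxParallel=i' => \$maxParallel)
    or die "Usage: $0 [--maxParallel N]\n";

my %still_running;     # pid => 1 for each launched child, as in the script
# Throttle: instead of the hard-coded ">= 10", wait for children
# to finish whenever the configured limit is reached.
while (scalar(keys %still_running) >= $maxParallel) {
    my $pid = wait();
    delete $still_running{$pid} if $pid > 0;
}
```

With something like this, `perl launch_CRAM2VCF_C++.pl --maxParallel 2` would reproduce the workaround without editing the source.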
> When running launch_CRAM2VCF_C++.pl I noticed that running 10 processes in parallel (`while(scalar(keys %still_running) >= 10)`) uses a lot of memory; the job got killed with 1024 GB of RAM. So I had to change it to `while(scalar(keys %still_running) >= 2)` in the code.

There's certainly no reason why this couldn't be a parameter in the script.

> It might be a good idea to look at why CRAM2VCF.cpp needs so much memory and whether this can be changed.
This isn't too surprising, if I recall. I agree, though: the question to my mind is how easily things could be improved in terms of memory usage, and whether it's worth the investment. (Not the most helpful comment, I know, but I'll have to take a closer look at the *cpp code.)