We sometimes get out-of-memory errors in the translator on the Grid, apparently
even with a 3 GB limit, even though the same task takes less than 2 GB on
alfons. This might be a Python version difference. An example is Scanalyzer #28.
Once we can measure memory usage in the translator (see issue209), we should
look into reducing it. Two ideas that might save a lot:
* in translate.py, remove the untranslated versions of the operators once
they have been processed
* if the same atom/literal is used in multiple places, share a single object
  instead of copying it.
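
The second idea amounts to interning: keep one canonical instance per distinct
atom and hand out references to it instead of copies. A minimal sketch of how
this could look, assuming a simple `Atom` class (the class name and fields are
hypothetical, not the translator's actual code):

```python
class Atom:
    # Shared instance storage only; instances carry no __dict__,
    # which itself saves memory per atom.
    __slots__ = ("predicate", "args")

    # Cache mapping (predicate, args) -> the one shared instance.
    _cache = {}

    def __new__(cls, predicate, args):
        key = (predicate, tuple(args))
        atom = cls._cache.get(key)
        if atom is None:
            # First time we see this atom: create and remember it.
            atom = super().__new__(cls)
            atom.predicate = predicate
            atom.args = key[1]
            cls._cache[key] = atom
        return atom

# Constructing the "same" atom twice yields the identical object:
a = Atom("at", ["obj1", "loc2"])
b = Atom("at", ["obj1", "loc2"])
assert a is b
```

One caveat: interned atoms must be treated as immutable, since mutating one
would silently change it everywhere it is referenced. The cache would also
need to be cleared (or the untranslated operators dropped, as in the first
idea) so it doesn't itself keep everything alive.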