This is not caused by an infinite loop but by a combinatorial explosion when computing negated axioms. Previously this happened during translation (and thus for any search configuration), but issue454 moved the code to the search component, where negated axioms are only computed for heuristics that need them. Unfortunately for lama, both ff and landmark_sum need them.
You can avoid the expensive computation by passing the option "axioms=approximate_negative" to both heuristics. This uses a trivial overapproximation for negated axioms instead (basically, it assumes the default value of a derived variable can always be achieved for free). While this might make the heuristics weaker, it is certainly better than never starting the search at all :).
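For reference, a full invocation could look something like the following. This is only a sketch: the "axioms=approximate_negative" option is as described above, but the landmark factory (lm_rhw) and search settings shown here are just one way to approximate a lama-style configuration and may differ from the exact lama alias in your Fast Downward version:

    ./fast-downward.py domain.pddl problem.pddl \
        --evaluator "hff=ff(axioms=approximate_negative)" \
        --evaluator "hlm=landmark_sum(lm_factory=lm_rhw(), axioms=approximate_negative)" \
        --search "lazy_greedy([hff, hlm], preferred=[hff, hlm])"

The important part is that the option is passed to both heuristics; if only one of them receives it, the other will still trigger the expensive negated-axiom computation.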
I am marking this issue as solved, but feel free to reopen it if you disagree.