Currently, the documentation of iterated search contains (among others) the following two notes:
Note 1: We don't cache heuristic values between search iterations at the moment. If you perform a LAMA-style iterative search, heuristic values will be computed multiple times.
Note 3: If you reuse the same landmark count heuristic (using heuristic predefinition) between iterations, the path data (that is, landmark status for each visited state) will be saved between iterations.
When working on issue1130, Malte pointed out that these two notes seem contradictory. He also made the following comments:
> The heuristic cache is PerStateInformation, the path data is a PerStateBitset. They live in slightly different places, but not in a way that it should matter.
> As far as I recall, we use different state registries for every search and hence don't use per-state information of the previous state registries. So I think Note 1 is correct and Note 3 is wrong. But I guess someone should double-check this.
> BTW, I'm not sure we ever release the data in the state registries of the finished searches, which would imply a huge waste of memory in iterated search (for the state registries and per-state information). But perhaps we do release them and I forgot.
So we should investigate how this is actually implemented, change the implementation if we are not happy with it, and adapt the documentation accordingly.