Message5835

Author malte
Recipients andrew.coles, erez, florian, jendrik, malte, silvia
Date 2016-11-29.14:14:50
Content
It's a pity that the data isn't better. It's hard to justify moving to 64 bits
with such a large increase in memory usage (and the resulting loss in coverage).
Given these numbers, I think we should do something about memory usage.

I suggest we focus on blind search first. There may be other memory-wasters
hidden inside certain heuristics etc., but whatever is making blind search use
much more memory will affect all configurations that expand/evaluate/generate
many states. 

Of course we should measure things before proceeding, but my best guess at the
moment is that the hash table (unordered_set) inside StateRegistry is to blame:
I cannot think of another major data structure that would plausibly require
much more memory in a 64-bit compile.
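
To make that suspicion concrete, here is a back-of-the-envelope cost model.
The constants are my assumptions about a typical node-based implementation
(one heap node per element with a "next" pointer, a bucket array of pointers
at max_load_factor 1.0, some per-allocation overhead), not the exact layout
of any particular standard library:

    #include <cstddef>
    #include <iostream>

    int main() {
        const std::size_t payload = 4;             // e.g. a 4-byte state ID
        for (std::size_t ptr : {4, 8}) {           // 32-bit vs. 64-bit pointers
            std::size_t node = ptr + payload;      // node: next pointer + payload
            std::size_t buckets = ptr;             // ~1 bucket pointer per element
            std::size_t alloc_overhead = 2 * ptr;  // rough per-allocation bookkeeping
            std::cout << ptr * 8 << "-bit: ~"
                      << node + buckets + alloc_overhead
                      << " bytes per element\n";
        }
        return 0;
    }

Under this model the per-element cost roughly doubles (about 20 vs. about 36
bytes here), even though the payload itself stays at 4 bytes.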

I would suggest that we
1) do a memory profile on some representative blind-search test cases, in
32-bit mode and in 64-bit mode (e.g. with a heap profiler such as valgrind's
massif)
2) look more closely at the main memory users and their underlying
implementations to see how they differ between 32-bit and 64-bit mode
3) think of more memory-efficient implementations (one possible direction is
sketched below)
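
For step 3, here is a hypothetical sketch of the general direction (the class
name and all details are mine, not existing code): an open-addressing table
that stores the 4-byte state IDs directly in a flat array, so the per-element
footprint no longer depends on pointer width. Resizing and deletion are
omitted for brevity.

    #include <cstdint>
    #include <vector>

    // Open-addressing set of 4-byte IDs with linear probing.
    // Capacity must be a power of two; -1 is reserved as the
    // "empty" marker. Growing the table is omitted, so this
    // assumes the caller sizes it generously up front.
    class FlatIdSet {
        static constexpr std::int32_t EMPTY = -1;
        std::vector<std::int32_t> slots;
        std::size_t num_entries;

        std::size_t probe_start(std::int32_t id) const {
            // Knuth-style multiplicative hash, masked to the table size.
            return (static_cast<std::size_t>(id) * 2654435761u) & (slots.size() - 1);
        }
    public:
        explicit FlatIdSet(std::size_t capacity = 1 << 20)
            : slots(capacity, EMPTY), num_entries(0) {}

        // Returns true if the ID was newly inserted.
        bool insert(std::int32_t id) {
            for (std::size_t i = probe_start(id); ;
                 i = (i + 1) & (slots.size() - 1)) {
                if (slots[i] == id)
                    return false;
                if (slots[i] == EMPTY) {
                    slots[i] = id;
                    ++num_entries;
                    return true;
                }
            }
        }

        std::size_t size() const { return num_entries; }
    };

With something like this, an element costs 4 bytes divided by the load factor
in both 32-bit and 64-bit mode, instead of the pointer-heavy node layout above.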

Perhaps the answer is simply to use another ready-made hash table
implementation, but I think it may be worthwhile to understand this more deeply
before we design a solution.
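
If we do go the ready-made route, one candidate is Google's sparsehash. A
minimal usage sketch, assuming the library is installed (the one API quirk
worth knowing is that dense_hash_set reserves one value as an "empty" marker):

    #include <sparsehash/dense_hash_set>

    int main() {
        // dense_hash_set uses open addressing in a flat array, so it
        // avoids the per-node pointers of std::unordered_set.
        google::dense_hash_set<int> ids;
        ids.set_empty_key(-1);   // -1 must never be inserted as a real ID
        ids.insert(42);
        return ids.count(42) == 1 ? 0 : 1;
    }

Whether that beats a hand-rolled flat table like the sketch above is exactly
the kind of question the profiling in step 1 should answer.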